Notes: Carpenter, E. & McLuhan, M. (1956). The New Languages

Carpenter, E. & McLuhan, M. (1956). [The new languages](http://scholar.google.com.ezproxy.lib.ucalgary.ca/scholar?hl=en&lr=&q=info:sGI-I_PNL8oJ:scholar.google.com/&output=search). Chicago Review. 10(1). pp. 46-52.

on the format of newspapers, and the effect on perception:

>The position and size of articles on the front page is determined by interest and importance, not content. Unrelated reports… are juxtaposed; time and space are destroyed and the *here* and *now* are presented as a single Gestalt. … Such a format lends itself to simultaneity, not chronology or lineality. Items abstracted from a total situation are not arranged in causal sequence, but presented in association, as raw experience.

on communication channels:

>Thus each communication channel codifies reality differently and thereby influences, to a surprising degree, the content of the message communicated.

**DN:** both concepts apply nicely to educational technology, and to online discussion. How does the format of the online discussion platform shape the presentation, perception, and form of the message(s) communicated?

Notes: Coulthard, M. (1974). Approaches to the Analysis of Classroom Interaction

Coulthard, M. (1974). [Approaches to the analysis of classroom interaction](http://www.informaworld.com/index/746427261.pdf). Educational Review. 26(3). pp. 229-240.

On directing discourse:

>Participants with equal rights and status, as in everyday conversation, negotiate in very subtle and complex ways for the right to speak, to control the direction of the discourse and to introduce new topics. We therefore determined to reduce the number of variables by choosing a situation in which one of the participants has an acknowledged right to decide who will speak, when they will speak, what the topic of the discourse will be, and the general lines along which it will progress. The classroom was an ideal situation.

on linguistic analysis vs. educational analysis:

>For instance, Gallagher and Ashner (1963)1 and Taba et al (1964)2 both focus on thinking, defined as ‘an active transaction between the individual and the demands of his environment, which is neither fully controlled by environmental stimulation, nor wholly independent of some mediating interaction’. Their categories are attempts to analyse one of the purposes of the interaction, but are several stages removed from the linguistic data and cannot be directly related to it.

on linguistic description of classroom discourse:

>Verbal interaction inside the classroom differs markedly from desultory conversation in that its main purpose is to instruct and inform and one would expect this difference to be reflected in the way in which the discourse progresses. One of the functions of the teacher is to choose the topic, to decide how the topic will be subdivided into smaller units and to cope with digressions and misunderstandings.

on patterns of interaction in a face-to-face classroom:

>We expected eliciting exchanges to consist of a Teacher question followed by Pupil reply, T-P, T-P, T-P, but this was not the case— the structure is rather T-P-T, T-P-T, T-P-T. In other words, the teacher almost always has the last word, and has two turns to speak for every one pupil turn. This, of course, partly explains the consistent finding that teachers talk, on average, for two thirds of the talking time. The teacher asks a question, the pupil answers it and the teacher provides evaluative feedback before asking another question.

**DN:** does this pattern show up online? Is it different, based on the environment/platform?
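One rough way to check this against an online transcript is to reduce it to a sequence of speaker roles and count exchange patterns directly. A minimal sketch, with an invented turn sequence (not data from any of these studies):

```python
from collections import Counter

# Hypothetical transcript reduced to speaker roles per turn:
# "T" = teacher/instructor, "P" = pupil/participant.
turns = ["T", "P", "T", "P", "T", "T", "P", "T", "P", "P", "T"]

def exchange_patterns(turns, n=3):
    """Count n-turn speaker sequences (e.g. the T-P-T eliciting exchange)."""
    grams = ["-".join(turns[i:i + n]) for i in range(len(turns) - n + 1)]
    return Counter(grams)

print(exchange_patterns(turns))           # frequency of each 3-turn pattern
print(f"teacher turn share: {turns.count('T') / len(turns):.0%}")
```

Whether the T-P-T pattern dominates online, and whether the instructor still claims two thirds of the turns, then becomes an empirical count per platform.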

  1. Gallagher, J. J. & Aschner, M. J. (1963). A preliminary report on analyses of classroom interaction. Merrill-Palmer Quarterly, 9.
  2. Taba, H., Levine, S. & Elzey, F. F. (1964). Thinking in Elementary School Children. Report, U.S. Department of Health, Education and Welfare Co-operative Research Project No. 1574. San Francisco State College.

Notes: Garrison, D. Online Community of Inquiry Review: Social, Cognitive, and Teaching Presence Issues

Garrison, D.R. (2007). Online community of inquiry review: Social, cognitive, and teaching presence issues. Journal of Asynchronous Learning Networks. 11(1). pp. 61-72.

from abstract:

>The early research in the area of online communities of inquiry has raised several issues with regard to the creation and maintenance of social, cognitive and teaching presence that require further research and analysis. The other overarching issue is the methodological validity associated with the community of inquiry framework.

on communities of inquiry:

>Higher education has consistently viewed community as essential to support collaborative learning and discourse associated with higher levels of learning. Moreover, the asynchronous nature of online communication and the potential for disconnectedness has focused attention on the issue of community.

more talk about the background/theory of social/cognitive/teaching presences. blahblahblah.

on teaching presence:

>Interaction and discourse plays a key role in higher-order learning but not without structure (design) and leadership (facilitation and direction).

dialog and discourse are different:

>From a teaching perspective, this is the difference between dialogue and discourse [39]. Facilitation supports dialogue with minimal shaping of the course of the discussion. Discourse, on the other hand, is disciplined inquiry that requires a knowledgeable teacher with the expectation that discourse progresses in a collaborative constructive manner and students gain an awareness of the inquiry process.

on coding and validity:

>There is the question, however, as to why we would want to code at the indicator level? Coding at the indicator level is difficult [45]. Is it not a bit premature considering the early stage of this research and testing of the framework? What research questions would coding at the indicator level answer? How does being able to distinguish among the indicators add to the validity of the model? Are indicators too context specific to expect a standard set of indicators across all online educational environments?

conclusions:

>A community of inquiry needs to have clear expectations as to the nature of critical discourse and their postings. Participants need to be aware of the academic objectives, the phases of inquiry, and the level of discourse. These educational challenges raise the importance and role of teaching presence. The distinction between facilitation and direction must also be clear from a design perspective. Teaching presence must consider the dual role of both moderating and shaping the direction of the discourse. Both are essential for a successful community of inquiry.

Notes: Vaughan & Garrison: Creating cognitive presence in a blended faculty development community

Vaughan, N. & Garrison, D.R. (2005). [Creating cognitive presence in a blended faculty development community](http://www.sciencedirect.com/science/article/B6W4X-4FPDRGW-2/2/4fc7b0658409bfe5002581de0ba0d383). The Internet and Higher Education. 8(1). pp 1-12.

This study compares face-to-face and online discussions in a professional development course on blended learning, looking specifically at the three forms of presence defined in the community of inquiry model, with an emphasis on how participants move through the four phases of the inquiry process (triggering event, exploration, integration, and resolution) as part of their cognitive presence.

>Social presence creates a sense of belonging that supports meaningful inquiry. Social presence provides the context that makes possible critical discourse and reflection.

*DN: Would various platforms that may offer different tools to represent social presence affect the critical discourse? Same question for teaching presence and cognitive presence…*

on cognitive presence and blended learning:

>Rovai (2002)1 has shown a significant link between a sense of community and cognitive presence in that community can facilitate quality learning outcomes. However, this is not a simple and invariant relationship. In a study of informal professional development forums, Kanuka and Anderson (1998)2 found high interaction (i.e., social presence) but only a low level of cognitive exchange. Garrison and Cleveland-Innes (unpublished manuscript)3 also found that interaction by itself does not necessarily create cognitive presence. They also suggest that asynchronous online learning has considerable potential to create cognitive presence.

Methodology:

>Data were collected and transcribed from the transcripts of the online discussion forums, audio recordings of the face-to-face sessions, and a post-study interview with each participant. Online and face-to-face transcripts were coded for cognitive presence. The coding protocol from Garrison et al. (2000)4 community of inquiry model was used.

on analysis:

>The unit of analysis was a single message for the online discussion transcripts and a single participant response for the oral transcripts. Two trained graduate students completed the coding for cognitive presence and inter-rater reliability of the coding process was assessed.
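Inter-rater reliability for this kind of coding is commonly reported as a chance-corrected agreement statistic such as Cohen's kappa. A sketch of that computation, using invented codes for the four cognitive-presence phases (T = triggering, E = exploration, I = integration, R = resolution), not the study's actual data:

```python
from collections import Counter

# Hypothetical codes assigned by two raters to the same ten messages.
rater_a = ["T", "E", "E", "I", "E", "R", "T", "E", "I", "E"]
rater_b = ["T", "E", "I", "I", "E", "R", "T", "E", "E", "E"]

def cohens_kappa(a, b):
    """Agreement between two raters, corrected for chance agreement."""
    n = len(a)
    observed = sum(x == y for x, y in zip(a, b)) / n
    ca, cb = Counter(a), Counter(b)
    expected = sum(ca[k] * cb[k] for k in ca) / (n * n)
    return (observed - expected) / (1 - expected)

print(f"kappa = {cohens_kappa(rater_a, rater_b):.2f}")
```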

Discussion:

>In a community of inquiry, it is essential that critical discourse be encouraged. Considering the reflective nature of online communication, there is a real opportunity to facilitate reflective critique. Because of the reflective potential, Meyer (2003)5 found that the threaded online discussion comments were “often more thoughtful, more reasoned, and drew evidence from other sources” than those made within the face-to-face sessions (p. 61).

Conclusions:

>The results of this research raise many questions about blended learning designs in support of a community of inquiry. However, it can be concluded from the results reported here that blended learning was successful in supporting a faculty development community of inquiry. A worthy topic for further research would be to focus on high level learning processes and outcomes using blended learning designs.

  1. Rovai, A. P. (2002). Building a sense of community at a distance. International Review of Research in Open and Distance Learning, 3(1). Retrieved July 21, 2004, from http://www.irrodl.org/content/v3.1/rovai.html
  2. Kanuka, H., & Anderson, T. (1998). Online social interchange, discord, and knowledge construction. Journal of Distance Education, 13(1), 57-75.
  3. Garrison, D. R., & Cleveland-Innes, M. (unpublished manuscript). Facilitating cognitive presence in online learning: Interaction is not enough. *has this been published since 2005?*
  4. Garrison, D. R., Anderson, T., & Archer, W. (2000). Critical thinking in a text-based environment: Computer conferencing in higher education. Internet and Higher Education, 11(2), 1-14. Retrieved July 21, 2004, from http://communitiesofinquiry.com/documents/CTinTextEnvFinal.pdf
  5. Meyer, K. A. (2003). Face-to-face versus threaded discussions: The role of time and higher-order thinking. Journal of Asynchronous Learning Networks, 7(3), 55-65.

Notes: Niu, H. & van Aalst, J.: Participation in Knowledge-Building Discourse: An Analysis of Online Discussions in Mainstream and Honours Social Studies Courses

Niu, H. & van Aalst, J. (2009). [Participation in Knowledge-Building Discourse: An Analysis of Online Discussions in Mainstream and Honours Social Studies](http://www.cjlt.ca/index.php/cjlt/article/view/515). Canadian Journal of Learning and Technology. 35(1). pp. 1-23

>**Abstract**: Questions about the suitability of cognitively-oriented instructional approaches for students of different academic levels are frequently raised by teachers and researchers. This study examined student participation in knowledge-building discourse in two implementations of a short inquiry unit focusing on environmental problems. Participants in each implementation consisted of students taking a mainstream or an honours version of a tenth grade social studies course. We retrieved data about students’ actions in Knowledge Forum® (e.g., the number of notes created and the percentage of notes with links), and **conducted a content analysis of the discourse by each collaborative group**. We suggest the findings provide cause for optimism about the use of knowledge-building discourse across academic levels: there was moderate to strong evidence of knowledge building in both classes by Implementation 2. We end with suggestions for focusing online work more directly on knowledge building.

**Intro**

>Knowledge building shares certain features with these approaches, including emphasis on collaboration, metacognition, distributed expertise, and use of computer-supported inquiry. As elaborated below in the section entitled “knowledge building,” its distinctiveness follows from the commitment to make processes of expertise and innovation prominent in school. In a class operating as a knowledge-building community, students are agents of their own learning, work toward goals of collective knowledge advances, and treat ideas as real things that can be improved by means of discourse (Bereiter, Scardamalia, Cassells, & Hewitt, 1997)1. Advocates for knowledge building assert that it fosters a host of 21st century skills.

but, what about:

>“the belief that instruction of higher order thinking is an appropriate goal mainly for high-achieving students and that low-achieving students, who have trouble with mastering even basic facts, are unable to deal with tasks that require thinking skills” (Zohar & Dori, 2003, p. 146)2.

**study design:**

>The goal of this study was to examine participation in asynchronous online discourse as an aspect of knowledge building, with a view to understanding its scalability across courses differing in academic level. To this end, we analyzed server-log data and the content of students’ contributions to an online knowledge-building environment (Knowledge Forum®, see http://www.knowledgeforum.com) from two implementations of a short inquiry unit in which students investigated environmental problems.
>
>…
>
>The study examined two kinds of questions in the context of two successive implementations of knowledge building: How do participation levels in the mainstream classes compare with those in the honours classes, and to what extent can we conclude students engage in knowledge-building discourse?

**knowledge building:**

>Though knowledge building involves many types of interactions, discourse in Knowledge Forum plays a fundamental role; it provides a reliable and permanent record of experiments, classroom activities, ideas and questions that can be used to review progress and to develop understanding at progressively more complex levels.

>According to van Aalst, students need to think of their work in the Knowledge Forum database as building a communal learning resource that has lasting utility rather than as online conversations. By contrast, teachers most often use asynchronous environments to promote the sharing, discussion, or debate of ideas.

**analyzing participation in knowledge-building discourse:**

>We contrast two perspectives on participation in asynchronous online discourse: one focusing on individual students’ actions in the online environment and one focusing on the identification of evidence for emergent and collective phenomena within the discourse of a community.

keeping in mind:

>Analysis of individual actions gives an incomplete picture. As many authors have pointed out, the actions in a discourse are mutually dependent (Sawyer, 20063; Stahl, 20024; Wells, 19995). For example, when students are asked to write notes to summarize what they have learned from their discourse, some comment that others have already stated their most salient learning and that they therefore do not state it again.

**variables that influence participation:**

>* prior domain knowledge
>* motivation
>* goal orientation
>* writing apprehension
>* epistemological beliefs
>* ability to analyze arguments
>
>but infeasible to measure all of these… this study focused on *writing apprehension* and *ability to self-assess contributions* (reflect on discourse)

> Writing apprehension reflects a student’s attitude and emotion towards writing tasks and written communication. … What they write is available for everyone to see and critique, which for some could create apprehension, causing them to write little and avoid spontaneity and sophisticated language (Faigley, Daly, & Witte, 1981)6.
>
> Ability to reflect on discourse is also important to knowledge building, especially for evaluating the progress of a line of inquiry and for setting communal learning goals.

for the reflection-on-discourse part, students produced portfolios of their contributions, composed of notes from the discussion board. Students summarized evidence of the four phases of knowledge building (working at the cutting edge, progressive problem solving, collaborative effort, and identifying high points) with hyperlinks to relevant notes as evidence.

**the study:**

>examined participation in online discussions in the context of two successive implementations of a three-week inquiry unit

research questions:

1. To what extent do students in mainstream and honours social studies courses participate in online discussions?
2. To what extent can the online discussions in both academic levels be characterized as knowledge-building discourse?

**methodology:**

first iteration was analyzed post-hoc (latent analysis), studying only the discussion board posts after the implementation was over. second iteration involved the teacher and gathered additional data (a questionnaire and a writing apprehension test).

**Procedures:**

Teacher set up new discussion boards for each study iteration. Use of the discussion board was demonstrated by teacher. Students had one full class (70 minutes) per week in the computer lab to participate, and could participate from home as well.

>The teacher designed the collaborative work to proceed in several phases: (a) showing the area of concern on a world map; (b) identifying the problem with historical and current information; (c) identifying causes, consequences, and solutions to the problem; (d) and explaining difficulties one might face in implementing a proposed solution. He had used this design for several years, and now created a view in Knowledge Forum for each phase. After setting up the database this way, the teacher did not systematically analyze the discussions or comment on them in Knowledge Forum. However, he regularly read notes when students had them open during class and asked students if they were making progress or needed assistance.
>
>Participation in Knowledge Forum was not included in the formal assessment scheme for the unit. Instead, students were required to individually create portfolios using several of their own notes as artifacts; these were assigned at the end of the unit. Students were asked to identify two to three of their own notes and explain why they considered these notes as exemplary knowledge-building contributions. Thus, we may assume that students’ productivity in Knowledge Forum was not influenced by the need to meet a quota for note creation and reading. When students began preparing their portfolios, the teacher related the topics of investigation to the prescribed learning outcomes provided by the Ministry of Education to provide synthesis across the work by different groups.

limitations:

>Perhaps the most significant limitation was that the course commenced only a few weeks before the inquiry unit, thus there was little time for students to develop as a community and to acquire values and practices conducive to knowledge building; the decision to assign the students to groups (thought necessary by the researchers in a large class) also limited community development. In addition, three weeks seemed short for observing emergent knowledge-building phenomena such as progressive problem solving and the articulation of general principles from the solutions proposed by the various groups. Thus, this study examines knowledge building in collaborative groups and during relatively short periods of time.

**Measures & Analysis:**

server logs were crunched, looking at:

* notes created (productivity)
* percentage of notes read (productivity)
* percentage of notes with links (responses to other notes)
* note revision (ideas as improvable objects) (*really? is that what this demonstrates?*)
* scaffold use (metacognitive prompts)
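These indices are simple aggregations once the logs are tabulated. A sketch under an invented record schema (Knowledge Forum's actual log format will differ, so the field names here are illustrative only):

```python
# Hypothetical per-note server-log records for one class.
log = [
    {"author": "s1", "links": 1, "revisions": 2, "scaffolds": 1},
    {"author": "s1", "links": 0, "revisions": 0, "scaffolds": 0},
    {"author": "s2", "links": 2, "revisions": 1, "scaffolds": 2},
]

total = len(log)
indices = {
    "notes_created": total,
    "pct_notes_with_links": 100 * sum(n["links"] > 0 for n in log) / total,
    "pct_notes_revised": 100 * sum(n["revisions"] > 0 for n in log) / total,
    "pct_notes_with_scaffolds": 100 * sum(n["scaffolds"] > 0 for n in log) / total,
}
print(indices)
```

Whether "percent of notes revised" really demonstrates that students treat ideas as improvable objects is, as noted above, a separate interpretive question from computing the index.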

**Content Analysis of Knowledge-Building Discourse:**

all writing was analyzed with the collaborative group – group discourse – as the unit of analysis.

5 principles used in analysis (others were dropped because of low inter-rater reliability):

1. **working at the cutting edge** – community value to advance the state of knowledge. new stuff.
* *(Collective responsibility, community knowledge; epistemic agency; real ideas, authentic problems)*
2. **progressive problem solving** – reinvesting efforts in response to new ideas, building on previous work
* *(Improvable ideas; rise above)*
3. **collaborative activity** – students making an effort to help each other understand ideas. includes service to the community.
* *(Collective responsibility, community knowledge; idea diversity; democratizing knowledge)*
4. **identifying high points** – personal insight, metacognitive insight, insight into knowledge advancement processes (self and community)
* *(Epistemic agency; rise above)*
5. **constructive uses of authoritative sources** – keeping in touch with the present state and growing edge of knowledge in the field
* *(identifying inconsistencies and gaps in knowledge sources and using resources effectively for extending communal understanding)*

**Content Analysis**
>Though the mean scores were not high, they do suggest moderate evidence for participation by the groups in the mainstream class for three of the principles: working at the cutting edge, progressive problem solving, and collaborative effort. It is worth noting that while the scores were higher for the honours class, the data did not suggest large differences between the two classes compared with the large between-class differences for the server-log data. (**Due to the small number of groups no statistical tests are done for the content analysis. The findings must therefore be interpreted with caution.**)

**Conclusions**

>It is important to understand why there was not a strong relationship between the server-log indices and the results of the content analysis of knowledge-building discourse in this study.

—-
**DN Notes:**

* study had small sample size – but probably difficult to get a larger n. so statistical tests are pretty much useless.
* content analysis has problems with inter-rater reliability.
* transaction analysis may be more useful than straight (latent) content analysis – what are they doing, rather than an aggregate of what they are saying?

  1. Bereiter, C., Scardamalia, M., Cassells, C., & Hewitt, J. (1997). Postmodernism, knowledge-building, and elementary science. Elementary School Journal, 97(4), 329-340.
  2. Zohar, A. & Dori, Y. J. (2003). Higher order thinking skills and low achieving students: Are they mutually exclusive? Journal of the Learning Sciences, 12(2), 145-182.
  3. Sawyer, R. K. (2006). Analyzing collaborative discourse. In R. K. Sawyer (Ed.), The Cambridge handbook of the learning sciences (pp. 187-204). New York, NY: Cambridge University Press.
  4. Stahl, G. (2002). Rediscovering CSCL. In T. Koschmann, R. Hall, & N. Miyake (Eds.), CSCL 2: Carrying forward the conversation (pp. 169-181). Mahwah, NJ: Lawrence Erlbaum Associates, Inc.
  5. Wells, G. (1999). Dialogic inquiry: Toward a sociocultural practice and theory of education. New York, NY: Cambridge University Press.
  6. Faigley, L., Daly, J. A., & Witte, S. P. (1981). The role of writing apprehension in writing performance and competence. Journal of Educational Research, 75, 16-21.

Notes: Murphy, E. – A framework for identifying and promoting metacognitive knowledge and control in online discussants

Murphy, E. (2008). [A framework for identifying and promoting metacognitive knowledge and control in online discussants](http://www.cjlt.ca/index.php/cjlt/article/view/491). Canadian Journal of Learning and Technology. 34(2). pp. 1-18.

>**Abstract:** The effectiveness of computer-based learning environments depends on learners’ deployment of metacognitive and self-regulatory processes. Analysis of transmitted messages in a context of Computer Mediated Communication can provide a source of information on metacognitive activity. However, existing models or frameworks (e.g., Henri, 1992)1 that support the identification and assessment of metacognition have been described as subjective, lacking in clear criteria, and unreliable in contexts of scoring. This paper develops a framework that might be used by researchers analysing transcripts of discussions for evidence of engagement in metacognition, by instructors assessing learners’ participation in online discussions or by designers setting up metacognitive experiences for learners.

The paper is just a framework for approaching online discussions wrt metacognition. it’s basically a beefy lit review, which is handy, but no methods to use, per se.

***DN: my summary of the intro section:** effectiveness of online learning is tied to learners’ regulation of their own learning, and to deployment of self-regulation and metacognitive processes. content analysis can provide info about metacognitive activity. Henri’s model of analysis makes it difficult to capture metacognitive activities. Interaction analysis gets at metacognitive activity better (see Gunawardena, Lowe & Anderson’s Interaction Analysis Model2, ferinstance)*

>In general, there is an abundance of literature related to analysis of online discussions (…). However, MC has received meagre attention in this literature especially as compared to other skills, such as, critical thinking.

>The paper relies on foundational work in the area of MC, including that of Flavell (1987)3 , Jacobs and Paris (1987)4 and Brown (1987)5. It also draws on Schraw and Dennison (1994)6 , who themselves built on Flavell’s work to create their Metacognitive Awareness Inventory. This paper also builds on Anderson et al.’s (2001)7 taxonomy of Mc knowledge and on Henri’s (1992)8 work.

Metacognitive variables (after Henri and Flavell):

>1. **Person**: All that is known or believed about the characteristics of humans as cognitive beings.
> indicators:
> * Comparing oneself to another as a cognitive agent
> * Being aware of one’s emotional state
>2. **Task**: All information acquired by a person in terms of the task or different types of tasks. Appreciation of the quality of available information.
> indicators:
> * Being aware of one’s way of approaching the task
> * Knowing whether the task is new or known
>3. **Strategy**: Means chosen to succeed in various cognitive tasks.
> indicators:
> * Strategies making it possible to reach a cognitive objective of knowledge acquisition
> * Metacognitive strategies aimed at self-regulation of progress

lots of examples in the paper of various indicators used by other researchers.

prompts:

> If we conceptualize Mc knowledge as Declarative, Procedural and Conditional, what might be some actual examples of these types of knowledge in a context of an online discussion, i.e., if a researcher or instructor wanted to identify instances of Mc thinking, what types of statements might constitute signs or evidence?

  1. Henri, F. (1992). Computer conferencing and content analysis. In A. R. Kaye (Ed.), Collaborative learning through computer conferencing: The Najaden papers (pp. 117-136). Berlin: Springer-Verlag.
  2. Gunawardena, C., Lowe, C., & Anderson, T. (1997). Interaction analysis of a global online debate and the development of a constructivist interaction analysis model for computer conferencing. Journal of Educational Computing Research, 17(4), 395-429.
  3. Flavell, J. H. (1987). Speculations about the nature and development of metacognition. In F. E. Weinert & R. H. Kluwe (Eds.), Metacognition, motivation and understanding (pp. 21-29). Hillsdale, NJ: Erlbaum.
  4. Jacobs, J. E., & Paris, S. G. (1987). Children’s metacognition about reading: Issues in definition, measurement, and instruction. Educational Psychologist, 22(3 & 4), 235-278.
  5. Brown, A. L. (1987). Metacognition and other mechanisms. In F. E. Weinert & R. H. Kluwe (Eds.), Metacognition, motivation and understanding (pp. 65-116). Hillsdale, NJ: Erlbaum.
  6. Schraw, G., & Dennison, R. (1994). Assessing metacognitive awareness. Contemporary Educational Psychology, 19, 460-475.
  7. Anderson, L., Krathwohl, D., Airasian, P., Cruikshank, K., Mayer, R., Pintrich, P., Raths, J., Wittrock, M. (2001). A taxonomy for learning, teaching and assessment: A revision of Bloom’s taxonomy of educational objectives. New York: Longman.
  8. Henri, F. (1992). Computer conferencing and content analysis. In A. R. Kaye (Ed.), Collaborative learning through computer conferencing: The Najaden papers (pp. 117-136). Berlin: Springer-Verlag.

Notes: Meyer, K.A. A Study of Online Discourse at The Chronicle of Higher Education

Meyer, K.A. (2010). [A study of online discourse at The Chronicle of Higher Education](http://www.springerlink.com/content/g7188g27k35ql765/?p=87c10419fe5a417b8d4e9c49580dd5ba&pi=2). Innovative Higher Education. 35. pp. 143-160.

>**Abstract**: Given the explosive growth of online communications, new forms of discourse are an intriguing topic of study. This research focused on ten online discussions hosted by The Chronicle of Higher Education, using content and discourse analysis of the postings to answer several questions. What is the “conversational scaffolding” used by posters in higher education-related online discussions? Are academic online discussions more like speech or writing? Additional questions dealt with how posters identify themselves, who their audience is, what motivates them, how accurate and political they are, and what the experience of reading these online discussions is like. Based on the analyses, these posters were more likely to write correctly although with diary-like personal insights. Through the analysis I also identified both positive and negative aspects of the online discussion experience.

on discourse analysis:

>Baron’s research (2008)1 offers a type of “discourse analysis” (Fulcher n.d.), which has a long research tradition in the communications and media literature. Discourse analysis is a way of understanding social interactions, which online discussions are although they depend on writing to communicate and a web site as the medium. To engage in discourse analysis, a conversation is transcribed and deconstructed, each utterance is examined (in this case, a posting to the discussion served as an utterance), and themes are noted.

and

>Discourse analysis is defined by Palmquist (n.d., ¶ 4)2 as the “application of critical thought to social situations and the unveiling of hidden (or not so hidden)” politics, motivations, issues, and perceptions. It is a form of deconstruction, identifying features in the text (such as themes and word choices) in each sentence or thought (Fulcher n.d.)3 . In discourse analysis, several items are analyzed, including sentence construction, grammar, stress, tone, and word choice; there are no set formulae for conducting discourse analysis (Rogers 2004)4 .

sample:

>In May 2009, I pulled ten online discussions from the Forums section of The Chronicle of Higher Education; these ten discussions were less than 10% of the 124 discussions available at the time.

ethics clearance:

>Since they were posting to a public web site and they opted to use screen names that may or may not identify them, human subjects review and approval was not needed. This also means that the subjects did not consent to be involved in this study although their choice to post to a public site assumes that they were willing to have their words read and analyzed.

analysis:

>Data collected on each discussion included:
>
> 1. total number of posts
> 2. number of sentences
> 3. average number of sentences per post
> 4. total number of words in the discussion
> 5. average number of words per post
> 6. range in number of words in posts
> 7. percent of abbreviations (e.g., NSF for National Science Foundation or “lol” for “laughing out loud”) per discussion
> 8. percent of acronyms
> 9. percent of contractions
> 10. percent of emoticons
> 11. percent of spelling errors
> 12. percent of punctuation errors

***DN:** so? none of these 12 types of data get at the discourse. what is the pattern of interaction? levels of engagement? etc… this is simple content analysis. After talking about discourse analysis in the intro, they revert to content analysis?*
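The twelve measures above are mechanical counts, so they can be computed directly from the post texts. A minimal sketch of that kind of tallying (the function name and the regex heuristics for sentences, contractions, and emoticons are my own assumptions, not the paper's actual procedure):

```python
import re

def discussion_stats(posts):
    """Compute simple content-analysis counts for one discussion.

    `posts` is a non-empty list of strings, one per posting. Only a
    subset of the twelve measures is sketched; the sentence, contraction,
    and emoticon patterns are rough heuristics.
    """
    words = [len(p.split()) for p in posts]
    # Count sentence-ending punctuation runs; treat a post with none as one sentence.
    sentences = [len(re.findall(r"[.!?]+", p)) or 1 for p in posts]
    total_words = sum(words)
    contractions = sum(len(re.findall(r"\b\w+'\w+\b", p)) for p in posts)
    emoticons = sum(len(re.findall(r"[:;]-?[)(DP]", p)) for p in posts)
    return {
        "total_posts": len(posts),
        "total_sentences": sum(sentences),
        "avg_sentences_per_post": sum(sentences) / len(posts),
        "total_words": total_words,
        "avg_words_per_post": total_words / len(posts),
        "word_range": (min(words), max(words)),
        "pct_contractions": 100 * contractions / total_words,
        "pct_emoticons": 100 * emoticons / total_words,
    }
```

Which only underlines the DN note above: everything here is surface counting; nothing in it captures interaction.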

conclusion:

***DN:** basically, that people post stuff online. they may or may not do it anonymously. they may or may not post about academic stuff. they are, apparently, human, and do stuff that humans do. The paper isn’t much use to what I’m doing, but some of the references are interesting for background.*

  1. Baron, N. S. (2008). Always on: Language in an online and mobile world. Oxford: Oxford University Press.
  2. Palmquist, R. (n.d.). Discourse analysis. Retrieved June 4, 2009 from http://www.ischool.utexas.edu/~palmquis/courses/discourse.htm
  3. Fulcher, E. (n.d.). What is discourse analysis? Retrieved June 4, 2009 from http://www.eamonfulcher.com/discourse_analysis.html
  4. Rogers, R. (2004). An introduction to critical discourse analysis in education. In R. Rogers (Ed.), An introduction to critical discourse analysis in education (pp. 1–18). Mahwah, NJ: Lawrence Erlbaum Associates.

Notes: Kanuka & Anderson. Online Social Interchange, Discord, and Knowledge Construction

Kanuka, H. & Anderson, T. (1998). Online Social Interchange, Discord, and Knowledge Construction. Journal of Distance Education. 13 (1) pp. 57-74.

>This study presents the results of an exploratory multi- method evaluation study and transcript analysis of an online forum. The researchers used a constructivist interaction analysis model developed by Gunawardena, Lowe, and Anderson (1997)1

Gunawardena et al.’s phases:

1. Sharing/comparing of information
2. Discovery and exploration of dissonance or inconsistency among the ideas, concepts, or statements advanced by different participants.
3. Negotiation of meaning and/or co-construction of knowledge.
4. Testing and modification of proposed synthesis or co-construction.
5. Phrasing of agreement, statement(s), and applications of the newly constructed meaning.

Method:

>The research study focused on the analysis of data obtained from participants in the online forum. We read postings, but did not participate in the forum. At the end of the two-week forum, an online survey was distributed to all participants and a transcript analysis was undertaken. Finally, a telephone survey was conducted with a stratified sample of participants.

Results:

> the forum was perceived by the participants as successful in providing opportunities for reflection and exposure to multiple perspectives on topics that were relevant to the participants. There seemed less agreement, however, with the notion that the forum provided opportunity for application of new knowledge and deeper understanding of the issues.

Transcript analysis:

>The unitizing process involved a coding operation that separated the participants’ online interactions (postings that fell in phases I through V) from other postings, such as the moderator’s summaries or other general announcements.

>The transcript analysis procedure consisted of reading each message and assigning it to one or more phases. A message that contained two or more distinct ideas or comments was coded in two or more phases (the messages were coded independently by both researchers). Discrepancies were discussed, and a single coding was determined from these discussions.
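The dual-coding-and-discussion procedure invites a reliability check. The authors resolved discrepancies by discussion rather than reporting an agreement statistic, so the Cohen's kappa sketch below is my own illustration of how that kind of double coding could be quantified:

```python
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Chance-corrected agreement between two coders.

    `coder_a` and `coder_b` are equal-length sequences of category
    labels (e.g., phase numbers I-V assigned to each message).
    """
    n = len(coder_a)
    # Observed proportion of messages both coders labelled identically.
    observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    # Agreement expected by chance, from each coder's label frequencies.
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    expected = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / (n * n)
    return (observed - expected) / (1 - expected)
```

A kappa near 1 means the coders agree far beyond chance; low values would suggest the coding scheme itself needs tightening before discrepancies get discussed away.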

So many of the messages wound up being categorized as Level 1 (basic knowledge) that a grounded theory method was added to the study.

>Grounded theory provided a useful collection of strategies (such as constant comparison and analytic meaning) when little is known about a phenomenon—as was the case in this study where the focus was to investigate knowledge construction and social interaction in an online environment. Using grounded theory, we reassessed and then recategorized the postings.

*__DN:__ not sure what the addition of grounded theory does to the study. seems like they just threw up their hands, said “WTF?” and fell back on divining chicken entrails to get something out of the transcripts…*

>These two new categories were generated from the data: social interchange2 and social discord and knowledge construction3.

*__DN:__ the study turns out to not be directly applicable, but points to some things to watch for when coding – having too many things fall into the basic level or “other” category. How to design the coding to avoid having to fall back on grounded theory?*

  1. Gunawardena, C. N., Lowe, C. A., & Anderson, T. (1997). Interaction analysis of a global on-line debate and the development of a constructivist interaction analysis model for computer conferencing. Journal of Educational Computing Research, 17(4), 395-429.
  2. basically, participants talking to each other. strange that this would be observed in an online discussion board designed to facilitate participants, well, talking to each other…
  3. there were a few instances of interaction between the forum participants that involved inconsistencies or contradictions in information and/or ideas that resulted in a new or changed perspective: the effect of cognitive dissonance.

Notes: Guan et al. Content analysis of online discussion on a senior-high-school discussion forum of a virtual physics laboratory

Guan, Y.H., Tsai, C.C., & Hwang, F.K. (2006). [Content analysis of online discussion on a senior-high-school discussion forum of a virtual physics laboratory](http://www.springerlink.com/content/aj8u085378706178/). Instructional Science. 34(4) pp. 279-311.

>In this study we content analyzed the online discussion of several senior-high-school groups on a forum of a virtual physics laboratory in Taiwan. The goal of our research was to investigate the nature of non-course-based online discussion and to find out some useful guidelines in developing such discussion forums for learning purposes.

*__DN:__ Studied extracurricular forum activity, not course-based. Results not applicable to what I need, but maybe some content analysis methodology could be useful…*

Researchers adopted Henri’s framework and models1 (like the previous Garrison study). As a result, they used the same parameters:

>The content analysis was conducted in terms of participation rate, social cues, interaction types, and cognitive and metacognitive skills.

Why look at non-course-based discussion boards? Getting at self-direction and lifelong learning:

>The advantages of non-course-based online discussion lie in that the participants of the discussion are not limited to the members of a particular course, and the participation in the discussion is based on common interests shared by the participants. That is, the participation is totally voluntary and people may join or leave the discussion any time they want.

*__DN:__ How did they get ethical approval to gather data on minors in a public discussion board?*

>The discussion started with a message containing a question that could be posted by any participant. Whoever was interested in the topic could reply to it. The size of discussion groups ranged from 1 to 213 participants. The participation on the forum was voluntary.

>Two moderators supervised the discussion forum. A moderator was a physics professor in NTNU. The other one was a senior-high-school physics teacher.

These moderators filtered content, removing objectionable or rude words and correcting misleading concepts or statements.

>The content of a message was analyzed based on its idea(s). An idea expressed a complete thought, which might contain one or several sentences or even several paragraphs. A message might consist of more than one idea. The analysis was conducted according to five dimensions: the participation rate, social cues, interaction types, cognitive skills, and metacognitive skills.
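A record for one coded unit might look like the sketch below (the class and field names are my own hypothetical labels for the five dimensions, not Henri's actual coding instrument):

```python
from dataclasses import dataclass, field

@dataclass
class CodedIdea:
    """One unit of analysis: a single 'idea' extracted from a message,
    coded along the five dimensions. A message can yield several ideas."""
    message_id: str
    text: str
    relevant: bool = True                    # on-topic for the discussion?
    social_cues: list = field(default_factory=list)       # e.g., greetings, humour
    interaction_type: str = "independent"                 # e.g., "direct response"
    cognitive_skills: list = field(default_factory=list)  # e.g., "inference"
    metacognitive_skills: list = field(default_factory=list)  # e.g., "planning"
```

Participation rate then falls out of counting distinct posters, and the reported percentages (relevance, metacognition) from tallying these records.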

Results:

>Altogether we analyzed 575 messages containing 634 ideas, which were posted by 349 participants.

>Overall, 19.72% of the ideas were not relevant to the subject under discussion. Only 11.49% of the ideas revealed metacognitive skills (i.e., those of ‘evaluation’, ‘planning’, ‘regulation’, and ‘self-awareness’), and 16.88% of the ideas did not reveal any cognitive or metacognitive components considered in the study.

There were two types of discussion board activities: Non-Required (NR), where significant contribution was not required in order to participate, and Required (R), where a significant, guided contribution was required in order to gain access to the board. Active participation in the Required group was low even though the number of participants was high (they needed to participate in order to gain access, but didn’t care about the board). The Non-Required board had much higher active participation (number of posts), although with fewer participants (because it was an optional discussion board).

Also, the Required board posts seemed to be lower, meta-cognitively, than the Non-Required board posts, although Required posts showed higher cognitive levels. *Not sure of the value of this distinction – non-required activities are ~more metacognitive, but required activities are ~more cognitive? so many variables. so many generalizations.*

>Overall, whether online discussion can help people to learn more deeply depends on the quality of discussion, which can be influenced by the features of participants and discussion topics, the interactions between the participants, the purpose, design and organization of the discussion forums, and not least the moderators coordinating the discussion.2

  1. Henri, F. (1992). Computer conferencing and content analysis. In A.R. Kaye (Ed.), Collaborative learning through computer conferencing: The Najaden papers (pp. 115–136). New York: Springer.
  2. Duh.

Notes: Garrison et al. Facilitating cognitive presence in online learning: Interaction is not enough.

Garrison, D.R. & Cleveland-Innes, M. (2005). [Facilitating cognitive presence in online learning: interaction is not enough](http://inquirygroup.edublogs.org/files/2007/10/cognitivepresence2005.pdf). The American Journal of Distance Education. 19(3). 133-148.

>This study assessed the depth of online learning, with a focus on the nature of online interaction in four distance education course designs.

Article provides a good background to course design, deep/surface/achievement-oriented learning. The study used a survey (Study Process Questionnaire) to compare changes in learning strategies selected by 75 students in 4 courses in different subjects and levels.

>Interaction is seen as central to an educational experience and is a primary focus in the study of online learning. The focus on interaction in online learning emerges from the potential and properties of new technologies to support sustained educational communication. Communication and Internet technologies provide a high degree of communicative potential through asynchronous interaction design options (Garrison and Anderson 2003).1 From an access perspective, participants are able to maintain engagement in a community of learners when and where they choose.

.

>The purpose of an educational experience, whether it is online, face-to-face, or a blending of both, is to structure the educational experience to achieve defined learning outcomes. In this context, interaction must be more structured and systematic. A qualitative dimension is introduced where interaction is seen as communication with the intent to influence thinking in a critical and reflective manner.

.

>Moore (19892, 19903) was one of the first to focus on interaction issues in distance education. He identified transactional distance as consisting of dialogue (i.e., interaction) and structure (i.e., design).

and on why I’ll get to delve into qualitative discourse analysis…

>An interactive community of learners is generally considered the *sine qua non* of higher education. However, interaction is not a guarantee that students are cognitively engaged in an educationally meaningful manner. High levels of interaction may be reflective of group cohesion, but it does not directly create cognitive development or facilitate meaningful learning and understanding. Interaction directed to cognitive outcomes is characterized more by the qualitative nature of the interaction and less by quantitative measures. There must be a qualitative dimension characterized by interaction that takes the form of purposeful and systematic discourse.

on social presence and higher-order learning:

>…establishing social presence was more heavily shaped through peer interaction. With regard to successful higher-order learning, … teaching presence in the form of facilitation is crucial in the success of online learning.

on facilitated discourse and success of online courses:

>The design features of successful online courses demonstrate structured discourse that facilitates clear discussion threads, avoids disjointed monologues, and moves the discussion through the phases of inquiry (levels of thinking).

on deep vs. surface vs. achievement-oriented learning:

>In a **deep** approach to learning, material is embraced and digested in the search for meaning. **Surface** learning employs the least amount of effort toward realizing the minimum required outcomes. Surface learners are motivated to complete the task rather than assimilate the learning. **Achievement** approaches to learning are reflected by an orientation to the external reward for demonstrating learning. Strategies for the achievement orientation focus on the activities that will result in the highest marks.

*but there isn’t a mention of strategies/motivations for deep learning…*

>All students are capable of employing any of the three approaches and do so as required by the learning environment; they choose strategies deemed to be most effective based on the requirements in the environment. Students can move from one approach to another and do so in response to the climate and requirements of the course.

*so, would the choice of platform do anything to the approach selected by students?*

Method:

>The study was conducted from January 2003 to April 2004. It administered the Study Process Questionnaire to the online course participants (seventy-five students participated) to measure changes in how graduate students choose to strategize their learning in a particular learning setting.

***DN:** wait. the entire data set was a survey sent by email to students? no analysis of the actual online learning?*

Discussion:

>High levels of learning are dependent less on the quantity of interaction than on the quality, or substance, of interaction. That is, social presence may be a necessary but insufficient precondition for creating a community of inquiry and encouraging deep approaches to learning.

.

>Teaching presence must be available, either from the facilitator or the other students, to transition from social to cognitive presence.

.

>It appears that teaching presence contributes to the adoption of a deep approach to learning and that interaction by itself does not promote a deep approach to learning.

on social presence:

>What is critical to note here is that although education is certainly a social phenomenon, there is a much larger purpose of acquiring and extending societal knowledge. Social interaction and presence may create the condition for sharing and challenging ideas through critical discourse, but it does not directly create cognitive presence or facilitate a deep learning approach. High levels of learning are dependent less on the quantity of interaction than on the quality, or substance, of interaction. That is, social presence may be a necessary but insufficient precondition for creating a community of inquiry and encouraging deep approaches to learning.

on lurking (and why discourse analysis for online discussions is tricky):

>Meaningful engagement does not simply correspond to sending lots of messages. It may mean that a student is engaged vicariously by following the discussion, reflecting on the discourse, and actively constructing meaning individually. Ideally, interaction would be required to confirm understanding. However, students may be cognitively present while not interacting or engaged overtly. This reveals another challenge in understanding the qualitative nature of interaction in an online context.

on the community of inquiry model:

>Quality interaction and discourse for deep and meaningful learning must consider the confluence of social, cognitive, and teaching presence – that is, interaction among ideas, students, and the teacher. Teaching presence provides the structure (design) and leadership (facilitation/direction) to establish social and cognitive presence (i.e., community of inquiry). The community of inquiry model has proven to be a useful framework to analyze and understand interaction in an online educational environment.

.

>Understanding a complex concept such as interaction must be viewed from a comprehensive perspective. The community of inquiry framework defines the context that can support quality interaction and deep learning. A deep approach to learning must consider all three elements of the community of inquiry: social, cognitive, and teaching presence. The findings here suggest that neither social presence alone nor the surface exchange of information can create the environment and climate for deep approaches to learning and meaningful educational exchanges.

***DN:** Turns out, this paper is only tangentially related to what I’m looking for. Some very handy background, but no applicable methodology.*

  1. Garrison, D. R., & Anderson, T. (2003). E-learning in the 21st century: A framework for research and practice. London: RoutledgeFalmer.
  2. Moore, M. G. (1989). Three types of interaction. The American Journal of Distance Education, 3(2), 1–6.
  3. Moore, M. G. (1990). Recent contributions to the theory of distance education. Open Learning, 5(3), 10–15.