w.o. mitchell

a different kind of memorial – my son attends an elementary school named after one of Canada’s most cherished authors, W.O. Mitchell. There’s a photo of Mitchell in the school, and it looks like he was actually able to visit before his death in 1998. It’s pretty cool to wander the halls and see photos, plaques, and other things tying the students to Mitchell and to writing and creativity in general. A fantastic legacy to have left.

I was at the school this evening for a School Council Meeting, and was early so had the chance to wander around a bit before the meeting. It’s a really incredible school, and the staff and administration are amazing. It’s a vibrant little community of active learners.

More info on W.O. Mitchell http://en.wikipedia.org/wiki/W.O._Mitchell

2010/05/31: It’s Memorial Day in the U.S. Make a photo that symbolizes today and what it means to you. #ds197

Notes: Kanuka & Anderson. Online Social Interchange, Discord, and Knowledge Construction

Kanuka, H., & Anderson, T. (1998). Online Social Interchange, Discord, and Knowledge Construction. Journal of Distance Education, 13(1), 57-74.

This study presents the results of an exploratory multi-method evaluation study and transcript analysis of an online forum. The researchers used a constructivist interaction analysis model developed by Gunawardena, Lowe, and Anderson (1997)1.

Gunawardena et al.’s phases:

  1. Sharing/comparing of information
  2. Discovery and exploration of dissonance or inconsistency among the ideas, concepts, or statements advanced by different participants.
  3. Negotiation of meaning and/or co-construction of knowledge.
  4. Testing and modification of proposed synthesis or co-construction.
  5. Phrasing of agreement, statement(s), and applications of the newly constructed meaning.

Method:

The research study focused on the analysis of data obtained from participants in the online forum. We read postings, but did not participate in the forum. At the end of the two-week forum, an online survey was distributed to all participants and a transcript analysis was undertaken. Finally, a telephone survey was conducted with a stratified sample of participants.

Results:

the forum was perceived by the participants as successful in providing opportunities for reflection and exposure to multiple perspectives on topics that were relevant to the participants. There seemed less agreement, however, with the notion that the forum provided opportunity for application of new knowledge and deeper understanding of the issues.

Transcript analysis:

The unitizing process involved a coding operation that separated the participants’ online interactions (postings that fell in phases I through V) from other postings, such as the moderator’s summaries or other general announcements.

The transcript analysis procedure consisted of reading each message and assigning it to one or more phases. A message that contained two or more distinct ideas or comments was coded in two or more phases (the messages were coded independently by both researchers). Discrepancies were discussed, and a single coding was determined from these discussions.
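A rough sketch of the bookkeeping that kind of two-coder phase assignment involves (the message IDs, phase codings, and agreement check below are invented for illustration, not the authors’ actual procedure):

```python
# Sketch: two coders independently assign Gunawardena et al. phases (I-V)
# to each posting; matching codings are accepted, and discrepancies are
# flagged for the discussion step described above. All data is made up.

coder_a = {"msg01": {"I"}, "msg02": {"I", "II"}, "msg03": {"III"}}
coder_b = {"msg01": {"I"}, "msg02": {"II"}, "msg03": {"III", "IV"}}

agreed = {}
to_discuss = []

for msg_id, phases_a in coder_a.items():
    phases_b = coder_b[msg_id]
    if phases_a == phases_b:
        agreed[msg_id] = phases_a
    else:
        to_discuss.append((msg_id, phases_a, phases_b))

print(f"agreed on {len(agreed)} of {len(coder_a)} messages; "
      f"{len(to_discuss)} to resolve by discussion")
```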

So many of the messages wound up being categorized as Level 1 (basic knowledge) that a grounded theory method was added to the study.

Grounded theory provided a useful collection of strategies (such as constant comparison and analytic meaning) when little is known about a phenomenon—as was the case in this study where the focus was to investigate knowledge construction and social interaction in an online environment. Using grounded theory, we reassessed and then recategorized the postings.

DN: not sure what the addition of grounded theory does to the study. seems like they just threw up their hands, said “WTF?” and fell back on divining chicken entrails to get something out of the transcripts…

Two new categories were generated from the data: social interchange2 and social discord and knowledge construction3.

DN: the study turns out to not be directly applicable, but points to some things to watch for when coding – having too many things fall into the basic level or “other” category. How to design the coding to avoid having to fall back on grounded theory?

  1. Gunawardena, C.N., Lowe, C.A., & Anderson, T. (1997). Interaction analysis of a global on-line debate and the development of a constructivist interaction analysis model for computer conferencing. Journal of Educational Computing Research, 17(4), 395-429. []
  2. basically, participants talking to each other. strange, that this would be observed in an online discussion board, designed to facilitate participants, well, talking to each other… []
  3. there were a few instances of interaction between the forum participants that involved inconsistencies or contradictions in information and/or ideas that resulted in a new or changed perspective. the effect of cognitive dissonance []

Notes: Guan et al. Content analysis of online discussion on a senior-high-school discussion forum of a virtual physics laboratory

Guan, Y.H., Tsai, C.C., & Hwang, F.K. (2006). Content analysis of online discussion on a senior-high-school discussion forum of a virtual physics laboratory. Instructional Science, 34(4), 279-311.

In this study we content analyzed the online discussion of several senior-high-school groups on a forum of a virtual physics laboratory in Taiwan. The goal of our research was to investigate the nature of non-course-based online discussion and to find out some useful guidelines in developing such discussion forums for learning purposes.

DN: Studied extracurricular forum activity, not course-based. Results not applicable to what I need, but maybe some content analysis methodology could be useful…

Researchers adopted Henri’s framework and models1 (like the previous Garrison study). As a result, they used the same parameters:

The content analysis was conducted in terms of participation rate, social cues, interaction types, and cognitive and metacognitive skills.

Why look at non-course-based discussion boards? Getting at self-direction and lifelong learning:

The advantages of non-course-based online discussion lie in that the participants of the discussion are not limited to the members of a particular course, and the participation in the discussion is based on common interests shared by the participants. That is, the participation is totally voluntary and people may join or leave the discussion any time they want.

DN: How did they get ethical approval to gather data on minors in a public discussion board?

The discussion started with a message containing a question that could be posted by any participant. Whoever was interested in the topic could reply to it. The size of discussion groups ranged from 1 to 213 participants. The participation on the forum was voluntary.

Two moderators supervised the discussion forum. One was a physics professor at NTNU; the other was a senior-high-school physics teacher.

These moderators filtered content, removing objectionable/rude words and correcting misleading concepts/statements.

The content of a message was analyzed based on its idea(s). An idea expressed a complete thought, which might contain one or several sentences or even several paragraphs. A message might consist of more than one idea. The analysis was conducted according to five dimensions: the participation rate, social cues, interaction types, cognitive skills, and metacognitive skills.
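A minimal sketch of what a coding record for a single “idea” might look like along those five dimensions (field names and values are my own paraphrase of the dimensions listed above, not the paper’s actual coding scheme):

```python
# Sketch: one record per idea (the unit of analysis), coded on the five
# dimensions summarized above. All names and values are illustrative.
from dataclasses import dataclass, field

@dataclass
class IdeaCode:
    message_id: str
    relevant_to_topic: bool                   # feeds the participation-rate counts
    social_cues: list = field(default_factory=list)
    interaction_type: str = "independent"     # e.g. explicit / implicit / independent
    cognitive_skills: list = field(default_factory=list)
    metacognitive_skills: list = field(default_factory=list)  # evaluation, planning, regulation, self-awareness

example = IdeaCode(
    message_id="post-042",
    relevant_to_topic=True,
    cognitive_skills=["inference"],
)
print(example)
```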

Results:

Altogether we analyzed 575 messages containing 634 ideas, which were posted by 349 participants.

Overall, 19.72% of the ideas were not relevant to the subject under discussion. Only 11.49% of the ideas revealed metacognitive skills (i.e., those of ‘evaluation’, ‘planning’, ‘regulation’, and ‘self-awareness’), and 16.88% of the ideas did not reveal any cognitive or metacognitive components considered in the study.

There were two types of discussion board activities: NR (Non-Required), where a significant contribution was not required in order to participate, and R (Required), where a significant and guided contribution was required in order to gain access to the board. Active participation in the Required group was low, even though the number of participants was high (they needed to participate in order to gain access, but didn’t care about the board). The Non-Required board had much higher active participation (number of posts) although fewer participants (because it was an optional discussion board).

Also, the Required board posts seemed to be lower, meta-cognitively, than the Non-Required board posts, although Required posts showed higher cognitive levels. Not sure of the value of this distinction – non-required activities are ~more metacognitive, but required activities are ~more cognitive? so many variables. so many generalizations.

Overall, whether online discussion can help people to learn more deeply depends on the quality of discussion, which can be influenced by the features of participants and discussion topics, the interactions between the participants, the purpose, design and organization of the discussion forums, and not least the moderators coordinating the discussion.2

  1. Henri, F. (1992). Computer conferencing and content analysis. In A.R. Kaye, ed., Collaborative learning through computer conferencing: The Najaden papers, pp. 115–136. New York: Springer. []
  2. Duh. []

testing markdown

getting tired of tediously fixing html in posts, so I’m trying out markdown.

what is markdown?

Markdown is a text-to-HTML conversion tool for web writers. Markdown allows you to write using an easy-to-read, easy-to-write plain text format, then convert it to structurally valid XHTML (or HTML).

why markdown?

I like the idea of not having to futz around with html. But much of the markdown syntax feels like simple proxies for html, with the added dependency on an additional plugin for rendering. Not sure I like that. Not sure it’ll be worth relearning the wheel…

testing

Lists?

  • item one
  • item two
    • item two point one
    • item two point two
  • item three

How about indexed lists?

  1. item one
  2. item two
  3. item three

interesting. No <ul><li>blah</li></ul> crap. nice.
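For reference, the raw Markdown that produces lists like the ones above (plain Markdown syntax, nothing plugin-specific):

```
* item one
* item two
    * item two point one
    * item two point two
* item three

1. item one
2. item two
3. item three
```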

dealing with images looks a bit messy, but whatever.
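For the record, Markdown’s image syntax looks like this (the path and alt text here are just placeholders):

```
![alt text](/path/to/image.jpg "optional title")
```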

Does it play nicely with the footnote plugin1 I use? (apparently, but it doesn’t like embedding Markdown link syntax within the footnote plugin markup… not the end of the world. just have to remember to use regular <a href="...">link</a> syntax within footnotes.)

Seems pretty streamlined, almost like wiki syntax. What the world really needs is yet another markup language with its own syntax…

  1. WP-Footnotes []

more on going stealth online

I’ve been trying to extricate myself from Google’s All Seeing Gaze. (for more info on why, see this article linked by @brlamb).

There are plugins and opt-out cookies etc… but all of those work only in the browser. Often, in just a specific browser. I think I’ve found a better way. No opt-out. Works for any app that touches The Tubes.

Just modify your /etc/hosts file to include the contents of this great shared .hosts file. All requests for nefarious tracking servers will be dumped to 127.0.0.1 (your own computer) rather than routed out to The Big Snoops In The Ether. Some semblance of privacy, without having to opt out in every browser you use.

The sample file had ad.doubleclick.net commented out because it breaks sears.com and other sites who somehow route actual content through the ad tracking network. I say, if a site is that evilly designed, screw ‘em. I’ve uncommented the line and am blocking all requests for known doubleclick servers.
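To give a sense of what those entries look like: each line points a tracking hostname at the local machine. The doubleclick line is the one mentioned above; the other hostnames are placeholders, since the real list comes from the shared file.

```
# excerpt from /etc/hosts: requests to these hosts never leave the machine
127.0.0.1    ad.doubleclick.net
127.0.0.1    tracker.example.com
127.0.0.1    ads.example.net
```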

Also, I switched my DNS away from the convenient and fast Google DNS servers. Sure, they’re fast, but using their DNS servers means they’re able to see everything I do online, no matter what app, no matter what protocol. No, thanks.

Finally, I’ve stopped using Google Quick Search Box. It’d probably be OK to just turn off the “send usage data to Google” and “suggest web pages…” settings, but I’m reverting to just using Spotlight instead. It’s local. It doesn’t report stuff to The Cloud.

Notes: Garrison et al. Facilitating cognitive presence in online learning: Interaction is not enough.

Garrison, D.R., & Cleveland-Innes, M. (2005). Facilitating cognitive presence in online learning: Interaction is not enough. The American Journal of Distance Education, 19(3), 133-148.

This study assessed the depth of online learning, with a focus on the nature of online interaction in four distance education course designs.

Article provides a good background to course design, deep/surface/achievement-oriented learning. The study used a survey (Study Process Questionnaire) to compare changes in learning strategies selected by 75 students in 4 courses in different subjects and levels.

Interaction is seen as central to an educational experience and is a primary focus in the study of online learning. The focus on interaction in online learning emerges from the potential and properties of new technologies to support sustained educational communication. Communication and Internet technologies provide a high degree of communicative potential through asynchronous interaction design options (Garrison and Anderson 2003).1 From an access perspective, participants are able to maintain engagement in a community of learners when and where they choose.

.

The purpose of an educational experience, whether it is online, face-to-face, or a blending of both, is to structure the educational experience to achieve defined learning outcomes. In this context, interaction must be more structured and systematic. A qualitative dimension is introduced where interaction is seen as communication with the intent to influence thinking in a critical and reflective manner.

.

Moore (19892, 19903) was one of the first to focus on interaction issues in distance education. He identified transactional distance as consisting of dialogue (i.e., interaction) and structure (i.e., design).

and on why I’ll get to delve into qualitative discourse analysis…

An interactive community of learners is generally considered the sine qua non of higher education. However, interaction is not a guarantee that students are cognitively engaged in an educationally meaningful manner. High levels of interaction may be reflective of group cohesion, but it does not directly create cognitive development or facilitate meaningful learning and understanding. Interaction directed to cognitive outcomes is characterized more by the qualitative nature of the interaction and less by quantitative measures. There must be a qualitative dimension characterized by interaction that takes the form of purposeful and systematic discourse.

on social presence and higher-order learning:

…establishing social presence was more heavily shaped through peer interaction. With regard to successful higher-order learning, … teaching presence in the form of facilitation is crucial in the success of online learning.

on facilitated discourse and success of online courses:

The design features of successful online courses demonstrate structured discourse that facilitates clear discussion threads, avoids disjointed monologues, and moves the discussion through the phases of inquiry (levels of thinking).

on deep vs. surface vs. achievement-oriented learning:

In a deep approach to learning, material is embraced and digested in the search for meaning. Surface learning employs the least amount of effort toward realizing the minimum required outcomes. Surface learners are motivated to complete the task rather than assimilate the learning. Achievement approaches to learning are reflected by an orientation to the external reward for demonstrating learning. Strategies for the achievement orientation focus on the activities that will result in the highest marks.

but there isn’t a mention of strategies/motivations for deep learning…

All students are capable of employing any of the three approaches and do so as required by the learning environment; they choose strategies deemed to be most effective based on the requirements in the environment. Students can move from one approach to another and do so in response to the climate and requirements of the course.

so, would the choice of platform do anything to the approach selected by students?

Method:

The study was conducted from January 2003 to April 2004. It administered the Study Process Questionnaire to the online course participants (seventy-five students participated) to measure changes in how graduate students choose to strategize their learning in a particular learning setting.

DN: wait. the entire data set was a survey sent by email to students? no analysis of the actual online learning?

Discussion:

High levels of learning are dependent less on the quantity of interaction than on the quality, or substance, of interaction. That is, social presence may be a necessary but insufficient precondition for creating a community of inquiry and encouraging deep approaches to learning.

.

Teaching presence must be available, either from the facilitator or the other students, to transition from social to cognitive presence.

.

It appears that teaching presence contributes to the adoption of a deep approach to learning and that interaction by itself does not promote a deep approach to learning.

on social presence:

What is critical to note here is that although education is certainly a social phenomenon, there is a much larger purpose of acquiring and extending societal knowledge. Social interaction and presence may create the condition for sharing and challenging ideas through critical discourse, but it does not directly create cognitive presence or facilitate a deep learning approach. High levels of learning are dependent less on the quantity of interaction than on the quality, or substance, of interaction. That is, social presence may be a necessary but insufficient precondition for creating a community of inquiry and encouraging deep approaches to learning.

on lurking (and why discourse analysis for online discussions is tricky):

Meaningful engagement does not simply correspond to sending lots of messages. It may mean that a student is engaged vicariously by following the discussion, reflecting on the discourse, and actively constructing meaning individually. Ideally, interaction would be required to confirm understanding. However, students may be cognitively present while not interacting or engaged overtly. This reveals another challenge in understanding the qualitative nature of interaction in an online context.

on the community of inquiry model:

Quality interaction and discourse for deep and meaningful learning must consider the confluence of social, cognitive, and teaching presence – that is, interaction among ideas, students, and the teacher. Teaching presence provides the structure (design) and leadership (facilitation/direction) to establish social and cognitive presence (i.e., community of inquiry). The community of inquiry model has proven to be a useful framework to analyze and understand interaction in an online educational environment.

.

Understanding a complex concept such as interaction must be viewed from a comprehensive perspective. The community of inquiry framework defines the context that can support quality interaction and deep learning. A deep approach to learning must consider all three elements of the community of inquiry: social, cognitive, and teaching presence. The findings here suggest that neither social presence alone nor the surface exchange of information can create the environment and climate for deep approaches to learning and meaningful educational exchanges.

DN: Turns out, this paper is only tangentially related to what I’m looking for. Some very handy background, but no applicable methodology.

  1. Garrison, D. R., and T. Anderson. 2003. E-Learning in the 21st century: A framework for research and practice. London: Routledge Falmer. []
  2. Moore, M. G. 1989. Three types of interaction. The American Journal of Distance Education 3 (2): 1–6. []
  3. Moore, M.G. 1990. Recent contributions to the theory of distance education. Open Learning 5 (3): 10–15. []

Notes on Hara et al. Content analysis of online discussion in an applied educational psychology course

Hara, N., Bonk, C.J., & Angeli, C. (2000). Content analysis of online discussion in an applied educational psychology course. Instructional Science, 28(2), 115-152.

The study examined a graduate-level psychology course that used online discussion as a core graded activity. The researchers looked at:

  1. student participation rates
  2. electronic participation patterns (what form of interaction takes place when led by students? does it change over time?)
  3. social cues within the messages (“it’s my birthday.” etc…)
  4. cognitive & metacognitive components of student messages
  5. depth of processing – surface or deep – within message posts

While we were ultimately interested in how a community of learning can be built using online discussion, this study was more specifically focused on the social and cognitive processes exhibited in the electronic transcripts as well as the interactivity patterns among the students.

Content analysis was used to analyze the online discussion – “this particular study is more concerned with analysis and categorization of text than with the process of communication or specific speech acts, as in discourse analysis; it primarily relies on content analysis methodology.”

As indicated, Henri (1992)1 proposes an analytical framework to categorize five dimensions of the learning process evident in electronic messages: student participation, interaction patterns, social cues, cognitive skills and depth of processing, and metacognitive skills and knowledge.

By combining Henri’s criteria related to message interactivity (i.e., explicit, implicit, and independent commenting) and Howell-Richardson and Mellar’s visual representation of message interaction, we created weekly conference activity graphs illustrating the associations between online messages. Quantitative data, such as the number and length of student messages, were also collected.

DN: This combination of methods meant researchers could focus on content analysis while also looking at interaction patterns. Straight discourse analysis would have abstracted the content away. I need to think about how to set this up. I think discourse analysis (speech acts, interaction types) would get at what I’m looking for, but maybe a layer of content analysis is needed too…

Since any message could conceivably contain several ideas, the base “Unit” of the analysis was not a message, but a paragraph.
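A quick sketch of what that unitizing step could look like in practice. Splitting on blank lines is my assumption of how “paragraph” gets operationalized, and the messages themselves are invented; this is not the authors’ actual procedure.

```python
# Sketch: unitize messages into paragraph-level units (the unit of analysis
# in Hara et al.) before coding, splitting on blank lines. Example data only.

messages = {
    "week2-msg07": "First idea about the reading...\n\nA second, separate idea...",
    "week2-msg08": "A single-paragraph reply.",
}

units = []
for msg_id, text in messages.items():
    paragraphs = [p.strip() for p in text.split("\n\n") if p.strip()]
    for i, para in enumerate(paragraphs, start=1):
        units.append({"message": msg_id, "unit": i, "text": para})

print(f"{len(messages)} messages -> {len(units)} units to code")
```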

Quantitative data:

  • researchers looked at server logs to see frequency of posts, total number of posts, and weekly posts/activity.

Qualitative data:

  • interaction patterns in the computer-mediated conferencing were mapped out. (explicit interaction, implicit interaction, independent statement)
  • social cues apparent in the FirstClass dialogue were coded. (social cues defined as “statement or part of a statement not related to formal content of subject matter.”)
  • both the cognitive and metacognitive skills embedded in these electronic conversations were analyzed to better understand the mental processes involved in the discussions. (using a framework based on Bloom’s Taxonomy)
  • each message was evaluated for the depth of processing, surface or deep.

Results:

  • student-centred – students dominated the discussions, with relatively little contribution from instructor
  • most students only posted the one entry required per week, but they were long and substantive posts.
  • several unique patterns of interaction emerged:
    1. the second week had “starter-centered” interaction;
    2. the fourth week had “scattered” interaction, in part, because no one assumed the role of the starter in the discussions that took place that week;
    3. the eighth week had “synergistic” interaction (i.e., it had a cohesive feel with the interaction of the participants creating a combined effect that perhaps was greater than the sum of the individual efforts); and
    4. the tenth week had “explicit” interaction.
  • social cue findings: “In this graduate level course, the number of social cues decreased as the semester progressed. Moreover, student messages gradually became less formal. These findings might be attributed to the fact that students felt more comfortable with each other as the semester continued (Kang, 1998).”2
  • cognitive skill findings: “in this particular research project, most of the messages were fairly deep in terms of information processing. Of the four weeks of detailed analysis, 33 percent of student messages were at the surface level, 55 percent were at an in-depth level of processing, and an additional 12 percent contained aspects of both surface and deep processing.”

Discussion:

“It appears that by structuring electronic learning activity, students will have more time to reflect on course content and make in-depth cognitive and social contributions to a college class than would be possible in a traditional classroom setting.”

“Not only did students share knowledge, but content analyses indicated that students were processing course information at a fairly high cognitive level. Social cues took a back seat to student judgment, inferencing, and clarification.”

  1. Henri, F. (1992). Computer conferencing and content analysis. In A.R. Kaye, ed., Collaborative learning through computer conferencing: The Najaden papers, pp. 115–136. New York: Springer. []
  2. Kang, I. (1998). The use of computer-mediated communication: Electronic collaboration and interactivity. In C.J. Bonk & K.S. King, eds, Electronic collaborators: Learner-centered technologies for literacy, apprenticeship, and discourse, pp. 315–337. Mahwah, NJ: Erlbaum. []