Notes: Clarke & Kinne (2012). Asynchronous discussions as threaded discussions or blogs

Clarke, L., & Kinne, L. (2012). Asynchronous discussions as threaded discussions or blogs. Journal of Digital Learning in Teacher Education, 29, 4–13.

The article looked at students publishing online discussions as either Blackboard threaded discussions or WordPress blogs, and at their reported sense of community, etc…

Kinda perfect for use in my thesis.

But the article is embargoed from our library collection, and the ISTE website for the journal locks it behind a broken paywall. I’ve tried several times to buy the article, but can’t get near it.

Open access, people. Don’t lock your awesomeness behind a paywall. This article is perfect for my thesis, but won’t be used because I can’t get to it.

Notes: Jyothi, McAvinia & Keating: A visualisation tool to aid exploration of students’ interactions in asynchronous online communication

Jyothi, S., McAvinia, C., & Keating, J. (2012). A visualisation tool to aid exploration of students’ interactions in asynchronous online communication. Computers & Education, 58(1), 30–42. doi:10.1016/j.compedu.2011.08.026


This paper describes a visualisation tool to aid the analysis of online communication. The tool has two purposes: first, it can be used on a day-to-day basis by teachers or forum moderators to review the development of a discussion and to support appropriate interventions. Second, the tool can support research activities since the visualisations generated provide the basis for further qualitative and quantitative analysis of online dialogue.

The visualisation software is designed to encode interaction types simply and quickly. The software was tested and then used to analyse data from a sample of forums within the Moodle VLE. The paper discusses both the method of visualisation and analysis of the online interactions as a pilot for further research analysing interaction in discussion forums.


This paper describes the design and implementation of a diagnostic tool which provides simple visual representations of the exchanges in asynchronous discussion forum threads. The visual representation is shown within a webpage, with hyperlinked nodes displaying the body text of messages posted to discussion forums. These graphical images might assist a teacher or moderator to intervene in the discussions whenever necessary, and the visual representations of online discussions can support researchers undertaking further analysis.1

Analysing asynchronous discussions in online environments

Given the importance ascribed to dialogue and CMC in educational theory, it follows that a means of reviewing and potentially analysing CMC interactions would be useful to teachers and researchers, and research would benefit from an evidence base showing that online interactions had positive effects on students’ learning. However, the best ways of analysing CMC are not clear. Studies that have analysed the content of the online discussions are also limited. This may be due to the time required to perform such analyses (Hara, Bonk, & Angeli, 2000) and the lack of a reliable instrument or an analytical framework to analyse the online discussions. As Goodyear (2001) notes:

>Analysing the content of networked learning discussions is a troublesome research area and several commentators have remarked on the difficulty of connecting online texts to discourse to learning. (Goodyear, cited in Mehanna, 2004, p. 283)

on assessing online discussions:

Formal assessment offers one indication of students’ learning, and online dialogue may then be argued to have supported this. However, unless the method of assessment includes the forum discussion in some way, it is not usually clear where and how learning in forums may have happened. Course feedback and evaluation mechanisms, similarly, may highlight the use of discussion forums as a useful supplement or yield examples of how students have used them, but ‘use’ cannot be equated with learning. Some researchers have instead proposed treating forum messages as qualitative data, and thereby draw on qualitative methods for analysis.

why build a tool to automate analysis/visualization of discourse?

Qualitative methods may be time-consuming to use in the context of evaluating learning in CMC, even for people accustomed to using them as part of their research activities. The methodological difficulties of analysing discussion forum data are therefore compounded by the practical constraints of time and experience. These issues have wider implications for the evidence base in e-learning: it is difficult to build up case studies of appropriate and effective use of technology to enhance learning where practitioners lack the tools to make these studies.

[Screenshot from the paper]

So, VIMS looks pretty awesome at this… Unfortunately, I can’t seem to find a fracking thing about the tool itself…

WTF is VIMS? No project website found, but the paper describes it:

VIMS provides real-time, radial-tree visualisation of the forum interactions, realised using a combination of SVG (Scalable Vector Graphics), Perl, and JavaScript. Visualisation maps are presented as interactive scalable images, viewable using most web browsers; the version described here can be seamlessly incorporated into Moodle. The technologies combined in VIMS allow the visualisation to have ‘hot spots’, on which the mouse can hover to access full details of a message. There is a continuous link between the image and the web server, implemented using AJAX, which means that the visualisation updates as new messages are sent to the forum. An algorithm within the software depicts borders, differentiating between the threads of a discussion forum.
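The paper doesn’t release any source, but the radial-tree layout it describes is easy to sketch. Here’s my own toy reconstruction (Python, not the Perl/SVG stack VIMS actually uses; the tree input format is my assumption): each root post gets an equal angular wedge, and replies subdivide their parent’s wedge at a larger radius.

```python
import math

def radial_layout(tree, radius_step=100):
    """Assign (x, y) positions to forum messages in a radial tree.

    tree: dict mapping message id -> list of reply ids; thread roots
    are listed under the key None.  Roots sit at the first ring,
    replies at successively larger radii inside their parent's wedge.
    """
    positions = {}

    def place(node, depth, start, end):
        mid = (start + end) / 2
        r = depth * radius_step
        positions[node] = (r * math.cos(mid), r * math.sin(mid))
        kids = tree.get(node, [])
        if kids:
            span = (end - start) / len(kids)
            for i, kid in enumerate(kids):
                place(kid, depth + 1, start + i * span, start + (i + 1) * span)

    roots = tree.get(None, [])
    wedge = 2 * math.pi / max(len(roots), 1)
    for i, root in enumerate(roots):
        place(root, 1, i * wedge, (i + 1) * wedge)
    return positions
```

Wedge subdivision is what keeps threads visually separate at a glance, which seems to be what the paper’s border-drawing algorithm enforces; the coordinates could be dumped straight into SVG `<circle>` elements.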

and the visualizations look something like:

[Screenshot from the paper: example VIMS visualisation]

on the role of VIMS:

VIMS has considerable advantages as a visualisation tool. First, the discussions are shown in a systematic way, with the people starting the discussion placed at the first level. There is no on-screen clutter from message text and all threads in a discussion forum can be viewed at a glance. Navigation on-screen allows the discussion to be viewed as a whole, or for the viewer to zoom in on certain areas. One or more threads can be compared easily. This visual aid could help the instructor develop a collaborative environment, by aiding him/her to visualise the active and inactive participants, and therefore inform appropriate interventions.

It is important to acknowledge the limitations of the VIMS tool too: it is in essence a support for coding and management of the data, rather than offering in and of itself a new method for analysing that data. For such analysis, we need to consider the wider model used by Schrire or indeed to pursue existing qualitative methods. VIMS does not yet allow us a way to analyse the multi-modal nature of the student discourse in unmoderated forums, and the inclusion of images, sounds and other media which students are now accustomed to using. This is a further area of work we need to address, but one for which the other visualisation tools described in this paper are (similarly) unsuited.

Lots of other interesting papers cited in this one. Mine it.

But, I don’t understand why VIMS doesn’t appear to have a project website or any information available. Is it secret sauce?

  1. I’m wondering if this might be a useful way to display the discourse in the data I’m gathering…

Notes: Coulthard, M. (1974). Approaches to the Analysis of Classroom Interaction

Coulthard, M. (1974). Approaches to the analysis of classroom interaction. Educational Review, 26(3), 229–240.

On directing discourse:

>Participants with equal rights and status, as in everyday conversation, negotiate in very subtle and complex ways for the right to speak, to control the direction of the discourse and to introduce new topics. We therefore determined to reduce the number of variables by choosing a situation in which one of the participants has an acknowledged right to decide who will speak, when they will speak, what the topic of the discourse will be, and the general lines along which it will progress. The classroom was an ideal situation.

on linguistic analysis vs. educational analysis:

>For instance, Gallagher and Aschner (1963)1 and Taba et al (1964)2 both focus on thinking, defined as ‘an active transaction between the individual and the demands of his environment, which is neither fully controlled by environmental stimulation, nor wholly independent of some mediating interaction’. Their categories are attempts to analyse one of the purposes of the interaction, but are several stages removed from the linguistic data and cannot be directly related to it.

on linguistic description of classroom discourse:

>Verbal interaction inside the classroom differs markedly from desultory conversation in that its main purpose is to instruct and inform and one would expect this difference to be reflected in the way in which the discourse progresses. One of the functions of the teacher is to choose the topic, to decide how the topic will be subdivided into smaller units and to cope with digressions and misunderstandings.

on patterns of interaction in a face-to-face classroom:

>We expected eliciting exchanges to consist of a Teacher question followed by Pupil reply, T-P, T-P, T-P, but this was not the case— the structure is rather T-P-T, T-P-T, T-P-T. In other words, the teacher almost always has the last word, and has two turns to speak for every one pupil turn. This, of course, partly explains the consistent finding that teachers talk, on average, for two thirds of the talking time. The teacher asks a question, the pupil answers it and the teacher provides evaluative feedback before asking another question.

**DN:** does this pattern show up online? Is it different, based on the environment/platform?
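One way to chase the DN question would be to count exchange shapes directly in transcript (or forum) data. A rough sketch of my own, assuming turns have already been labelled ‘T’ (teacher) or ‘P’ (pupil):

```python
def exchange_patterns(turns):
    """Count teacher-pupil exchange shapes in an ordered list of
    speaker labels ('T' or 'P').  Tallies full T-P-T exchanges
    (question, answer, evaluative feedback) against bare T-P ones.
    """
    # collapse consecutive turns by the same speaker into single runs
    runs = [turns[0]] if turns else []
    for t in turns[1:]:
        if t != runs[-1]:
            runs.append(t)

    counts = {"T-P-T": 0, "T-P": 0}
    i = 0
    while i < len(runs) - 1:
        if runs[i] == "T" and runs[i + 1] == "P":
            if i + 2 < len(runs) and runs[i + 2] == "T":
                counts["T-P-T"] += 1
                i += 2  # the feedback 'T' can open the next exchange
            else:
                counts["T-P"] += 1
                i += 2
        else:
            i += 1
    return counts
```

Note how the feedback ‘T’ doubles as the next initiation, which is exactly why, per Coulthard, the teacher gets two turns for every pupil turn. Running the same counter over forum threads would show whether the platform breaks the pattern.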

  1. Gallagher, J. J., & Aschner, M. J. (1963). A preliminary report on analyses of classroom interaction. Merrill-Palmer Quarterly, 9.
  2. Taba, H., Levine, S., & Elzey, F. F. (1964). Thinking in Elementary School Children. Report, U.S. Department of Health, Education and Welfare Co-operative Research Project No. 1574. San Francisco State College.

Notes: Hara, Bonk & Angeli (2000). Content analysis of online discussion in an applied educational psychology course

Hara, N., Bonk, C. J., & Angeli, C. (2000). Content analysis of online discussion in an applied educational psychology course. Instructional Science, 28(2), 115–152.

The study looked at a graduate-level psychology course that used online discussion as a core graded activity. The researchers looked at:

  1. student participation rates
  2. electronic participation patterns (what form of interaction takes place when led by students? does it change over time?)
  3. social cues within the messages (“it’s my birthday.” etc…)
  4. cognitive & metacognitive components of student messages
  5. depth of processing – surface or deep – within message posts

While we were ultimately interested in how a community of learning can be built using online discussion, this study was more specifically focused on the social and cognitive processes exhibited in the electronic transcripts as well as the interactivity patterns among the students.

Content analysis was used to analyze the online discussion – “[since] this particular study is more concerned with analysis and categorization of text than with the process of communication or specific speech acts, as in discourse analysis, it primarily relies on content analysis methodology.”

As indicated, Henri (1992)1 proposes an analytical framework to categorize five dimensions of the learning process evident in electronic messages: student participation, interaction patterns, social cues, cognitive skills and depth of processing, and metacognitive skills and knowledge.

By combining Henri’s criteria related to message interactivity (i.e., explicit, implicit, and independent commenting) and Howell-Richardson and Mellar’s visual representation of message interaction, we created weekly conference activity graphs illustrating the associations between online messages. Quantitative data, such as the number and length of student messages, were also collected.

DN: This combination of methods meant researchers could focus on content analysis while also looking at interaction patterns. Straight discourse analysis would have abstracted the content away. I need to think about how to set this up. I think discourse analysis (speech acts, interaction types) would get at what I’m looking for, but maybe a layer of content analysis is needed too…

Since any message could conceivably contain several ideas, the base “Unit” of the analysis was not a message, but a paragraph.

Quantitative data:

  • researchers looked at server logs to see frequency of posts, total number of posts, and weekly posts/activity.

Qualitative data:

  • interaction patterns in the computer-mediated conferencing were mapped out. (explicit interaction, implicit interaction, independent statement)
  • social cues apparent in the FirstClass dialogue were coded. (social cues defined as “statement or part of a statement not related to formal content of subject matter.”)
  • both the cognitive and metacognitive skills embedded in these electronic conversations were analyzed to better understand the mental processes involved in the discussions. (using a framework based on Bloom’s Taxonomy)
  • each message was evaluated for the depth of processing, surface or deep.
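The coding setup above is easy to mock up. A toy sketch of my own – the unit-of-analysis choice and Henri’s interactivity category names come from the paper, but the classification heuristic here is a crude stand-in for what was actually human coding:

```python
import re

def paragraph_units(message):
    """Split a message into paragraph-level coding units, following
    Hara et al.'s choice of the paragraph (not the message) as the
    base unit of analysis."""
    return [p.strip() for p in re.split(r"\n\s*\n", message) if p.strip()]

def code_interaction(unit, participants):
    """Crude heuristic for Henri's interactivity categories:
    'explicit' if a unit names another participant, 'implicit' if it
    appears to echo earlier wording (here: quotes or reference
    phrases), otherwise an 'independent' statement."""
    lowered = unit.lower()
    if any(name.lower() in lowered for name in participants):
        return "explicit"
    if '"' in unit or "you said" in lowered or "as mentioned" in lowered:
        return "implicit"
    return "independent"
```

Even this crude version makes the point that counting units per category per week is mechanical; the hard, expensive part the paper flags is the human judgment behind each label.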


Findings:

  • student-centred – students dominated the discussions, with relatively little contribution from the instructor
  • most students only posted the one entry required per week, but they were long and substantive posts.
  • several unique patterns of interaction emerged:
    1. the second week had “starter-centered” interaction;
    2. the fourth week had “scattered” interaction, in part, because no one assumed the role of the starter in the discussions that took place that week;
    3. the eighth week had “synergistic” interaction (i.e., it had a cohesive feel with the interaction of the participants creating a combined effect that perhaps was greater than the sum of the individual efforts); and
    4. the tenth week had “explicit” interaction.
  • social cue findings: “In this graduate level course, the number of social cues decreased as the semester progressed. Moreover, student messages gradually became less formal. These findings might be attributed to the fact that students felt more comfortable with each other as the semester continued (Kang, 1998).”2
  • cognitive skill findings: “in this particular research project, most of the messages were fairly deep in terms of information processing. Of the four weeks of detailed analysis, 33 percent of student messages were at the surface level, 55 percent were at an in-depth level of processing, and an additional 12 percent contained aspects of both surface and deep processing.”


“It appears that by structuring electronic learning activity, students will have more time to reflect on course content and make in-depth cognitive and social contributions to a college class than would be possible in a traditional classroom setting.”

“Not only did students share knowledge, but content analyses indicated that students were processing course information at a fairly high cognitive level. Social cues took a back seat to student judgment, inferencing, and clarification.”

  1. Henri, F. (1992). Computer conferencing and content analysis. In A. R. Kaye, ed., Collaborative Learning Through Computer Conferencing: The Najaden Papers, pp. 115–136. New York: Springer.
  2. Kang, I. (1998). The use of computer-mediated communication: Electronic collaboration and interactivity. In C. J. Bonk & K. S. King, eds, Electronic Collaborators: Learner-centered Technologies for Literacy, Apprenticeship, and Discourse, pp. 315–337. Mahwah, NJ: Erlbaum.

the twitter effect

Rereading Alan’s post on his blog hiatus, where he takes a month off posting to comment elsewhere instead, I was struck (as always) by the patterns in activity he described. I decided to take a closer peek at the activity on my own blog – I’ve been thinking a lot about discourse analysis lately, so it’s at least partially non-navel-gazing.

Here’s the graph for the first few years of life for my blog. It started out as a private, personal outboard brain, then kind of took off with a life of its own.

[Image: a pretty graph, about nothing]

Interesting. This blog’s heyday was 2005–2006. A lifetime ago, in intartube years. Then twitter happened in January 2007. It would be _really_ interesting to run some latent content analysis on both posts and comments, to see if they’re different BT vs. AT (before twitter vs. after twitter). Are the activity patterns different? Is the content different? Linking patterns? etc… It’d be completely nonscientific, but fascinating nonetheless…
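The activity-pattern half of that comparison would be trivial to run. A sketch (assuming post dates pulled from the blog database; the January 2007 cutoff is from the twitter timing above):

```python
from datetime import date

def monthly_counts(post_dates):
    """Bucket a list of post dates into (year, month) counts."""
    counts = {}
    for d in post_dates:
        key = (d.year, d.month)
        counts[key] = counts.get(key, 0) + 1
    return counts

def before_after(post_dates, cutoff=date(2007, 1, 1)):
    """Mean posts per active month before vs. after a cutoff
    (twitter's arrival, January 2007)."""
    counts = monthly_counts(post_dates)
    before = [n for (y, m), n in counts.items() if date(y, m, 1) < cutoff]
    after = [n for (y, m), n in counts.items() if date(y, m, 1) >= cutoff]

    def avg(xs):
        return sum(xs) / len(xs) if xs else 0.0

    return avg(before), avg(after)
```

The content side (linking patterns, topics) would need real content analysis, but even this posts-per-month split would show whether the BT/AT drop in the graph is real or just eyeballing.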

Towards an analysis of discourse

Sinclair, J., & Coulthard, R. M. (1975). Towards an analysis of discourse: The English used by teachers and pupils. London: Oxford University Press.

The book describes research into teacher/pupil discourse – who controls the flow, what coded messages are being communicated, etc…

Four minimum criteria for producing a descriptive system as outlined in Sinclair (1973):1

1. The descriptive apparatus should be finite, or else one is not saying anything at all, and may be merely creating the illusion of classification.

2. The symbols or terms in the descriptive apparatus should be precisely relatable to their exponents in the data, or else it is not clear what one is saying.

3. The whole of the data should be describable; the descriptive system should be comprehensive. (have an “other” category)

4. There must be at least one impossible combination of symbols. (I don’t get this… maybe the point is that if every combination of symbols is possible, the description rules nothing out, and so isn’t really claiming anything about structure?)

5 major dimensions along which situations could vary:

1. Number and grouping of participants
2. Control
3. Copresence
4. Intended audience
5. Purpose

Sociolinguistic aspect…

Latent patterning.

A model for discourse
– orientation
– organization
– fit
– play
– assembly

  1. Sinclair, J. (1973). Linguistics in colleges of education. Dudley Educational Journal, 1(3). Also cited: Sinclair, J. (1972). A course in spoken English: Grammar. London: Oxford University Press.