some light reading on technology and robots as tutors

  • Bartneck, C., Kulić, D., Croft, E., & Zoghbi, S. (2009). Measurement Instruments for the Anthropomorphism, Animacy, Likeability, Perceived Intelligence, and Perceived Safety of Robots. International Journal of Social Robotics, 1(1), 71-81.
  • Burgard, W., Cremers, A. B., Fox, D., & Hähnel, D. (1998). The interactive museum tour-guide robot. Aaai/Iaai.
  • Castellano, G., Paiva, A., Kappas, A., Aylett, R., Hastie, H., Barendregt, W., et al. (2013). Towards Empathic Virtual and Robotic Tutors. In Artificial Intelligence in Education (Vol. 7926, pp. 733-736). Berlin, Heidelberg: Springer Berlin Heidelberg.
  • Corrigan, L. J., Peters, C., & Castellano, G. (2013). Identifying Task Engagement: Towards Personalised Interactions with Educational Robots. 2013 Humaine Association Conference on Affective Computing and Intelligent Interaction (ACII), 655-658.
  • Dautenhahn, K. (2007). Socially intelligent robots: dimensions of human-robot interaction. Philosophical Transactions of the Royal Society of London B: Biological Sciences, 362(1480), 679-704.
  • Ganeshan, K. (2007). Teaching Robots: Robot-Lecturers and Remote Presence (Vol. 2007, pp. 252-260).
  • Gockley, R., Bruce, A., Forlizzi, J., Michalowski, M., Mundell, A., Rosenthal, S., et al. (2005). Designing robots for long-term social interaction. 2005 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 1338-1343.
  • Han, J. (2010). Robot-aided learning and r-learning services.
  • Han, J., Hyun, E., Kim, M., Cho, H., Kanda, T., & Nomura, T. (2009). The Cross-cultural Acceptance of Tutoring Robots with Augmented Reality Services. Jdcta.
  • Harteveld, C., & Sutherland, S. C. (2015). The Goal of Scoring: Exploring the Role of Game Performance in Educational Games. Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems (CHI '15) (pp. 2235-2244). New York, New York, USA: ACM.
  • Howley, I., Kanda, T., Hayashi, K., & Rosé, C. (2014). Effects of social presence and social role on help-seeking and learning. Proceedings of the 2014 ACM/IEEE International Conference on Human-Robot Interaction (HRI '14) (pp. 415-422). New York, New York, USA: ACM.
  • Kanda, T., Hirano, T., Eaton, D., & Ishiguro, H. (2004). Interactive Robots as Social Partners and Peer Tutors for Children: A Field Trial. Human-Computer Interaction, 19(1), 61-84.
  • Kardan, S., & Conati, C. (2015). Providing Adaptive Support in an Interactive Simulation for Learning: An Experimental Evaluation. Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems (CHI '15) (pp. 3671-3680). New York, New York, USA: ACM.
  • Kennedy, J., Baxter, P., & Belpaeme, T. (2015). The Robot Who Tried Too Hard: Social Behaviour of a Robot Tutor Can Negatively Affect Child Learning. Proceedings of the Tenth Annual ACM/IEEE International Conference on Human-Robot Interaction (HRI '15) (pp. 67-74). New York, New York, USA: ACM.
  • Kenny, P., Hartholt, A., Gratch, J., & Swartout, W. (2007). Building Interactive Virtual Humans for Training Environments. Presented at the Proceedings of I/ITSEC.
  • Kopp, S., Jung, B., Lessmann, N., & Wachsmuth, I. (2003). Max – A Multimodal Assistant in Virtual Reality Construction. Ki.
  • Lee, D.-H., & Kim, J.-H. (2010). A framework for an interactive robot-based tutoring system and its application to ball-passing training. 2010 IEEE International Conference on Robotics and Biomimetics (ROBIO) (pp. 573-578). IEEE.
  • Leyzberg, D., Spaulding, S., & Scassellati, B. (2014). Personalizing robot tutors to individuals' learning differences. Proceedings of the 2014 ACM/IEEE International Conference on Human-Robot Interaction (HRI '14) (pp. 423-430). New York, New York, USA: ACM.
  • Leyzberg, D., Spaulding, S., Toneva, M., & Scassellati, B. (2012). The physical presence of a robot tutor increases cognitive learning gains.
  • Lin, R., & Kraus, S. (2010). Can automated agents proficiently negotiate with humans? Communications of the ACM, 53(1), 78-88.
  • Mitnik, R., Recabarren, M., Nussbaum, M., & Soto, A. (2009). Collaborative robotic instruction: A graph teaching experience. Computers & Education, 53(2), 330-342.
  • Mubin, O., Stevens, C. J., Shahid, S., Mahmud, A. A., & Dong, J.-J. (2013). A review of the applicability of robots in education. Technology for Education and Learning, 1(1).
  • Nkambou, R., Belghith, K., Kabanza, F., & Khan, M. (2005). Supporting Training on a Robotic Simulator using a Flexible Path Planner. AIED.
  • Nomikou, I., Pitsch, K., & Rohlfing, K. J. (Eds.). (2013). Robot feedback shapes the tutor’s presentation: How a robot’s online gaze strategies lead to micro-adaptation of the human’s conduct. Interaction Studies, 14(2), 268-296.
  • Peterson, I. (1992). Looking-Glass Worlds. Science News, 141(1), 8-10, 15.
  • Rizzo, A., Lange, B., Buckwalter, J. G., Forbell, E., Kim, J., Sagae, K., et al. (n.d.). SimCoach: an intelligent virtual human system for providing healthcare information and support. International Journal on Disability and Human Development, 10(4).
  • Ros, R., Coninx, A., Demiris, Y., Patsis, G., Enescu, V., & Sahli, H. (2014). Behavioral accommodation towards a dance robot tutor. Proceedings of the 2014 ACM/IEEE International Conference on Human-Robot Interaction (HRI '14) (pp. 278-279). New York, New York, USA: ACM.
  • Saerbeck, M., Schut, T., Bartneck, C., & Janse, M. D. (2010). Expressive robots in education: varying the degree of social supportive behavior of a robotic tutor. Proceedings of the 28th International Conference on Human Factors in Computing Systems (CHI '10) (pp. 1613-1622). New York, New York, USA: ACM.
  • Satake, S., Kanda, T., Glas, D. F., Imai, M., Ishiguro, H., & Hagita, N. (2009). How to approach humans? Strategies for social robots to initiate interaction. Proceedings of the 4th ACM/IEEE International Conference on Human-Robot Interaction (HRI '09), 109-116.
  • Serholt, S., Basedow, C. A., Barendregt, W., & Obaid, M. (2014). Comparing a humanoid tutor to a human tutor delivering an instructional task to children. 2014 IEEE-RAS 14th International Conference on Humanoid Robots (Humanoids 2014), 1134-1141.
  • Shin, N., & Kim, S. (2007). Learning about, from, and with Robots: Students' Perspectives. RO-MAN 2007 – the 16th IEEE International Symposium on Robot and Human Interactive Communication, 1040-1045.
  • Swartout, W. (2010). Lessons Learned from Virtual Humans. AI Magazine, 31(1), 9-20.
  • The (human) science of medical virtual learning environments. (2011). Philosophical Transactions of the Royal Society B: Biological Sciences, 366(1562), 276-285.
  • Toombs, A. L., Bardzell, S., & Bardzell, J. (2015). The Proper Care and Feeding of Hackerspaces: Care Ethics and Cultures of Making. Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems (CHI '15) (pp. 629-638). New York, New York, USA: ACM.
  • Vollmer, A.-L., Lohan, K. S., Fischer, K., Nagai, Y., Pitsch, K., Fritsch, J., et al. (2009). People modify their tutoring behavior in robot-directed interaction for action learning. 2009 IEEE 8th International Conference on Development and Learning (pp. 1-6). IEEE.
  • Walters, M. L., Dautenhahn, K., Koay, K. L., Kaouri, C., Boekhorst, R., Nehaniv, C., et al. (2005). Close encounters: spatial distances between people and a robot of mechanistic appearance. 5th IEEE-RAS International Conference on Humanoid Robots, 2005., 450-455.
  • Yannier, N., Israr, A., Lehman, J. F., & Klatzky, R. L. (2015). FeelSleeve: Haptic Feedback to Enhance Early Reading. Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems (CHI '15) (pp. 1015-1024). New York, New York, USA: ACM.
  • You, S., Nie, J., Suh, K., & Sundar, S. S. (2011). When the robot criticizes you…: self-serving bias in human-robot interaction. Proceedings of the 6th International Conference on Human-Robot Interaction (HRI '11) (pp. 295-296). New York, New York, USA: ACM.

Giant Walkthrough Brain

I was lucky to be taken to a master's student seminar by Tatiana Karaman yesterday1, to see some of her related neuroanatomy projects, part of the Computational Media Design Program at the University of Calgary.

Tatiana sat through a 45-minute MRI head scan in order to get high quality 3D data to work with. She took the data and made a series of slices, which she then fed into a 3D printer. The quality of the prints wasn't quite what she was looking for, so she massaged the data and fed it into a laser cutter to make more robust plastic pieces. And wrote software to let people scan QR codes on the physical slices to get more information. As one does.
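
The QR-code step is easy to picture: each physical slice carries a code that encodes a URL pointing at an information page for that slice. A minimal sketch in Python — the slice IDs, domain, and `brain-slices` path are all hypothetical placeholders, and rendering the actual QR images would be handed off to a library such as `qrcode`, which is omitted here:

```python
# Map each physical brain slice to the URL its QR code should encode.
# The base URL and slice-numbering scheme are invented for illustration.
from urllib.parse import urljoin

BASE_URL = "https://example.org/brain-slices/"

def slice_url(slice_id: int) -> str:
    """Return the info-page URL to encode in the QR code for one slice."""
    return urljoin(BASE_URL, f"slice-{slice_id:03d}")

# One URL per printed slice; feeding these to a QR library (e.g. `qrcode`)
# would produce one printable code per physical piece.
urls = [slice_url(i) for i in range(1, 4)]
for u in urls:
    print(u)
```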


But, before getting to that stage, she was involved with a project to create a virtual Giant Walkthrough Brain, based on Joseph Bogen’s design from way back in 1972. He proposed a 60-storey model of a human brain (30 storeys above ground, 30 below) to allow people to walk through the brain and see various bits up close. Strangely, that didn’t prove to be feasible. Until Tatiana and her team built it in software, using the LINDSAY virtual human data.
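
For a sense of how big that is: taking a real brain as roughly 15 cm tall and a storey as roughly 3.5 m (both figures are my assumptions, not from Bogen's design), the 60-storey model works out to a magnification in the thousands:

```python
# Rough scale factor for a 60-storey walkthrough brain.
# Brain height (0.15 m) and storey height (3.5 m) are assumed values.
brain_height_m = 0.15
storey_height_m = 3.5
storeys = 60

model_height_m = storeys * storey_height_m          # 210 m tall
scale_factor = model_height_m / brain_height_m      # ~1400x life size
print(f"{model_height_m:.0f} m tall, about {scale_factor:.0f}x life size")
```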

Jay Ingram took that 3D model on tour in 2014, presenting an interpretive tour through the brain, complete with live music by Jay Ingram and The Free Radicals (and Tatiana running the brain tour live on the big screen). It was part of Beakerhead in Calgary that year, and won the 2014 Science in Society Communication Award from the Canadian Science Writers’ Association.

Since then, the Giant Walkthrough Brain software has been updated to support the Oculus VR headset, and for use in an immersive 3D CAVE environment.

I have to say – what a fantastic student project. Innovative science. Making art. Collaborating with peers. Interdisciplinary and transdisciplinary work. Amazing.

  1. thanks, Leanne!