TagAlong: Informal Learning from a Remote Companion with Mobile Perspective Sharing

Questions often arise spontaneously in a curious mind, prompted by an observation of a new or unfamiliar environment. When an expert is on hand and ready to engage in dialog, this curiosity can be harnessed and converted into highly effective, intrinsically motivated learning. This paper investigates how this kind of situated informal learning can be realized in real-world settings with wearable technologies and the support of a remote learning companion. In particular, we seek to understand how different multimedia communication media affect the quality of interaction with a remote teacher, and how these remote interactions compare with face-to-face, co-present learning. A prototype system called TagAlong was developed with attention to features that facilitate dialog grounded in the wearer's visual environment, and it was built to work robustly in the wild, depending only on widely available components and infrastructure. A pilot study was performed to identify which characteristics matter most for successful interactions, as a basis for further system development and a future full-scale study. We conclude that system design must be informed by (i) an analysis of the attentional burdens the system imposes on both wearer and companion and (ii) knowledge of the strengths and weaknesses of co-present learning.
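
The abstract does not specify how TagAlong implements mobile perspective sharing; the sketch below is only an illustration of the kind of "widely available components and infrastructure" such a system could rely on. It streams JPEG-compressed frames from a wearable camera to a remote companion over a plain WebSocket. The endpoint URI, frame rate, and compression settings are hypothetical assumptions, not details from the paper.

```python
# Illustrative sketch only: share a wearer's camera view with a remote
# companion over a WebSocket. All endpoints and parameters are assumptions;
# TagAlong's actual implementation is not described in the abstract.
import asyncio
import cv2                # pip install opencv-python
import websockets         # pip install websockets

COMPANION_URI = "ws://example.org:8765/stream"  # placeholder companion endpoint
FRAME_INTERVAL = 0.2                            # ~5 fps to conserve mobile bandwidth

async def share_perspective() -> None:
    cap = cv2.VideoCapture(0)                   # wearable / head-mounted camera
    async with websockets.connect(COMPANION_URI) as ws:
        try:
            while True:
                ok, frame = cap.read()
                if not ok:
                    break
                # Compress each frame as JPEG so it travels over ordinary
                # cellular or Wi-Fi links without special infrastructure.
                ok, jpeg = cv2.imencode(".jpg", frame,
                                        [cv2.IMWRITE_JPEG_QUALITY, 60])
                if ok:
                    await ws.send(jpeg.tobytes())
                await asyncio.sleep(FRAME_INTERVAL)
        finally:
            cap.release()

if __name__ == "__main__":
    asyncio.run(share_perspective())
```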
