Augmenting on-screen instructions with micro-projected guides: when it works, and when it fails

We present a study that evaluates the effectiveness of augmenting on-screen instructions with micro-projection for manual task guidance. Unlike prior work, which replaced on-screen instructions with alternative modalities (e.g., head-mounted displays), our approach retains the on-screen instructions and adds projected guides. In our study, 30 participants completed 10 trials each of 11 manual tasks chosen to represent a set of common task components (e.g., cutting, folding) found in many everyday activities such as crafts, cooking, and hobby electronics. Fifteen participants received only on-screen instructions, and 15 received both on-screen and micro-projected instructions. Also in contrast to prior work, which focused only on whole tasks, our study examines the benefit of augmenting common task instructions. The augmented instructions improved participants' performance overall; however, in cases where projected guides and physical objects visually interfered, the projected elements increased errors. Our results demonstrate that examining effectiveness at the instruction level is both useful and necessary, and they provide insight into the design of systems that help users perform everyday tasks.
