Investigating user tolerance for errors in vision-enabled gesture-based interactions

In this paper, we describe our investigation into user tolerance of recognition errors during hand-gesture interactions with visual displays. The study is based on our proposed interaction model for investigating gesture-based interactions, which focuses on three elements: interaction context, system performance, and user goals. In a Wizard of Oz experiment, we investigate how recognition accuracy rates and task characteristics influence user tolerance for gesture interactions in both desktop and ubiquitous computing scenarios. Results suggest that interaction context influences user tolerance more strongly than system performance alone: in a ubiquitous computing scenario, recognition error rates can potentially reach 40% before users abandon gestures in favour of an alternative interaction mode. Results also suggest that in a desktop scenario, traditional input methods are more appropriate than gestures.
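Because the experiment manipulates recognition error rates directly rather than relying on a live recognizer, the wizard must misclassify gestures at a controlled rate. The sketch below is a minimal, hypothetical illustration of such error injection; the gesture vocabulary, the wizard_recognize function, and the 0%/20%/40% conditions are illustrative assumptions and not the paper's actual protocol.

# Hypothetical sketch: inject a fixed recognition error rate into gesture outcomes,
# as a Wizard-of-Oz controller might do when simulating imperfect recognition.
import random

GESTURES = ["swipe_left", "swipe_right", "select", "cancel"]

def wizard_recognize(true_gesture: str, error_rate: float, rng: random.Random) -> str:
    """Return the true gesture, or a deliberately wrong one with probability error_rate."""
    if rng.random() < error_rate:
        wrong = [g for g in GESTURES if g != true_gesture]
        return rng.choice(wrong)   # simulated misrecognition
    return true_gesture            # simulated correct recognition

if __name__ == "__main__":
    rng = random.Random(42)
    trials = [rng.choice(GESTURES) for _ in range(1000)]
    for rate in (0.0, 0.2, 0.4):   # illustrative conditions up to the 40% rate discussed above
        errors = sum(wizard_recognize(g, rate, rng) != g for g in trials)
        print(f"target error rate {rate:.0%}: observed {errors / len(trials):.1%}")

Running the sketch confirms that the observed error rate per condition converges on the target rate, which is the property a wizard-controlled study of this kind depends on.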
