Towards more natural digital content manipulation via user freehand gestural interaction in a living room

Advances in dynamic gesture recognition technologies now make it possible to investigate freehand input techniques. This study examines how users manipulate digital content on a distant screen through hand-gesture interaction in a living-room environment. Although many prior studies have investigated freehand input techniques, we developed and applied a novel study methodology that combines an existing user elicitation approach with a conventional Wizard-of-Oz study involving an additional non-technical participant who provided feedback. The study surfaced numerous issues and design implications, not covered in previous work, for making freehand gesture interaction more natural in a living-room environment. Furthermore, we were able to observe how users' initially defined gestures changed over time.
