User-centered gesture development in TV viewing environment

Recent advances in interaction technologies make it possible for people to use freehand gestures in application domains such as virtual reality, augmented reality, ubiquitous computing, and smart rooms. While a number of applications and systems have been developed to support gesture-based interaction, it is unclear what design processes they have adopted. Given the diversity of freehand gestures and the lack of design guidance for gesture-based interaction, we believe that a clear and systematic design process can improve the quality of gesture-based interaction. In this paper, we report a study that applies a user-centered approach to gesture development, covering requirement gathering and functionality definition, gesture elicitation, gesture design, and usability evaluation. Our results show that each of these stages raises issues that must be taken into account when designing freehand gesture interfaces. Involving actual users, especially in the environment in which they will use the final system, often leads to improved user experience and satisfaction. Finally, we highlight the implications of this work for the development of gesture-based applications in general.
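
The gesture-elicitation step named above is commonly quantified with an agreement score over participants' gesture proposals, a metric introduced in Wobbrock et al.'s surface-gesture elicitation work. As an illustration only, and not necessarily the analysis used in this particular study, the following minimal Python sketch computes agreement scores for hypothetical TV-control referents and gesture labels (all data here is invented for demonstration):

```python
from collections import Counter

def agreement_score(proposals):
    """Agreement score for one referent (Wobbrock et al., CHI 2009):
    the sum, over each group of identical gesture proposals, of
    (group size / total number of proposals) squared."""
    total = len(proposals)
    return sum((count / total) ** 2 for count in Counter(proposals).values())

# Hypothetical elicitation data: gesture labels proposed by 8 participants
# for two TV-control referents ("volume up" and "next channel").
elicited = {
    "volume up":    ["raise palm", "raise palm", "raise palm", "thumb up",
                     "raise palm", "thumb up", "raise palm", "circle"],
    "next channel": ["swipe left", "swipe left", "swipe right", "swipe left",
                     "point right", "swipe left", "swipe left", "swipe right"],
}

for referent, gestures in elicited.items():
    print(f"{referent}: A = {agreement_score(gestures):.3f}")
```

Higher scores indicate stronger consensus among participants; in elicitation studies, referents with low agreement are typically the ones handed back to designers rather than resolved by the most frequent user proposal.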
