User-defined gesture interaction for in-vehicle information systems

Gesture elicitation, a technique that emerged from the field of participatory design, has been widely applied to emerging interaction and sensing technologies in recent years. However, traditional gesture elicitation studies often suffer from gesture disagreement and legacy bias, and may fail to produce optimal gestures for a target system. This paper reports a research project on user-defined gestures for interacting with in-vehicle information systems. The main contribution of our research is a three-stage participatory design method for deriving more reliable gestures than traditional gesture elicitation methods. Using this method, we generated a set of user-defined gestures for secondary tasks in an in-vehicle information system. Drawing on our findings, we developed a set of design guidelines for freehand gesture design. We highlight the implications of this work for gesture elicitation across all gestural interfaces.