Influence of cultural factors on freehand gesture design

Abstract In the design of gesture-based user interfaces, gesture elicitation methods are often used to obtain the gesture preferences of end users. Users' gesture choices can be affected by various factors, including their cultural backgrounds. Considering cultural factors is therefore important when designing gesture-based interfaces for systems that target users from different cultures. However, little empirical research has been conducted on the impact of culture on gesture commands. This paper reports a study comprising a series of three experiments on gesture preferences for tasks in three application domains, with participants from two cultures. We find that some gesture choices are strongly influenced by participants' cultural backgrounds. We discuss the characteristics of those gestures that exhibit cultural dependence and of those tasks that do not. We also provide design guidelines for freehand gesture-based interfaces.
