It Made More Sense: Comparison of User-Elicited On-skin Touch and Freehand Gesture Sets

Research on gestural control interfaces is becoming increasingly widespread in the pursuit of natural user interfaces. Freehand and on-skin touch gestures are two popular gesture types because they eliminate the need for an intermediary device. Previous studies have investigated these modalities separately using user-elicitation methods; however, a direct comparison between them is missing from the field. In this study, we compare user-elicited on-skin touch and freehand gesture sets to explore users' preferences. To this end, we conducted an experiment in which we compared 13 gestures for controlling computer tasks in each set. Eighteen young adults participated in our study and completed a survey consisting of the NASA Task Load Index and four additional items on social acceptability, learnability, memorability, and goodness. The results show that on-skin touch gestures were less physically demanding and more socially acceptable than freehand gestures; on the other hand, freehand gestures were more intuitive than on-skin touch gestures. Overall, our results suggest that different gesture types may be useful in different scenarios. Our findings can help designers and developers make better-informed decisions when designing new gestural interfaces for a variety of devices.
