Understanding shortcut gestures on mobile touch devices

Touch gestures are becoming steadily more important with the ongoing success of touchscreen devices. Compared to traditional user interfaces, gestures have the potential to lower cognitive load and reduce the need for visual attention. However, gestures are currently defined by designers and developers, and it is questionable whether they meet all user requirements. In this paper, we present two exploratory studies that investigate how users would use unistroke touch gestures for shortcut access to a mobile phone's key functions. We study the functions that users want to access, the preferred activators for gesture execution, and the shapes of the user-invented gestures. We found that most gestures trigger applications, letter-shaped gestures are preferred, and gestures should be accessible from the lock screen, the wallpaper, and the notification bar. We conclude with a coherent, unambiguous set of gestures for the 20 most frequently accessed functions, which can inform the design of future gesture-controlled applications.
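To make the unistroke-shortcut idea concrete, the sketch below shows one common way such letter-shaped gestures can be matched to shortcut actions, in the spirit of the well-known $1 unistroke recognizer: resample each stroke to a fixed number of points, normalize position and scale, and pick the nearest template. This is a minimal illustration, not the paper's method; all names (Pt, ShortcutRecognizer, the template set) and the rejection threshold are hypothetical, and the sketch omits the rotation-invariance step of the full $1 recognizer.

```kotlin
import kotlin.math.hypot

data class Pt(val x: Double, val y: Double)

// Resample a stroke to n equidistant points so strokes drawn at
// different speeds and lengths become comparable.
fun resample(points: List<Pt>, n: Int = 32): List<Pt> {
    val pathLen = points.zipWithNext().sumOf { (a, b) -> hypot(b.x - a.x, b.y - a.y) }
    if (points.size < 2 || pathLen == 0.0) return List(n) { points.first() }
    val interval = pathLen / (n - 1)
    val out = mutableListOf(points.first())
    var d = 0.0
    var prev = points.first()
    for (cur in points.drop(1)) {
        var seg = hypot(cur.x - prev.x, cur.y - prev.y)
        while (d + seg >= interval && out.size < n) {
            val t = (interval - d) / seg
            val q = Pt(prev.x + t * (cur.x - prev.x), prev.y + t * (cur.y - prev.y))
            out.add(q)
            prev = q
            seg = hypot(cur.x - prev.x, cur.y - prev.y)
            d = 0.0
        }
        d += seg
        prev = cur
    }
    while (out.size < n) out.add(points.last())  // pad against floating-point drift
    return out
}

// Translate to the centroid and scale to a unit bounding box for
// position and size invariance.
fun normalize(points: List<Pt>): List<Pt> {
    val cx = points.sumOf { it.x } / points.size
    val cy = points.sumOf { it.y } / points.size
    val w = (points.maxOf { it.x } - points.minOf { it.x }).coerceAtLeast(1e-9)
    val h = (points.maxOf { it.y } - points.minOf { it.y }).coerceAtLeast(1e-9)
    return points.map { Pt((it.x - cx) / w, (it.y - cy) / h) }
}

// Mean point-to-point distance between two equally sampled strokes.
fun distance(a: List<Pt>, b: List<Pt>): Double =
    a.zip(b).sumOf { (p, q) -> hypot(p.x - q.x, p.y - q.y) } / a.size

// Hypothetical mapping from letter-shaped templates to shortcut names.
class ShortcutRecognizer(templates: Map<String, List<Pt>>) {
    private val normalized = templates.mapValues { (_, t) -> normalize(resample(t)) }

    fun recognize(stroke: List<Pt>): String? {
        val candidate = normalize(resample(stroke))
        val best = normalized.minByOrNull { (_, t) -> distance(candidate, t) } ?: return null
        // Reject weak matches; the 0.25 threshold is an illustrative guess.
        return if (distance(candidate, best.value) < 0.25) best.key else null
    }
}

fun main() {
    // Hypothetical template: a rough "L" stroke mapped to a shortcut name.
    val lShape = listOf(Pt(0.0, 0.0), Pt(0.0, 1.0), Pt(0.6, 1.0))
    val recognizer = ShortcutRecognizer(mapOf("open_messages" to lShape))
    // A larger, translated "L" drawn by the user still matches after normalization.
    val stroke = listOf(Pt(10.0, 10.0), Pt(10.0, 60.0), Pt(40.0, 60.0))
    println(recognizer.recognize(stroke))  // expected: open_messages
}
```

In a hypothetical deployment, such a recognizer would sit behind the activators the studies identify (lock screen, wallpaper, notification bar) and dispatch the matched shortcut name to the corresponding application launch.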
