Examining the need for visual feedback during gesture interaction on mobile touchscreen devices for kids

Surface gesture interaction styles on modern mobile touchscreen devices often depend on the platform and application. Some applications show a visual trace of gesture input as the user makes it, whereas others do not. Little prior work has examined the usability of visual feedback for surface gestures, especially for children. In this paper, we present results from an empirical study conducted with children, teens, and adults to explore characteristics of gesture interaction with and without visual feedback. We find that the gestures users of different ages generate with and without visual feedback diverge significantly, in ways that make them difficult to interpret. In addition, users prefer to see visual feedback. Based on these findings, we present several design recommendations for new surface gesture interfaces for children, teens, and adults on mobile touchscreen devices. In general, we recommend providing visual feedback, especially for children, wherever possible.
