Stroke-Gesture Input for People with Motor Impairments: Empirical Results & Research Roadmap

We examine the articulation characteristics of stroke-gestures produced on touchscreens by people with upper body motor impairments, as well as how accurately popular classification techniques, such as the $-family of recognizers, classify those gestures. Our results on a dataset of 9,681 gestures collected from 70 participants reveal that stroke-gestures produced by people with motor impairments are recognized less accurately than the same gesture types produced by people without impairments, yet still accurately enough for practical purposes (93.0%); are geometrically similar to the gestures produced by people without impairments; but take considerably longer to produce (3.4s vs. 1.7s) and are articulated less consistently (-49.7%). We outline a research roadmap for accessible gesture input on touchscreens for users with upper body motor impairments, and we make our large gesture dataset publicly available to the community.
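For context, the $-family techniques evaluated here are template matchers: a candidate gesture is resampled to a fixed number of points, normalized for position and scale, and assigned the label of its nearest stored template. The Python sketch below is a minimal illustration of that pipeline, not the evaluated implementation; the function names, the 64-point resampling resolution, and the omission of $1's rotation-alignment and golden-section-search steps are our own simplifications.

```python
import math

N = 64  # resampling resolution; N = 64 is a common choice in $-family recognizers

def path_length(points):
    """Total Euclidean length of the polyline through `points`."""
    return sum(math.dist(points[i - 1], points[i]) for i in range(1, len(points)))

def resample(points, n=N):
    """Resample a stroke to n points spaced equidistantly along its path."""
    interval = path_length(points) / (n - 1)
    pts = list(points)
    new_points = [pts[0]]
    dist_so_far = 0.0
    i = 1
    while i < len(pts):
        d = math.dist(pts[i - 1], pts[i])
        if d > 0 and dist_so_far + d >= interval:
            t = (interval - dist_so_far) / d
            q = (pts[i - 1][0] + t * (pts[i][0] - pts[i - 1][0]),
                 pts[i - 1][1] + t * (pts[i][1] - pts[i - 1][1]))
            new_points.append(q)
            pts.insert(i, q)  # q becomes the start of the next segment
            dist_so_far = 0.0
        else:
            dist_so_far += d
        i += 1
    while len(new_points) < n:  # guard against floating-point shortfall
        new_points.append(pts[-1])
    return new_points[:n]

def normalize(points):
    """Translate the centroid to the origin and scale uniformly so the
    larger side of the bounding box is 1."""
    xs, ys = zip(*points)
    scale = max(max(xs) - min(xs), max(ys) - min(ys)) or 1.0
    cx, cy = sum(xs) / len(xs), sum(ys) / len(ys)
    return [((x - cx) / scale, (y - cy) / scale) for x, y in points]

def gesture_distance(a, b):
    """Mean point-wise Euclidean distance between two processed gestures."""
    return sum(math.dist(p, q) for p, q in zip(a, b)) / len(a)

def recognize(candidate, templates):
    """Nearest-neighbor classification: return the label of the closest template."""
    c = normalize(resample(candidate))
    return min(templates, key=lambda label: gesture_distance(c, templates[label]))
```

Given a dictionary of preprocessed templates, e.g. templates = {'circle': normalize(resample(circle_points)), ...}, recognize(candidate, templates) returns the best-matching label. The published recognizers build on this same pipeline: $1 adds rotation invariance, $N handles multistroke input, and $P matches gestures as unordered point clouds.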
