Improving Gesture Recognition Accuracy on Touch Screens for Users with Low Vision

We contribute new results on gesture recognition aimed at improving the accessibility of touch screens for people with low vision. We examine the accuracy of popular recognizers for gestures produced by people with and without visual impairments, and we show that the user-independent accuracy of $P, the best recognizer among those evaluated, is low for people with low vision (83.8%), despite $P being very effective for gestures produced by people without visual impairments (95.9%). By carefully analyzing the gesture articulations produced by people with low vision, we inform key algorithmic revisions to the $P recognizer, which we call $P+. We show significant accuracy improvements with $P+ for gestures produced by people with low vision, from 83.8% to 94.7% on average and up to 98.2%, together with execution times up to 3x faster than those of $P.
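For readers unfamiliar with the baseline, the $P recognizer treats a gesture as an unordered point cloud: candidate and template are resampled to the same number of points, scale- and translation-normalized, and compared with a greedy point-to-point matching. The sketch below is a minimal, illustrative Python reimplementation of that pipeline (it follows the published $P description, not the $P+ revisions contributed here; all function names are our own):

```python
import math

def resample(points, n=32):
    """Resample a stroke to n roughly equidistant points along its path."""
    pts = [tuple(p) for p in points]
    path = sum(math.dist(pts[i - 1], pts[i]) for i in range(1, len(pts)))
    interval = path / (n - 1)
    out, acc, i = [pts[0]], 0.0, 1
    while i < len(pts):
        d = math.dist(pts[i - 1], pts[i])
        if acc + d >= interval and d > 0:
            t = (interval - acc) / d
            q = (pts[i - 1][0] + t * (pts[i][0] - pts[i - 1][0]),
                 pts[i - 1][1] + t * (pts[i][1] - pts[i - 1][1]))
            out.append(q)
            pts.insert(i, q)  # continue resampling from the inserted point
            acc = 0.0
        else:
            acc += d
        i += 1
    while len(out) < n:          # guard against floating-point shortfall
        out.append(pts[-1])
    return out[:n]

def normalize(points, n=32):
    """Resample, scale to a unit box, and center the cloud at the origin."""
    pts = resample(points, n)
    xs, ys = [p[0] for p in pts], [p[1] for p in pts]
    s = max(max(xs) - min(xs), max(ys) - min(ys)) or 1.0
    pts = [((x - min(xs)) / s, (y - min(ys)) / s) for x, y in pts]
    cx = sum(p[0] for p in pts) / n
    cy = sum(p[1] for p in pts) / n
    return [(x - cx, y - cy) for x, y in pts]

def cloud_distance(a, b, start):
    """Greedy matching of cloud a onto cloud b, beginning at index start."""
    n = len(a)
    matched = [False] * n
    total, i = 0.0, start
    while True:
        best, idx = float('inf'), -1
        for j in range(n):
            if not matched[j]:
                d = math.dist(a[i], b[j])
                if d < best:
                    best, idx = d, j
        matched[idx] = True
        # earlier matches get larger weights, as in the $P formulation
        total += (1 - ((i - start) % n) / n) * best
        i = (i + 1) % n
        if i == start:
            return total

def greedy_cloud_match(a, b):
    """Minimum greedy cloud distance over sqrt(n) start points, both directions."""
    n = len(a)
    step = max(1, int(n ** 0.5))
    return min(min(cloud_distance(a, b, s), cloud_distance(b, a, s))
               for s in range(0, n, step))
```

Classification then reduces to normalizing the candidate and returning the template with the smallest `greedy_cloud_match` score; this greedy matching step is also where the execution-time gains of $P+ over $P arise.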

[1]  Richard E. Ladner,et al.  PassChords: secure multi-touch authentication for blind people , 2012, ASSETS '12.

[2]  Radu-Daniel Vatavu,et al.  Gestures as point clouds: a $P recognizer for user interface prototypes , 2012, ICMI '12.

[3]  Radu-Daniel Vatavu,et al.  Smart Touch: Improving Touch Accuracy for People with Motor Impairments with Template Matching , 2016, CHI.

[4]  Aaron F. Bobick,et al.  Parametric Hidden Markov Models for Gesture Recognition , 1999, IEEE Trans. Pattern Anal. Mach. Intell..

[5]  Lisa Anthony,et al.  A lightweight multistroke recognizer for user interface prototypes , 2010, Graphics Interface.

[6]  Dean Rubine,et al.  Specifying gestures by example , 1991, SIGGRAPH.

[7]  Radu-Daniel Vatavu,et al.  Relative accuracy measures for stroke gestures , 2013, ICMI '13.

[8]  Radu-Daniel Vatavu,et al.  Understanding the consistency of users' pen and finger stroke gesture articulation , 2013, Graphics Interface.

[9]  Marcelo M. Wanderley,et al.  Gesture-Based Human-Computer Interaction and Simulation, Proceedings of Gesture Workshop 2007 , 2009 .

[10]  Laurent Grisoni,et al.  Multiscale Detection of Gesture Patterns in Continuous Motion Trajectories , 2009, Gesture Workshop.

[11]  Uran Oh,et al.  Audio-Based Feedback Techniques for Teaching Touchscreen Gestures , 2015, ACM Trans. Access. Comput..

[12]  Eamonn J. Keogh,et al.  Searching and Mining Trillions of Time Series Subsequences under Dynamic Time Warping , 2012, KDD.

[13]  Yang Li,et al.  Gesture coder: a tool for programming multi-touch gestures by demonstration , 2012, CHI.

[14]  Barbara Leporini,et al.  Exploring Visually Impaired People's Gesture Preferences for Smartphones , 2015, CHItaly.

[15]  Stephen A. Brewster,et al.  Investigating touchscreen accessibility for people with visual impairments , 2008, NordiCHI.

[16]  Krzysztof Z. Gajos,et al.  Ability-Based Design: Concept, Principles and Examples , 2011, TACC.

[17]  Richard E. Ladner,et al.  Freedom to roam: a study of mobile device adoption and accessibility for people with visual and motor disabilities , 2009, Assets '09.

[18]  Trevor Darrell,et al.  Hidden Conditional Random Fields for Gesture Recognition , 2006, 2006 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR'06).

[19]  Barbara Leporini,et al.  Making Visual Maps Accessible to the Blind , 2011, HCI.

[20]  Krzysztof Z. Gajos,et al.  Automatically generating user interfaces adapted to users' motor and vision capabilities , 2007, UIST.

[21]  Lisa Anthony,et al.  $N-protractor: a fast and accurate multistroke recognizer , 2012, Graphics Interface.

[22]  Ivan Poupyrev,et al.  Interacting with Soli: Exploring Fine-Grained Dynamic Gesture Recognition in the Radio-Frequency Spectrum , 2016, UIST.

[23]  Barbara Leporini,et al.  Analyzing visually impaired people’s touch gestures on smartphones , 2017, Multimedia Tools and Applications.

[24]  Daniel Vogel,et al.  Estimating the Perceived Difficulty of Pen Gestures , 2011, INTERACT.

[25]  Jacob O. Wobbrock,et al.  Slide rule: making mobile touch screens accessible to blind people using multi-touch interaction techniques , 2008, Assets '08.

[26]  Levent Burak Kara,et al.  Hierarchical parsing and recognition of hand-sketched diagrams , 2004, UIST '04.

[27]  Tony DeRose,et al.  Proton: multitouch gestures as regular expressions , 2012, CHI.

[28]  Poika Isokoski,et al.  Model for unistroke writing time , 2001, CHI.

[29]  Louis Vuurpijl,et al.  Iconic and multi-stroke gesture recognition , 2009, Pattern Recognit..

[30]  Beryl Plimmer,et al.  The Power of Automatic Feature Selection: Rubine on Steroids , 2010, SBIM.

[31]  김만두,et al.  시력 손상과 시각 장애(Visual Impairment and Blindness) , 2011 .

[32]  Radu-Daniel Vatavu,et al.  Touch interaction for children aged 3 to 6 years: Experimental findings and relationship to motor skills , 2015, Int. J. Hum. Comput. Stud..

[33]  Radu-Daniel Vatavu,et al.  1F: one accessory feature design for gesture recognizers , 2012, IUI '12.

[34]  Yang Li,et al.  Gesture studio: authoring multi-touch interactions through demonstration and declaration , 2013, CHI.

[35]  Shumin Zhai,et al.  SHARK2: a large vocabulary shorthand writing system for pen-based computers , 2004, UIST '04.

[36]  Richard E. Ladner,et al.  Usable gestures for blind people: understanding preference and performance , 2011, CHI.

[37]  L. R. Rabiner,et al.  A comparative study of several dynamic time-warping algorithms for connected-word recognition , 1981, The Bell System Technical Journal.

[38]  Radu-Daniel Vatavu,et al.  Synthesizing Stroke Gestures Across User Populations: A Case for Users with Visual Impairments , 2017, CHI.

[39]  Luis A. Leiva,et al.  Gestures à Go Go , 2015, ACM Trans. Intell. Syst. Technol..

[40]  Laurent Grisoni,et al.  Match-up & conquer: a two-step technique for recognizing unconstrained bimanual and multi-finger touch input , 2014, AVI.

[41]  Kristen Shinohara,et al.  Observing Sara: a case study of a blind person's interactions with technology , 2007, Assets '07.

[42]  Radu-Daniel Vatavu,et al.  The impact of motion dimensionality and bit cardinality on the design of 3D gesture recognizers , 2013, Int. J. Hum. Comput. Stud..

[43]  Laurent Grisoni,et al.  Gesture Recognition Based on Elastic Deformation Energies , 2009, Gesture Workshop.

[44]  Kenneth Steiglitz,et al.  Combinatorial Optimization: Algorithms and Complexity , 1981 .

[45]  Yang Li,et al.  Gestures without libraries, toolkits or training: a $1 recognizer for user interface prototypes , 2007, UIST.

[46]  Laurent Grisoni,et al.  Understanding Users' Perceived Difficulty of Multi-Touch Gesture Articulation , 2014, ICMI.

[47]  Graham Horton,et al.  A new approach for touch gesture recognition: Conversive Hidden non-Markovian Models , 2015, J. Comput. Sci..

[48]  Yang Li,et al.  Protractor: a fast and accurate gesture recognizer , 2010, CHI.

[49]  Anke M. Brock,et al.  Making Gestural Interaction Accessible to Visually Impaired People , 2014, EuroHaptics.

[50]  H. Piaggio Differential Geometry of Curves and Surfaces , 1952, Nature.

[51]  Radu-Daniel Vatavu,et al.  The effect of sampling rate on the performance of template-based gesture recognizers , 2011, ICMI '11.

[52]  Radu-Daniel Vatavu,et al.  Formalizing Agreement Analysis for Elicitation Studies: New Measures, Significance Test, and Toolkit , 2015, CHI.

[53]  Radu-Daniel Vatavu,et al.  Gesture Heatmaps: Understanding Gesture Performance with Colorful Visualizations , 2014, ICMI.

[54]  L. Levin,et al.  Ocular disease : mechanisms and management , 2010 .

[55]  Meredith Ringel Morris,et al.  Touchplates: low-cost tactile overlays for visually impaired touch screen users , 2013, ASSETS.

[56]  Randall Davis,et al.  HMM-based efficient sketch recognition , 2005, IUI.