Enhancing Android accessibility for users with hand tremor by reducing fine pointing and steady tapping

Smartphones and tablets with touchscreens have demonstrated potential to support the needs of individuals with motor impairments such as hand tremor. However, these users still face major challenges with conventional touchscreen gestures, mostly because of the fine precision required to disambiguate between targets on small screens. To reduce the difficulty caused by hand tremor in combination with small touch targets, we developed Touch Guard, an experimental system-wide assistive service that enables enhanced area touch along with a series of complementary features. The service provides enhanced area touch through two disambiguation modes: magnification and a descriptive target list. In a laboratory study with motor-impaired users, we compared both modes to conventional tapping and tested Touch Guard with real-world applications. Target-list disambiguation was the more successful of the two, reducing the error rate by 65% compared to conventional tapping. The study also revealed several challenges and design implications for introducing new touchscreen interaction techniques to users with motor impairments. As the experimental product of an intern research project at Google, Touch Guard demonstrates broad potential for addressing accessibility issues for people with hand tremor on their familiar mobile devices, instead of high-cost hardware.
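The abstract does not detail how Touch Guard implements enhanced area touch, but the general idea can be illustrated with Android's accessibility APIs. The Kotlin sketch below is a minimal hypothetical illustration, not the authors' implementation: it collects the clickable accessibility nodes whose on-screen bounds intersect an enlarged touch area, activates a single candidate directly, and otherwise hands the labelled candidates to a disambiguation step (a magnified view or a descriptive target list). The class and function names (`AreaTouchService`, `onAreaTap`, `showDisambiguationList`) and the area radius are assumptions; capturing the raw tap coordinates system-wide (e.g., through an overlay or touch exploration) is outside the scope of the sketch.

```kotlin
import android.accessibilityservice.AccessibilityService
import android.graphics.Rect
import android.view.accessibility.AccessibilityEvent
import android.view.accessibility.AccessibilityNodeInfo

// Hypothetical sketch of an "enhanced area touch" accessibility service;
// not Touch Guard's actual implementation.
class AreaTouchService : AccessibilityService() {

    // Collect every clickable node whose on-screen bounds intersect a square
    // "area touch" region of the given radius around the tap point.
    private fun findCandidates(x: Int, y: Int, radiusPx: Int): List<AccessibilityNodeInfo> {
        val area = Rect(x - radiusPx, y - radiusPx, x + radiusPx, y + radiusPx)
        val root = rootInActiveWindow ?: return emptyList()
        val hits = mutableListOf<AccessibilityNodeInfo>()
        val bounds = Rect()

        fun walk(node: AccessibilityNodeInfo?) {
            if (node == null) return
            node.getBoundsInScreen(bounds)
            if (node.isClickable && Rect.intersects(area, bounds)) hits.add(node)
            for (i in 0 until node.childCount) walk(node.getChild(i))
        }
        walk(root)
        return hits
    }

    // Human-readable label for a candidate, used to build the descriptive target list.
    private fun labelFor(node: AccessibilityNodeInfo): String =
        node.contentDescription?.toString()
            ?: node.text?.toString()
            ?: node.className?.toString()
            ?: "unlabeled target"

    // Entry point for an (imprecise) tap at screen coordinates (x, y).
    // How the raw tap is captured system-wide is not shown here.
    fun onAreaTap(x: Int, y: Int, radiusPx: Int = 120) {
        val candidates = findCandidates(x, y, radiusPx)
        when {
            candidates.isEmpty() -> Unit // nothing actionable under the enlarged area
            candidates.size == 1 ->
                candidates[0].performAction(AccessibilityNodeInfo.ACTION_CLICK)
            else ->
                // Ambiguous: present the labelled candidates for disambiguation
                // (magnified view or descriptive target list), then click the choice.
                showDisambiguationList(candidates.map(::labelFor), candidates)
        }
    }

    private fun showDisambiguationList(
        labels: List<String>,
        candidates: List<AccessibilityNodeInfo>
    ) {
        // Placeholder: a real service would render an overlay list and perform
        // ACTION_CLICK on the node the user selects.
    }

    override fun onAccessibilityEvent(event: AccessibilityEvent?) { /* not needed for this sketch */ }
    override fun onInterrupt() {}
}
```

The point of the sketch is only to show why querying the accessibility node tree and offering labelled candidates can replace the need for fine pointing; Touch Guard's actual radius, timing, and overlay design may differ.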
