Motor-impaired touchscreen interactions in the wild

Touchscreens are pervasive in mainstream technologies; they offer novel user interfaces and exciting gestural interactions. However, to interpret and distinguish between the vast range of gestural inputs, devices require users to perform interactions consistently, in line with the predefined location, movement, and timing parameters of the gesture recognizers. For people with variable motor abilities, particularly hand tremors, performing these input gestures can be extremely challenging, limiting the interactions they can make with the device. In this paper, we examine the touchscreen performance and interaction behaviors of motor-impaired users on mobile devices. The primary goal of this work is to measure and understand the variance in touchscreen interaction performance among people with motor impairments. We conducted a four-week in-the-wild user study with nine participants using a mobile touchscreen device; a Sudoku stimulus application measured their interaction performance during this period. Our results show not only that interaction performance varies significantly between users, but also that an individual's interaction abilities differ significantly between device sessions. Finally, we propose and evaluate the effect of novel tap gesture recognizers that accommodate individual variances in touchscreen interaction.
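The abstract's closing sentence describes tap gesture recognizers that accommodate individual variance, but gives no implementation detail. The sketch below is one hypothetical way such a recognizer might work: per-user offset and duration thresholds that are re-fit from the user's own confirmed taps. The class name, parameter names, and the "mean + 2 SD" update rule are all illustrative assumptions, not the authors' method.

```python
from dataclasses import dataclass, field
from statistics import mean, stdev

@dataclass
class AdaptiveTapRecognizer:
    # Hypothetical starting thresholds; a real recognizer would tune these empirically.
    max_offset_px: float = 20.0      # allowed distance from the target centre
    max_duration_ms: float = 300.0   # allowed contact time for a "tap"
    history: list = field(default_factory=list)

    def accepts(self, offset_px: float, duration_ms: float) -> bool:
        """Classify a touch as a tap under the current per-user thresholds."""
        return offset_px <= self.max_offset_px and duration_ms <= self.max_duration_ms

    def observe(self, offset_px: float, duration_ms: float) -> None:
        """Record a confirmed tap and re-fit thresholds to this user's behaviour."""
        self.history.append((offset_px, duration_ms))
        if len(self.history) >= 5:
            offsets = [o for o, _ in self.history]
            durations = [d for _, d in self.history]
            # Cover roughly 95% of this user's observed taps (mean + 2 SD),
            # so a user with a tremor gets wider spatial/temporal tolerances.
            self.max_offset_px = mean(offsets) + 2 * stdev(offsets)
            self.max_duration_ms = mean(durations) + 2 * stdev(durations)
```

For example, after observing five taps with larger offsets and longer contact times, the recognizer would accept touches that the default thresholds rejected. This mirrors the abstract's finding that abilities vary both between users and between sessions: the same re-fitting could be run per session rather than once per user.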
