Investigating Grid-Based Navigation: The Impact of Physical Disability

Hands-free speech-based technology can be a useful alternative for individuals who find traditional input devices, such as the keyboard and mouse, difficult to use. Various speech-based navigation techniques have been examined, and several are available in commercial software applications. Among these alternatives, grid-based navigation has demonstrated both potential and limitations. In this article, we discuss an empirical study that assessed the efficacy of two enhancements to grid-based navigation: magnification and fine-tuning. The magnification capability enlarges the selected region when it becomes sufficiently small, making it easier to see the target and cursor. The fine-tuning capability allows users to move the cursor short distances to position it over the target. The study involved one group of participants with physical disabilities, an age-matched group of participants without disabilities, and a third group of young adults without disabilities. The results confirm that both magnification and fine-tuning significantly improved the participants' performance when selecting targets, especially small targets. Providing either, or both, of the proposed enhancements substantially reduced the gaps in performance due to disability and age. The results will inform the design of speech-based target selection mechanisms, allowing users to select targets faster while making fewer errors.
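The article does not prescribe an implementation, but the interaction it describes can be sketched as a simple state machine: each spoken digit narrows the active region to one cell of a grid laid over it, magnification is triggered once the region falls below a size threshold, and fine-tuning nudges the cursor a short, fixed distance. The Python sketch below is illustrative only; the GridNavigator class, the 3x3 layout, the MAGNIFY_THRESHOLD of 100 pixels, the FINE_TUNE_STEP of 4 pixels, and the command words are all assumptions rather than details taken from the study.

    # Illustrative sketch of grid-based speech navigation with the two enhancements
    # discussed above (magnification and fine-tuning). All names, thresholds, and
    # command words are assumptions, not details from the study.

    MAGNIFY_THRESHOLD = 100   # assumed: magnify once the active region is this small (pixels)
    FINE_TUNE_STEP = 4        # assumed: pixels the cursor moves per fine-tuning command

    class GridNavigator:
        """Recursive 3x3 grid navigation driven by spoken digits (sketch only)."""

        def __init__(self, screen_width, screen_height):
            # The active region starts as the full screen; each spoken digit
            # shrinks it to one cell of a 3x3 grid laid over the current region.
            self.region = (0.0, 0.0, float(screen_width), float(screen_height))  # x, y, w, h
            self.magnified = False

        def cursor(self):
            # The cursor sits at the center of the current region.
            x, y, w, h = self.region
            return (x + w / 2, y + h / 2)

        def select_cell(self, cell):
            # cell is 1..9, numbered left to right, top to bottom.
            x, y, w, h = self.region
            row, col = divmod(cell - 1, 3)
            self.region = (x + col * w / 3, y + row * h / 3, w / 3, h / 3)
            # Magnification enhancement: once the region is small, enlarge the view
            # so the target and cursor remain easy to see (rendering not shown here).
            if min(self.region[2], self.region[3]) < MAGNIFY_THRESHOLD:
                self.magnified = True

        def fine_tune(self, direction):
            # Fine-tuning enhancement: nudge the region (and hence the cursor)
            # a short, fixed distance instead of subdividing further.
            dx, dy = {"left": (-1, 0), "right": (1, 0),
                      "up": (0, -1), "down": (0, 1)}[direction]
            x, y, w, h = self.region
            self.region = (x + dx * FINE_TUNE_STEP, y + dy * FINE_TUNE_STEP, w, h)


    # Example: "five", "one", "nine", then two fine-tuning nudges before "click".
    nav = GridNavigator(1920, 1080)
    nav.select_cell(5)
    nav.select_cell(1)
    nav.select_cell(9)
    nav.fine_tune("left")
    nav.fine_tune("up")
    print(nav.cursor(), "magnified:", nav.magnified)

In a deployed system the fine-tuning commands would more likely be continuous ("move left ... stop") than single fixed steps, and the magnified view would actually be rendered rather than merely flagged; the sketch is only meant to show how the two enhancements fit into the grid-selection loop.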
