Design and evaluation of nonverbal sound-based input for those with motor impairments

Most personal computing interfaces rely on users' ability to move their hands and arms to interact with on-screen graphical widgets via mainstream devices such as keyboards and mice. Without appropriate assistive devices, this style of input poses difficulties for motor-impaired users. We propose a sound-based input scheme that enables users to operate the Windows graphical user interface by producing hums and fricatives into an ordinary microphone. Menus are arranged hierarchically so that only a small number of distinct actions is required at any time. The proposed scheme proved accurate and responsive compared with other sound-based schemes. The ability to choose among multiple item-selection modes nearly halved the average task-completion time in the test scenarios relative to performing the same tasks solely through cursor movements. Still, helping users select the most appropriate mode for a given task should further improve the scheme's overall usability.

Implications for Rehabilitation: Typical computer input devices such as keyboards and mice may pose challenges to motor-impaired users who cannot control their hands or arms. Non-verbal humming input is a promising solution that relies only on standard hardware, making it economical and language-independent. The proposed Desktop Access with Non-verbal Sound Input is an assistive system that allows motor-impaired users to command and navigate a computer with minimal training.
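The core interaction idea, hierarchical menus driven by a minimal set of sound events, can be sketched as follows. This is a hypothetical illustration, not the authors' implementation: it assumes just two recognized event types (a hum cycles the highlight, a fricative selects or descends), and the class and menu names are invented for the example.

```python
# Hypothetical sketch of sound-driven hierarchical menu navigation.
# Two assumed sound events: "hum" cycles the highlighted item,
# "fricative" descends into a submenu or triggers a leaf action.
class SoundMenu:
    def __init__(self, tree):
        self.tree = tree   # nested dict: item name -> subtree (dict) or action (str)
        self.path = []     # names of submenus entered so far
        self.index = 0     # index of the currently highlighted item

    def _level(self):
        """Return the dict for the current menu level."""
        node = self.tree
        for name in self.path:
            node = node[name]
        return node

    def on_sound(self, event):
        """Handle one recognized sound event; return an action string on selection."""
        items = list(self._level())
        if event == "hum":                  # cycle the highlight
            self.index = (self.index + 1) % len(items)
            return None
        if event == "fricative":            # select the highlighted item
            name = items[self.index]
            child = self._level()[name]
            if isinstance(child, dict):     # submenu: descend and reset highlight
                self.path.append(name)
                self.index = 0
                return None
            return child                    # leaf: report the chosen action
        return None

# Usage: two fricatives and one hum reach a nested item.
menu = SoundMenu({"File": {"Open": "open", "Save": "save"}, "Edit": {"Copy": "copy"}})
menu.on_sound("fricative")          # enter "File"
menu.on_sound("hum")                # highlight "Save"
print(menu.on_sound("fricative"))   # -> save
```

Because every menu level exposes only a handful of items, the user never needs more than two distinguishable sounds, which is what keeps the per-step action vocabulary minimal.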
