A Dual-Mode Human-Computer Interface Combining Speech and Tongue Motion for People with Severe Disabilities

We present a new wireless, wearable human-computer interface called the dual-mode Tongue Drive System (dTDS), designed to enable people with severe disabilities to use computers more effectively, with greater speed, flexibility, usability, and independence, through their tongue motion and speech. The dTDS detects the user's tongue motion using a magnetic tracer and an array of magnetic sensors embedded in a compact, ergonomic wireless headset. It also captures the user's voice wirelessly through a small microphone embedded in the same headset. Preliminary evaluation results from 14 able-bodied subjects and three individuals with high-level (C3-C5) spinal cord injuries indicated that the dTDS headset, combined with commercially available speech recognition (SR) software, can provide end users with significantly higher performance than either unimodal alternative based on tongue motion or speech alone, particularly in tasks that require both pointing and text entry.
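To make the dual-mode idea concrete, the sketch below pairs a nearest-template classifier over magnetometer readings (one plausible way to decode discrete tongue commands from a magnetic tracer) with a dispatch routine that routes tongue commands to pointing and recognized speech to text entry. This is a minimal illustration only: the names (SensorFrame, classify_tongue_command, dispatch), the toy calibration templates, and the command set are all hypothetical, and the paper's actual sensor-signal-processing chain and SR integration are not reproduced here.

    import math
    from dataclasses import dataclass

    @dataclass
    class SensorFrame:
        """One snapshot of the headset's magnetometer array (hypothetical format):
        one (x, y, z) field reading per sensor, in arbitrary units."""
        fields: list

    def classify_tongue_command(frame, templates):
        """Nearest-template classifier: flatten the field readings into one
        vector and return the calibrated command whose stored template is
        closest in Euclidean distance."""
        vec = [c for xyz in frame.fields for c in xyz]
        def dist(template):
            return math.sqrt(sum((a - b) ** 2 for a, b in zip(vec, template)))
        return min(templates, key=lambda cmd: dist(templates[cmd]))

    # Toy calibration data for a 2-sensor array (6 components per template);
    # in practice templates would be collected in a per-user training session.
    templates = {
        "LEFT":   [1.0, 0.0, 0.0, 0.2, 0.0, 0.0],
        "RIGHT":  [0.0, 1.0, 0.0, 0.0, 0.2, 0.0],
        "UP":     [0.0, 0.0, 1.0, 0.0, 0.0, 0.2],
        "SELECT": [0.5, 0.5, 0.0, 0.1, 0.1, 0.0],
    }

    def move_pointer(direction):
        # Stub: a real system would emit a relative mouse event here.
        print(f"pointer <- {direction}")

    def type_text(text):
        # Stub: a real system would inject keystrokes via the OS here.
        print(f"text <- {text!r}")

    def dispatch(event):
        """Dual-mode dispatch: tongue commands drive the pointer,
        recognized speech is routed to text entry."""
        kind, payload = event
        if kind == "tongue":
            move_pointer(payload)
        elif kind == "speech":
            type_text(payload)

    frame = SensorFrame(fields=[(0.9, 0.1, 0.0), (0.25, 0.02, 0.0)])
    dispatch(("tongue", classify_tongue_command(frame, templates)))  # pointer <- LEFT
    dispatch(("speech", "hello world"))                              # text <- 'hello world'

Splitting the two modalities this way mirrors the evaluation above: pointing tasks exercise the tongue channel, text entry exercises the speech channel, and mixed tasks exercise both.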
