A multimodal human computer interface combining head movement, speech and tongue motion for people with severe disabilities

Assistive technologies (ATs) play a crucial role in the lives of individuals with severe disabilities by enabling greater autonomy in daily tasks. The Tongue Drive System (TDS), developed at the Georgia Tech Bionics Lab, is one such AT, empowering people with severe spinal cord injuries (SCIs) to be more independent. Earlier versions of the TDS offered tongue motion and speech as means of driving mouse activity and keyboard input. In this paper, we introduce a new multimodal Tongue Drive System (mTDS), which adds head tracking to deliver proportional control of a mouse cursor. The mTDS integrates this new capability while preserving the tongue motion and speech commands of previous versions, offering a richer means of driving computing interfaces than was previously available to individuals with severe disabilities. In experimental trials, three able-bodied subjects attempted to initiate, dictate, and send an email using the mTDS. The mean task completion times of expert, intermediate, and novice users with the mTDS were 1.48, 4.46, and 6.02 times those achieved with a standard keyboard and mouse interface. We also observed that user efficacy improved with repeated use of the mTDS.
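
The abstract does not give implementation details, but the idea of proportional cursor control from head tracking can be illustrated with a minimal sketch. The sketch below assumes head orientation is available as yaw and pitch angles in degrees from a neutral pose (e.g., from the headset's inertial sensors); the function name and the gain and deadzone parameters are illustrative assumptions, not the mTDS implementation.

```python
import math

def head_to_cursor_velocity(yaw_deg, pitch_deg, gain=8.0, deadzone_deg=2.0):
    """Map head orientation (degrees from a neutral pose) to a proportional
    cursor velocity in pixels per update.

    Illustrative sketch only: angles inside the deadzone produce no movement,
    so small involuntary head motion does not drift the cursor; beyond it,
    cursor speed grows linearly with deflection (proportional control).
    """
    def axis(angle_deg):
        magnitude = abs(angle_deg) - deadzone_deg
        if magnitude <= 0.0:
            return 0.0
        # Preserve the direction of the deflection while scaling its size.
        return math.copysign(gain * magnitude, angle_deg)

    # Yaw (left/right) drives x; pitch (up/down) drives y.
    return axis(yaw_deg), axis(pitch_deg)

if __name__ == "__main__":
    # Simulated readings: neutral pose, slight tremor, deliberate deflection.
    for yaw, pitch in [(0.0, 0.0), (1.5, -1.0), (10.0, -5.0)]:
        vx, vy = head_to_cursor_velocity(yaw, pitch)
        print(f"yaw={yaw:+5.1f} deg, pitch={pitch:+5.1f} deg -> "
              f"v=({vx:+.1f}, {vy:+.1f}) px/tick")
```

In practice, the deadzone threshold and gain would be tuned per user, and the raw orientation estimate would typically be smoothed (e.g., by sensor fusion of inertial and magnetic measurements) before being mapped to cursor velocity.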
