A high-resolution tongue-based joystick to enable robot control for individuals with severe disabilities

Assistive robotic arms have shown the potential to improve the quality of life of people with severe disabilities. However, a high-performance, intuitive control interface for robots with 6-7 degrees of freedom (DOFs) is still missing for these individuals. An inductive tongue-computer interface (ITCI) was recently tested for robot control, and that study indicated potential in this field. This paper investigates the feasibility of developing a high-performance, joystick-like tongue controller for robots through two studies. The first compared different methods for mapping the 18 sensor signals to a 2D coordinate, so that the sensor pads act like a touchpad. The second evaluated a novel approach for emulating an analog joystick with the ITCI, based on the ISO 9241-411 standard. Two subjects performed a multi-directional tapping test using a standard analog joystick, the ITCI held in one hand and operated by the other hand, and finally the ITCI operated by the tongue when mounted inside the mouth. Throughput was used as the evaluation metric. The results show that contact on the touchpads can be localized with approximately 1 mm accuracy. The effective throughput of the ITCI in the multi-directional tapping test was 2.03 bps when held in the hand and 1.31 bps when used inside the mouth.
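For context, the effective throughput reported above is conventionally computed from multi-directional tapping data using the Shannon formulation of Fitts' law, as described in ISO 9241-9/-411. The sketch below is a minimal, generic illustration of that calculation and is not the authors' implementation; the trial data, target distance, and function name are hypothetical.

```python
import math
import statistics

def effective_throughput(trials, distance_mm):
    """Effective throughput (bps) for one tapping condition, per ISO 9241-9/-411.

    trials: list of (movement_time_s, endpoint_deviation_mm) tuples, where the
            deviation is the selection endpoint's offset from the target centre
            along the task axis.
    distance_mm: nominal centre-to-centre distance between targets.
    """
    movement_times = [mt for mt, _ in trials]
    deviations = [dx for _, dx in trials]

    # Effective width: 4.133 x standard deviation of endpoint deviations.
    w_e = 4.133 * statistics.stdev(deviations)

    # Effective index of difficulty (Shannon formulation).
    id_e = math.log2(distance_mm / w_e + 1)

    # Throughput = effective index of difficulty / mean movement time.
    return id_e / statistics.mean(movement_times)

# Hypothetical example data: (movement time in s, endpoint deviation in mm).
trials = [(1.10, 2.1), (0.95, -1.4), (1.25, 0.8), (1.05, -2.6), (1.18, 1.9)]
print(f"Throughput: {effective_throughput(trials, distance_mm=60):.2f} bps")
```

Because the index of difficulty uses the effective (observed) target width rather than the nominal one, this measure folds both speed and endpoint accuracy into a single bits-per-second figure, which is why it is the standard basis for comparing pointing devices such as the joystick and ITCI conditions above.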
