On the tip of the tongue: learning typing and pointing with an intra-oral computer interface

Abstract

Purpose: To evaluate typing and pointing performance, and its improvement over time, in four able-bodied participants using an intra-oral tongue-computer interface for computer control.
Background: A physically disabled individual may lack the ability to efficiently control standard computer input devices. There have been several efforts to produce and evaluate interfaces that give individuals with physical disabilities a means of controlling personal computers.
Method: Training with the intra-oral tongue-computer interface was performed by playing games over 18 sessions. Skill improvement was measured through typing and pointing exercises at the end of each training session.
Results: Typing throughput improved from an average of 2.36 to 5.43 correct words per minute. Pointing throughput improved from an average of 0.47 to 0.85 bits/s. Target tracking performance, measured as relative time on target, improved from an average of 36% to 47%. Path following throughput improved from an average of 0.31 to 0.83 bits/s, and decreased to 0.53 bits/s with more difficult tasks.
Conclusions: The learning curves support the notion that the tongue can rapidly learn novel motor tasks. Typing and pointing performance with the tongue-computer interface is comparable to that of other proficient assistive devices, which makes the tongue a feasible input organ for computer control.

Implications for Rehabilitation
- Intra-oral computer interfaces could provide individuals with severe upper-limb mobility impairments the opportunity to control computers and automatic equipment.
- Typing and pointing performance with the tongue-computer interface is comparable to that of other proficient assistive devices; in addition, the interface does not easily cause fatigue and can be invisible to other people, which assistive device users rate highly.
- Combining visual and auditory feedback is vital for good performance of an intra-oral computer interface and helps reduce involuntary or erroneous activations.
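The throughput figures above follow the conventions used in this literature: pointing throughput in bits/s comes from Fitts' law (the ISO 9241-9 Shannon formulation), and typing throughput counts one "word" as five characters. A minimal sketch of how such metrics are computed, with hypothetical task parameters for illustration (the actual target distances, widths, and trial times are not given in this abstract):

```python
import math

def index_of_difficulty(distance: float, width: float) -> float:
    """Shannon formulation of Fitts' index of difficulty, in bits."""
    return math.log2(distance / width + 1)

def pointing_throughput(distance: float, width: float, movement_time_s: float) -> float:
    """Pointing throughput in bits/s: index of difficulty over movement time."""
    return index_of_difficulty(distance, width) / movement_time_s

def typing_throughput_wpm(correct_chars: int, elapsed_s: float) -> float:
    """Words per minute, with the standard convention of 5 characters per word."""
    return (correct_chars / 5) / (elapsed_s / 60)

# Hypothetical example: a 160 px movement to a 16 px target completed in 4 s
# yields roughly the ~0.85 bits/s scale reported in the abstract.
tp = pointing_throughput(distance=160, width=16, movement_time_s=4.0)
```

For instance, 25 correctly typed characters in 60 s corresponds to 5 words per minute, the same scale as the 2.36 to 5.43 correct words per minute reported above.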
