A tongue input device for creating conversations

We present a new tongue input device, the tongue joystick, for use by an actor inside an articulated-head character costume. Using our device, the actor can maneuver through a dialogue tree, selecting clips of prerecorded audio to hold a conversation in the voice of the character. The device is constructed of silicone sewn with conductive thread, a novel method for creating rugged, soft, low-actuation-force input devices with applications in entertainment and assistive technology. We compare our device against other portable mouth-based input devices and show it to be the fastest and most accurate in tasks mimicking our target application. Finally, we show early results of an actor inside an articulated-head costume using the tongue joystick to interact with a child.
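The abstract does not specify how the dialogue tree is represented or traversed; the following is a minimal Python sketch of how discrete joystick gestures might drive such a tree. The names DialogueNode, play_clip, and the direction labels are hypothetical illustrations, not the authors' implementation.

```python
# Minimal sketch of joystick-driven dialogue-tree navigation.
# DialogueNode, play_clip, and the direction names are assumptions
# for illustration; the paper does not publish its data model.
from dataclasses import dataclass, field

@dataclass
class DialogueNode:
    clip: str                                      # prerecorded audio clip for this line
    children: dict = field(default_factory=dict)   # joystick direction -> next node

def play_clip(clip: str) -> None:
    # Stand-in for the costume's audio playback hardware.
    print(f"[playing] {clip}")

def navigate(root: DialogueNode, directions: list) -> None:
    """Walk the tree by successive joystick gestures, playing each clip."""
    node = root
    play_clip(node.clip)
    for d in directions:
        node = node.children.get(d)
        if node is None:
            break  # no response mapped to this gesture; stop here
        play_clip(node.clip)

# Example tree: a greeting with two follow-ups selected by tongue direction.
root = DialogueNode("hello.wav", {
    "left": DialogueNode("whats_your_name.wav"),
    "right": DialogueNode("nice_to_meet_you.wav"),
})
navigate(root, ["right"])
```

In this reading, each tongue gesture advances one level of the tree, letting the actor chain prerecorded responses into a turn-taking conversation in the character's voice.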
