BARTON: Low Power Tongue Movement Sensing with In-Ear Barometers

Sensing tongue movements enables a range of applications in hands-free interaction and alternative communication. We propose BARTON, a BARometer-based low-power and robust TONgue movement sensing system. Using a sampling rate below 50 Hz and extracting only simple temporal features from in-ear pressure signals, we demonstrate that it is feasible to distinguish important tongue gestures (left, right, forward) at low power consumption. We prototype BARTON with commodity earpieces integrating COTS barometers for in-ear pressure sensing and an ARM microcontroller for signal processing. Evaluations show that BARTON yields 94% classification accuracy at 8.4 mW power consumption, achieving accuracy comparable to state-of-the-art microphone-based solutions while consuming 44 times less energy. BARTON is also robust to head movements and operates while music is played directly from the earphones.
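The abstract does not specify which temporal features are used, but the "simple temporal features from in-ear pressure signals" idea can be illustrated with a minimal sketch. The feature names below (peak-to-peak range, mean slope, time-to-peak) and the synthetic pressure bump are assumptions for illustration, not the paper's actual feature set or data.

```python
import numpy as np

def temporal_features(window):
    """Compute simple temporal features over one in-ear pressure window.

    The specific features here (range, mean slope, time-to-peak) are
    illustrative guesses at the kind of low-cost statistics a
    BARTON-style system might extract; the paper's exact feature set
    is not given in the abstract.
    """
    window = np.asarray(window, dtype=float)
    delta = window - window[0]  # pressure change relative to the window start
    return {
        "range": float(delta.max() - delta.min()),      # peak-to-peak swing (Pa)
        "mean_slope": float(np.mean(np.diff(delta))),   # average rate of change
        "time_to_peak": int(np.argmax(np.abs(delta))),  # sample index of extremum
    }

# Synthetic 1-second window at 50 Hz: a brief pressure bump, loosely
# imitating an ear-canal pressure change caused by a tongue gesture.
fs = 50
t = np.arange(fs) / fs
window = 5.0 * np.exp(-((t - 0.4) ** 2) / 0.01)  # synthetic pressure, Pa
feats = temporal_features(window)
```

Such per-window feature vectors would then feed a lightweight classifier on the microcontroller; keeping the features to a handful of scalar statistics is what makes sub-50 Hz sampling and milliwatt-level power budgets plausible.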
