Interferi: Gesture Sensing using On-Body Acoustic Interferometry

Interferi is an on-body gesture sensing technique using acoustic interferometry. We use ultrasonic transducers resting on the skin to create acoustic interference patterns inside the wearer's body, which interact with anatomical features in complex, yet characteristic ways. We focus on two areas of the body with great expressive power: the hands and face. For each, we built and tested a series of worn sensor configurations, which we used to identify useful transducer arrangements and machine learning features. We created final prototypes for the hand and face, which our study results show can support eleven- and nine-class gesture sets at 93.4% and 89.0% accuracy, respectively. We also evaluated our system in four continuous tracking tasks, including smile intensity and weight estimation, none of which exceeded 9.5% error. We believe these results show great promise and illuminate an interesting sensing technique for HCI applications.
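The abstract describes classifying gestures from the way interference patterns interact with anatomy, but does not detail the recognition pipeline here. A minimal sketch of one plausible approach is shown below: extracting FFT magnitude features from a received ultrasonic frame and assigning the nearest class centroid. All names, frequencies, and the synthetic signals are illustrative assumptions, not the paper's actual method.

```python
import numpy as np

def fft_features(signal, n_bins=64):
    """Magnitude spectrum of a received acoustic frame, truncated to n_bins."""
    return np.abs(np.fft.rfft(signal))[:n_bins]

# Synthetic stand-in for real transducer data: two hypothetical "gestures"
# modeled as sinusoids at different beat frequencies plus noise.
rng = np.random.default_rng(0)
fs, n = 1000, 512
t = np.arange(n) / fs

def frame(freq):
    return np.sin(2 * np.pi * freq * t) + 0.1 * rng.standard_normal(n)

# "Train" by storing a mean feature vector (centroid) per gesture class.
train = {g: np.mean([fft_features(frame(f)) for _ in range(10)], axis=0)
         for g, f in {"fist": 60.0, "pinch": 120.0}.items()}

def classify(signal):
    feats = fft_features(signal)
    return min(train, key=lambda g: np.linalg.norm(feats - train[g]))

print(classify(frame(60.0)))   # fist
print(classify(frame(120.0)))  # pinch
```

A nearest-centroid rule is only a placeholder; with many classes and subtler spectral differences, a discriminative classifier trained on these features would be the natural next step.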
