AudioTouch: Minimally Invasive Sensing of Micro-Gestures via Active Bio-Acoustic Sensing

We present AudioTouch, a minimally invasive approach for sensing micro-gestures using active bio-acoustic sensing. It requires only attaching two piezo-electric elements, acting as a surface-mounted speaker and microphone, to the back of the hand. Because it needs no instrumentation on the palm or fingers, it does not encumber interactions with physical objects. The signal is rich enough to detect small differences between micro-gestures with standard machine-learning classifiers. The approach can also discriminate different levels of touch force, further expanding the interaction vocabulary. We conducted four experiments to evaluate the performance of AudioTouch: a user study measuring gesture recognition accuracy, a follow-up study investigating the ability to discriminate levels of touch force, an experiment assessing cross-session robustness, and a systematic evaluation of the effect of sensor placement on the back of the hand.
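The pipeline described above (emit an excitation through one piezo element, record the acoustic response through the other, extract spectral features, and feed them to a standard classifier) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the sample rate, sweep band, band-energy features, and the toy nearest-centroid classifier are all assumptions standing in for whatever signal chain and "standard machine-learning classifier" the authors actually used, and the simulated gesture responses are purely hypothetical.

```python
import numpy as np

FS = 48_000                      # sample rate (Hz) -- assumed, not from the paper
SWEEP_LO, SWEEP_HI = 20, 20_000  # hypothetical excitation band
N = 4096                         # analysis window length

def sweep(n=N, fs=FS, f0=SWEEP_LO, f1=SWEEP_HI):
    """Linear sine sweep played through the piezo 'speaker' as the active excitation."""
    t = np.arange(n) / fs
    return np.sin(2 * np.pi * (f0 * t + (f1 - f0) * t**2 / (2 * t[-1])))

def features(recording, n_bands=32):
    """Log-magnitude spectrum of the recorded response, pooled into coarse bands."""
    mag = np.abs(np.fft.rfft(recording * np.hanning(len(recording))))
    bands = np.array_split(mag, n_bands)
    return np.log1p([b.sum() for b in bands])

class NearestCentroid:
    """Toy classifier standing in for the paper's standard ML classifier."""
    def fit(self, X, y):
        self.classes_ = sorted(set(y))
        self.centroids_ = {c: np.mean([x for x, lbl in zip(X, y) if lbl == c], axis=0)
                           for c in self.classes_}
        return self

    def predict(self, X):
        return [min(self.classes_,
                    key=lambda c: np.linalg.norm(x - self.centroids_[c]))
                for x in X]

# Hypothetical simulation: each gesture changes how the hand transmits the sweep.
# For a linear sweep, time maps to frequency, so damping the first or second half
# of the recording mimics gesture-dependent attenuation of low or high bands.
rng = np.random.default_rng(0)
excitation = sweep()

def respond(gesture, noise=0.01):
    h = np.ones(N)
    if gesture == "pinch":
        h[: N // 2] *= 0.3   # attenuate low-frequency half
    else:
        h[N // 2 :] *= 0.3   # attenuate high-frequency half
    return excitation * h + noise * rng.standard_normal(N)

labels = ["pinch", "tap"] * 10
X_train = [features(respond(g)) for g in labels]
clf = NearestCentroid().fit(X_train, labels)
preds = clf.predict([features(respond("pinch")), features(respond("tap"))])
```

In the real system the response would come from the on-body piezo microphone rather than a simulation, but the structure (excite, record, featurize, classify) is the same; touch-force discrimination would follow the same pipeline with force levels as class labels.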
