Device-free gesture tracking using acoustic signals

Device-free gesture tracking is an enabling HCI mechanism for small wearable devices, whose screens are too small for fingers to control GUI elements directly, and it is also an important HCI mechanism for medium-to-large mobile devices because it allows users to provide input without blocking the screen. In this paper, we propose LLAP, a device-free gesture tracking scheme that can be deployed on existing mobile devices purely as software, without any hardware modification. We use the speakers and microphones that already exist on most mobile devices to perform device-free tracking of a hand or finger. The key idea is to use the phase of the acoustic signal to obtain fine-grained measurements of movement direction and distance. LLAP first extracts the sound signal reflected by the moving hand/finger, removing background reflections that remain relatively consistent over time. It then measures the phase changes of the sound signal caused by hand/finger movement and converts these phase changes into movement distance. We implemented and evaluated LLAP using commercial off-the-shelf mobile phones. For 1-D hand movement and 2-D drawing in the air, LLAP achieves tracking accuracies of 3.5 mm and 4.6 mm, respectively. Using gesture traces tracked by LLAP, we can recognize characters and short words drawn in the air with accuracies of 92.3% and 91.2%, respectively.
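The phase-to-distance idea can be illustrated with a minimal sketch. This is not the authors' implementation: it assumes a single continuous inaudible tone (20 kHz carrier at a 48 kHz sample rate), uses simple I/Q demodulation with a crude moving-average low-pass filter, and the function name and constants are illustrative. Because the sound travels to the hand and back, a round-trip path change of one wavelength shifts the baseband phase by 2π, so displacement is the unwrapped phase times λ/2 per 2π.

```python
import numpy as np

FS = 48_000           # sample rate (Hz), typical for phone audio
FC = 20_000           # inaudible carrier frequency (Hz)
C = 343.0             # speed of sound in air (m/s)
WAVELENGTH = C / FC   # ~17 mm at 20 kHz

def path_change_from_phase(rx, fs=FS, fc=FC):
    """Estimate hand displacement from the phase of a reflected tone.

    I/Q-demodulate the received signal against the known carrier,
    low-pass to keep only the slowly varying baseband component, then
    unwrap its phase. Motion toward the device yields positive output.
    Illustrative sketch only; a real system would separate the moving
    reflection from static background paths first.
    """
    t = np.arange(len(rx)) / fs
    i = rx * np.cos(2 * np.pi * fc * t)    # in-phase component
    q = -rx * np.sin(2 * np.pi * fc * t)   # quadrature component
    # crude moving-average low-pass to suppress the 2*fc term
    win = int(fs / fc) * 8
    kernel = np.ones(win) / win
    i_bb = np.convolve(i, kernel, mode="same")
    q_bb = np.convolve(q, kernel, mode="same")
    phase = np.unwrap(np.arctan2(q_bb, i_bb))
    # one wavelength of round-trip path change per 2*pi of phase,
    # i.e. lambda/2 of hand displacement
    return (phase - phase[0]) / (2 * np.pi) * WAVELENGTH / 2
```

Since λ is about 17 mm at 20 kHz, even a small fraction of a phase cycle corresponds to sub-millimeter path change, which is consistent with the millimeter-level accuracies reported above.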
