Combining IMU With Acoustics for Head Motion Tracking Leveraging Wireless Earphone

Head motion tracking is a promising research field with broad applications in ubiquitous human-computer interaction (HCI). Unfortunately, vision-based solutions raise user-privacy concerns, and wireless-sensing solutions suffer from limited tracking range. To address these issues, we propose IA-Track, a novel head motion tracking system that combines an inertial measurement unit (IMU) with acoustic sensing. Our wireless-earphone-based method balances flexibility, computational complexity, and tracking accuracy, requiring only an IMU-equipped earphone and a smartphone. Two challenges remain. First, wireless earphones have limited hardware resources, making Doppler-effect-based methods unsuitable for acoustic tracking. Second, traditional Kalman-filter-based trajectory restoration can introduce significant cumulative error. To tackle these challenges, we recover the trajectory from IMU sensor data and use the smartphone to emit "inaudible" acoustic signals, received by the earphone, to correct the IMU's drifting track. Extensive experiments with 50 volunteers across a variety of potential IA-Track usage scenarios demonstrate that the system achieves satisfactory head motion tracking performance.
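The core idea the abstract describes, integrating IMU data for the trajectory and using acoustic measurements to rein in accumulated drift, can be illustrated with a minimal 1-D sketch. This is not the paper's actual fusion algorithm; the function name, the fixed correction gain, and the periodic-fix schedule are all illustrative assumptions.

```python
def fuse_imu_acoustic(accel, dt, acoustic_pos, acoustic_every, gain=0.5):
    """Dead-reckon 1-D position from IMU acceleration samples and
    periodically pull the estimate toward an acoustic range fix to
    bound integration drift. All parameters here are illustrative
    simplifications, not the paper's actual method."""
    pos, vel, track = 0.0, 0.0, []
    for k, a in enumerate(accel):
        vel += a * dt                    # integrate acceleration -> velocity
        pos += vel * dt                  # integrate velocity -> position (drifts)
        if k % acoustic_every == 0:      # an acoustic fix is available this step
            pos += gain * (acoustic_pos[k] - pos)  # complementary correction
        track.append(pos)
    return track
```

Even a small constant accelerometer bias makes pure double integration drift quadratically in time, while the periodic acoustic corrections keep the estimate bounded, which is the motivation for combining the two modalities.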
