VPad: Virtual Writing Tablet for Laptops Leveraging Acoustic Signals

Human-computer interaction based on touch screens plays an increasingly important role in our daily lives. Besides smartphones and tablets, laptops are among the most popular mobile devices used in both work and leisure. To satisfy the requirements of many emerging applications, it is desirable to support writing and drawing directly on laptop screens. In this paper, we design a virtual writing tablet system, VPad, for traditional laptops without touch screens. VPad leverages two speakers and one microphone, which are available in most commodity laptops, to track hand trajectories without additional hardware. It employs acoustic signals to accurately track hand movements and recognize characters that users write in the air. Specifically, VPad emits inaudible acoustic signals from the two speakers of a laptop. It then applies a Sliding-window Overlap Fourier Transformation technique to extract the Doppler frequency shift with high resolution and accuracy in real time. Furthermore, we analyze the frequency shifts and energy features of the acoustic signals received by the microphone to track the trajectory of hand movements. Finally, we employ a stroke direction sequence model based on probability estimation to recognize the characters users write in the air. Our experimental results show that VPad achieves an average trajectory tracking error of only 1.55 cm and a character recognition accuracy above 90%, using merely two speakers and one microphone on a laptop.
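To make the Doppler-shift step concrete, the sketch below shows one way to run a sliding-window overlapped FFT over the microphone stream and pick the dominant frequency near an emitted pilot tone. The sampling rate, tone frequency, window length, and hop size here are illustrative assumptions, not VPad's actual configuration.

```python
import numpy as np

def doppler_shift_stft(samples, fs=48000, f_tone=20000, win=4096, hop=512):
    """Estimate the Doppler shift of a pilot tone over time using
    overlapping FFT windows (sliding-window overlap FFT).

    samples : 1-D array of microphone samples
    fs      : sampling rate in Hz (assumed)
    f_tone  : frequency of the emitted inaudible tone in Hz (assumed)
    win     : FFT window length in samples
    hop     : step between successive windows; win - hop is the overlap
    Returns one estimated frequency shift (Hz) per window.
    """
    window = np.hanning(win)
    freqs = np.fft.rfftfreq(win, d=1.0 / fs)
    # Only search a narrow band around the pilot tone.
    band = (freqs > f_tone - 500) & (freqs < f_tone + 500)
    shifts = []
    for start in range(0, len(samples) - win, hop):
        frame = samples[start:start + win] * window
        spectrum = np.abs(np.fft.rfft(frame))
        peak = freqs[band][np.argmax(spectrum[band])]
        shifts.append(peak - f_tone)
    return np.array(shifts)
```

Each estimated shift Δf maps to a radial velocity of the hand relative to a speaker via v = c·Δf/f_tone (with c the speed of sound), and accumulating these velocities over the window hops gives the displacement used for trajectory tracking.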
