SmartGrip: grip sensing system for commodity mobile devices through sound signals

Although many studies have attempted to detect the hand posture with which a user holds a mobile device and to exploit these postures as a user interface, they either require additional hardware or can differentiate only a limited number of grips, and only when a touch event occurs on the device's screen. In this paper, we propose a novel grip sensing system, called SmartGrip, which allows a mobile device to detect different hand postures without any additional hardware or screen touch events. SmartGrip emits carefully designed sound signals and differentiates the propagated signals as they are distorted by different user grips. To achieve this, we analyze how a sound signal propagates from the speaker to the microphone of a mobile device and then address three key challenges: sound structure design, volume control, and feature extraction and classification. We implement and evaluate SmartGrip on three Android mobile devices. For six representative grips, SmartGrip achieves an average accuracy of 93.1% for ten users in an office environment, and operates with 83.5% to 98.3% accuracy in six different (noisy) locations. To further demonstrate the feasibility of SmartGrip as a user interface, we develop an Android application that exploits it, validating its practical use.
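The abstract only names the pipeline stages (probe emission, propagation, feature extraction, classification). The sketch below illustrates what such a pipeline could look like; everything concrete in it is an assumption for illustration rather than the paper's actual design: the 18-20 kHz probe band, the 48 kHz sampling rate, the linear-chirp probe, the FFT band-magnitude features, the toy grip-damping model, and the RBF-kernel SVM classifier (the abstract says only "feature extraction and classification").

```python
# Illustrative sketch of a SmartGrip-style pipeline: emit a probe sound,
# record it through the microphone, extract spectral features, classify grip.
# All concrete parameters below are assumptions, not the paper's design.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

FS = 48_000                   # assumed sampling rate (Hz)
F_LO, F_HI = 18_000, 20_000   # assumed near-inaudible probe band (Hz)

def probe_signal(duration=0.1):
    """Linear chirp sweeping the probe band (the paper designs its own structure)."""
    t = np.arange(int(FS * duration)) / FS
    k = (F_HI - F_LO) / duration
    return np.sin(2 * np.pi * (F_LO * t + 0.5 * k * t ** 2))

def grip_features(recording):
    """Normalized magnitude spectrum within the probe band as a feature vector."""
    spec = np.abs(np.fft.rfft(recording))
    freqs = np.fft.rfftfreq(recording.size, d=1 / FS)
    mags = spec[(freqs >= F_LO) & (freqs <= F_HI)]
    return mags / (np.linalg.norm(mags) + 1e-12)   # factor out playback volume

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0))

if __name__ == "__main__":
    # Synthetic stand-in for real recordings: pretend each grip damps
    # frequencies above a different cutoff (a toy model of hand absorption).
    rng = np.random.default_rng(0)
    probe = probe_signal()
    freqs = np.fft.rfftfreq(probe.size, d=1 / FS)
    X, y = [], []
    for grip, cutoff in enumerate([18_500, 19_000, 19_500]):
        for _ in range(20):
            spec = np.fft.rfft(probe)
            spec[freqs > cutoff] *= 0.3            # grip-dependent damping
            rec = np.fft.irfft(spec, n=probe.size)
            rec += 0.02 * rng.standard_normal(probe.size)
            X.append(grip_features(rec))
            y.append(grip)
    clf.fit(np.stack(X), y)
    print(clf.predict(np.stack(X[:3])))            # sanity check on training data
```

On a real device the probe would be played through the loudspeaker and captured by the microphone (e.g., via Android's AudioTrack and AudioRecord APIs), with features averaged over repeated probes; the toy damping model above merely stands in for how a hand absorbs and reflects acoustic energy.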
