Detecting Driver’s Smartphone Usage via Nonintrusively Sensing Driving Dynamics

In this paper, we address the critical task of dynamically detecting the simultaneous behavior of driving and texting, using the smartphone itself as the sensor. We propose, design, and implement TEXIVE, which detects texting operations during driving by exploiting the irregularities and rich micro-movements of users. Without relying on external infrastructure or additional devices, and without requiring any modification to the vehicle, TEXIVE successfully detects these dangerous operations with good sensitivity, specificity, and accuracy by leveraging the inertial sensors integrated in regular smartphones. To validate our approach, we conduct extensive experiments involving a number of volunteers, vehicles, and smartphones. Our evaluation results show that TEXIVE achieves a classification accuracy of 87.18% and a precision of 96.67%.
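To illustrate the general idea of classifying driving-and-texting behavior from smartphone inertial readings, the following is a minimal sketch, not the authors' implementation: it assumes windowed statistical features over accelerometer and gyroscope samples and a Gaussian Naive Bayes classifier, and the helper names (`WINDOW`, `extract_features`, `make_synthetic_windows`) and the synthetic data are purely hypothetical stand-ins for real labeled sensor logs.

```python
# Illustrative sketch only (assumptions, not the TEXIVE implementation):
# windowed statistics of smartphone inertial data -> Naive Bayes classifier
# separating "texting while driving" from "other" windows.

import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, precision_score

WINDOW = 128  # samples per window (assumed ~2.5 s at 50 Hz)

def extract_features(window):
    """Per-axis mean, standard deviation, and peak magnitude of a
    (WINDOW, 6) array of accelerometer + gyroscope samples."""
    return np.concatenate([window.mean(axis=0),
                           window.std(axis=0),
                           np.abs(window).max(axis=0)])

def make_synthetic_windows(n, texting):
    """Stand-in data: texting windows get extra high-frequency jitter to
    mimic micro-movements; replace with real labeled sensor recordings."""
    rng = np.random.default_rng(0 if texting else 1)
    base = rng.normal(0.0, 0.2, size=(n, WINDOW, 6))
    if texting:
        base += rng.normal(0.0, 0.6, size=(n, WINDOW, 6))
    return base

# Build a labeled feature matrix: 0 = not texting, 1 = texting while driving.
X = np.array([extract_features(w)
              for label in (0, 1)
              for w in make_synthetic_windows(200, texting=bool(label))])
y = np.array([0] * 200 + [1] * 200)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=42)

clf = GaussianNB().fit(X_train, y_train)
pred = clf.predict(X_test)
print("accuracy:", accuracy_score(y_test, pred))
print("precision:", precision_score(y_test, pred))
```

In practice, the features, window length, and classifier would be tuned on real in-vehicle recordings; the sketch only shows the overall pipeline of feature extraction followed by supervised classification and evaluation with accuracy and precision.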
