e-Shoes: Smart shoes for unobtrusive human activity recognition

Many approaches to human activity recognition, such as wearable-based or computer-vision-based methods, are obtrusive in the sense that they prevent users from performing activities in a natural way, or they may raise privacy concerns. This paper presents e-Shoes, smart shoes for unobtrusive human activity recognition. E-Shoes are shoes instrumented with tiny wireless accelerometers embedded inside the insoles. The sensors are imperceptible to the wearer, making the system suitable for recognizing everyday activities. To analyze the sensor signals, we propose a convolutional neural network (CNN) model that automatically learns features from the sensing data and predicts the activity being performed. We verify the effectiveness of the approach on a real dataset covering seven daily activities. The system achieved an average accuracy of 93%, a promising result, while remaining energy-efficient and easy to use.
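The abstract does not detail the CNN architecture, so the following is a minimal sketch of how such a window-based classifier might look, assuming fixed-length windows of tri-axial accelerometer data from both insoles (6 channels) and the seven activity classes mentioned above; the window length, layer sizes, and choice of PyTorch are illustrative assumptions, not the authors' implementation.

    # Hypothetical sketch of a 1D CNN for window-based activity recognition
    # from insole accelerometer data. Window length, layer sizes, and the use
    # of PyTorch are assumptions for illustration, not taken from the paper.
    import torch
    import torch.nn as nn

    NUM_CHANNELS = 6    # assumed: a 3-axis accelerometer in each of the two insoles
    WINDOW_LEN = 128    # assumed: samples per sliding window
    NUM_CLASSES = 7     # seven daily activities, as stated in the abstract

    class EShoesCNN(nn.Module):
        def __init__(self):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv1d(NUM_CHANNELS, 32, kernel_size=5, padding=2),
                nn.ReLU(),
                nn.MaxPool1d(2),
                nn.Conv1d(32, 64, kernel_size=5, padding=2),
                nn.ReLU(),
                nn.MaxPool1d(2),
            )
            self.classifier = nn.Sequential(
                nn.Flatten(),
                nn.Linear(64 * (WINDOW_LEN // 4), 128),
                nn.ReLU(),
                nn.Dropout(0.5),
                nn.Linear(128, NUM_CLASSES),
            )

        def forward(self, x):
            # x: (batch, NUM_CHANNELS, WINDOW_LEN) windows of accelerometer samples
            return self.classifier(self.features(x))

    if __name__ == "__main__":
        model = EShoesCNN()
        dummy = torch.randn(8, NUM_CHANNELS, WINDOW_LEN)  # a batch of 8 windows
        logits = model(dummy)                             # shape: (8, NUM_CLASSES)
        print(logits.shape)

In practice, the per-window class scores would be produced by convolutional feature extraction over the raw accelerometer signal rather than hand-crafted features, which is the property the abstract attributes to the proposed model.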
