Human activity classification from wearable devices with cameras

Most approaches to human activity classification rely on accelerometer sensors or on cameras installed in the environment; comparatively little work uses egocentric video. Accelerometer-only systems, although computationally efficient, are limited in the variety and complexity of the activities they can detect. For instance, accelerometer data can reveal that the user has sat down, but not whether they sat on a chair or a sofa, or what kind of environment they are in. To detect activities with richer detail and context, we present a robust and autonomous method that combines accelerometer and egocentric video data from a smartphone. A multi-class Support Vector Machine (SVM) classifies activities from accelerometer data and optical flow vectors. Objects in the scene are detected in the camera data with an Aggregate Channel Features (ACF) based detector, and a second multi-class SVM detects when the user approaches different objects. A Hidden Markov Model (HMM) then combines these outputs to recognize more complex activities. In experiments with subjects sitting on chairs, sitting on sofas, and walking through doorways, the proposed method achieves overall precision and recall rates of 95% and 89%, respectively.

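As a rough illustration of the pipeline described above, the sketch below chains a multi-class SVM over windowed accelerometer-plus-optical-flow features with Viterbi decoding over an HMM that smooths the per-window labels into an activity sequence. This is a minimal sketch, not the paper's implementation: the feature choices, window size, activity labels, and all HMM parameters here are illustrative assumptions, and the ACF object detector and approach-detection SVM are omitted.

```python
# Illustrative sketch (not the paper's code): multi-class SVM over windowed
# accelerometer + optical-flow features, then Viterbi decoding over an HMM
# whose observations are the SVM's per-window predicted labels.
import numpy as np
from sklearn.svm import SVC

def window_features(accel, flow, win=50):
    """Hypothetical feature extractor: per-window accelerometer statistics
    concatenated with per-window optical-flow statistics."""
    feats = []
    for s in range(0, len(accel) - win + 1, win):
        a = accel[s:s + win]          # (win, 3) accelerometer samples
        f = flow[s:s + win]           # (win, 2) mean flow vector per frame
        feats.append(np.concatenate([
            a.mean(axis=0), a.std(axis=0),   # accelerometer stats
            f.mean(axis=0), f.std(axis=0),   # optical-flow stats
        ]))
    return np.array(feats)

def viterbi(obs, start, trans, emit):
    """Standard Viterbi decoding; observations are SVM label indices."""
    logd = np.log(start) + np.log(emit[:, obs[0]])
    back = []
    for o in obs[1:]:
        # scores[i, j] = best log-prob of being in state i then moving to j
        scores = logd[:, None] + np.log(trans) + np.log(emit[:, o])[None, :]
        back.append(scores.argmax(axis=0))
        logd = scores.max(axis=0)
    path = [int(logd.argmax())]
    for b in reversed(back):
        path.append(int(b[path[-1]]))
    return path[::-1]

# Toy training data standing in for labelled recordings (X, y).
rng = np.random.default_rng(0)
accel = rng.normal(size=(1000, 3))
flow = rng.normal(size=(1000, 2))
X = window_features(accel, flow)
y = rng.integers(0, 3, size=len(X))  # 0=walk, 1=sit-chair, 2=sit-sofa (toy labels)
svm = SVC(kernel="rbf").fit(X, y)    # SVC is multi-class via one-vs-one

# Smooth the noisy per-window predictions with an HMM (assumed parameters:
# self-transitions are likely, and the SVM label usually matches the state).
obs = svm.predict(X)
start = np.full(3, 1 / 3)
trans = np.array([[.90, .05, .05], [.05, .90, .05], [.05, .05, .90]])
emit  = np.array([[.80, .10, .10], [.10, .80, .10], [.10, .10, .80]])
smoothed = viterbi(obs, start, trans, emit)
```

In this sketch the HMM's role is purely temporal smoothing: isolated misclassified windows are overridden when the transition model makes a state change unlikely, which mirrors how the abstract describes using an HMM on top of the per-window SVM outputs to recover longer, more complex activities.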