Activity recognition from a wearable camera

This paper proposes a novel activity recognition approach based on video data obtained from a wearable camera. The objective is to recognise the user's activities from a small front-facing camera embedded in the user's glasses. Our system allows carers to remotely access the current status of a specified person, and can be broadly applied to people living with disabilities, including elderly people who require cognitive assistance or guidance for daily activities. We collected, trained and tested our system on videos recorded in different environmental settings. Sequences of four basic activities (drinking, walking, going upstairs and going downstairs) are tested and evaluated in challenging real-world scenarios. Optical flow is used as our primary feature extraction method; we downsize and reformat the flow fields, then classify sequences of activities using the k-Nearest Neighbour algorithm (k-NN), LogitBoost (on decision stumps) and a Support Vector Machine (SVM). We determine the optimal settings of these classifiers through cross-validation and achieve accuracies of 54.2% to 71.9%. Further smoothing using a Hidden Markov Model (HMM) improves the results to 68.5%-82.1%.
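The pipeline described above (per-frame motion features, frame-wise classification, then temporal smoothing) can be illustrated with a minimal sketch. This is not the authors' implementation: the feature vectors here are synthetic 2-D stand-ins for downsized optical-flow fields (in practice they would come from a method such as Lucas-Kanade flow), the class centres and HMM transition/emission parameters are illustrative assumptions, and only the k-NN classifier with Viterbi smoothing is shown.

```python
import numpy as np

rng = np.random.default_rng(0)

# Four activity classes, as in the paper. The 2-D "features" below are a
# hypothetical stand-in for mean horizontal/vertical optical-flow magnitudes.
ACTIVITIES = ["drinking", "walking", "upstairs", "downstairs"]
centres = np.array([[0.1, 0.1], [1.0, 0.2], [0.8, 1.0], [0.8, -1.0]])

def make_frames(label, n):
    """Generate n noisy per-frame feature vectors for one activity."""
    return centres[label] + 0.3 * rng.standard_normal((n, 2))

# Labelled training set: 50 frames per activity.
X_train = np.vstack([make_frames(k, 50) for k in range(4)])
y_train = np.repeat(np.arange(4), 50)

def knn_predict(x, k=5):
    """Classify one feature vector with plain k-NN (majority vote)."""
    d = np.linalg.norm(X_train - x, axis=1)
    votes = y_train[np.argsort(d)[:k]]
    return np.bincount(votes, minlength=4).argmax()

# A test sequence of 30 "walking" frames; noise causes some mislabels.
seq = make_frames(1, 30)
raw = np.array([knn_predict(x) for x in seq])

# HMM smoothing: Viterbi decoding with a "sticky" transition matrix that
# penalises rapid activity switches. Parameters are illustrative only.
stay = 0.9
A = np.full((4, 4), (1 - stay) / 3)   # transition probabilities
np.fill_diagonal(A, stay)
B = np.full((4, 4), 0.1)              # P(observed label | true activity)
np.fill_diagonal(B, 0.7)

def viterbi(obs):
    """Most likely hidden activity sequence given frame-wise labels."""
    logA, logB = np.log(A), np.log(B)
    T = len(obs)
    dp = np.log(np.full(4, 0.25)) + logB[:, obs[0]]
    back = np.zeros((T, 4), dtype=int)
    for t in range(1, T):
        scores = dp[:, None] + logA
        back[t] = scores.argmax(axis=0)
        dp = scores.max(axis=0) + logB[:, obs[t]]
    path = [int(dp.argmax())]
    for t in range(T - 1, 0, -1):
        path.append(int(back[t, path[-1]]))
    return path[::-1]

smooth = viterbi(raw)
```

With a sticky transition prior, isolated frame-wise misclassifications are cheaper to explain as emission noise than as genuine activity switches, so the smoothed sequence is more stable than the raw k-NN output, mirroring the accuracy gain the paper reports from HMM smoothing.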
