Human Movement Modeling and Activity Perception Based on Fiber-Optic Sensing System

This paper presents a flexible fiber-optic sensor-based pressure sensing system for human activity analysis and situation perception in indoor environments. In this system, a binary sensing technology is applied to reduce the data workload, and a bipedal movement-based space encoding scheme is designed to capture people's geometric information. We also develop a nonrepetitive encoding scheme to eliminate the ambiguity caused by the two-foot structure of bipedal movement. Furthermore, we propose an invariant activity representation model based on trajectory segments and their statistical distributions. In addition, a mixture model is applied to represent scenarios, and the number of subjects is determined by the Bayesian information criterion. A Bayesian network and regions of interest are employed to facilitate the perception of interactions and situations. The results are obtained using distribution divergence estimation, expectation-maximization, and Bayesian network inference methods. In the experiments, we simulated an office environment and tested walking, working, resting, and talking activities for both one-person and two-person cases. The experimental results demonstrate that the average individual activity recognition rate is higher than 90% and the situation perception rate reaches 80%.
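The subject-counting step described above can be illustrated with a minimal sketch: mixture models with different component counts are fit to the sensor-derived features (expectation-maximization runs inside each fit), and the Bayesian information criterion selects the number of subjects. The sketch below uses a Gaussian mixture as a stand-in for the paper's scenario model; the feature matrix `footstep_features` and the bound `max_subjects` are illustrative assumptions, not details taken from the paper.

```python
# A minimal sketch (not the authors' implementation) of selecting the number of
# subjects via mixture modeling and the Bayesian information criterion (BIC).
# `footstep_features` is a hypothetical stand-in for the trajectory-segment
# statistics extracted from the binary fiber-optic pressure readings.

import numpy as np
from sklearn.mixture import GaussianMixture

def estimate_subject_count(footstep_features: np.ndarray, max_subjects: int = 4) -> int:
    """Fit Gaussian mixtures with 1..max_subjects components and return the
    component count that minimizes BIC."""
    best_k, best_bic = 1, np.inf
    for k in range(1, max_subjects + 1):
        gmm = GaussianMixture(n_components=k, covariance_type="full", random_state=0)
        gmm.fit(footstep_features)          # EM runs inside fit()
        bic = gmm.bic(footstep_features)    # lower BIC = better fit/complexity trade-off
        if bic < best_bic:
            best_k, best_bic = k, bic
    return best_k

# Illustrative usage with synthetic 2-D "foot position" samples from two walkers.
rng = np.random.default_rng(0)
samples = np.vstack([
    rng.normal(loc=[1.0, 1.0], scale=0.2, size=(200, 2)),
    rng.normal(loc=[3.0, 2.5], scale=0.2, size=(200, 2)),
])
print(estimate_subject_count(samples))  # expected to print 2
```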
