Human Posture Recognition and Fall Detection Using the Kinect V2 Camera

Based on the depth information and skeleton-tracking technology of the Microsoft Kinect v2 sensor, this paper performs human posture recognition and, building on it, realizes human fall detection. First, the depth data from the Kinect v2 sensor are used to extract the human joint positions produced by the skeleton tracker. An optimized BP neural network then classifies the posture, and falls are detected on this basis. The network is trained on a dataset generated with the Kinect tracker but tested with a different body tracker (NITE). Finally, posture recognition and fall detection were verified experimentally, with real-time testing over the sensor's entire working range (up to 3.5 meters). In these experiments, the overall fall-detection accuracy with the NITE tracker was 98.5%, and the worst-case accuracy was 97.3%.
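The following is a minimal sketch, not the authors' implementation, of how a BP (backpropagation) neural network could classify a posture from a flattened vector of skeleton joint coordinates. The 25-joint Kinect v2 layout, the four posture classes, the hidden-layer size, and the synthetic training data are all illustrative assumptions.

# Minimal BP neural network sketch for posture classification (illustrative only).
# Assumptions: 25 Kinect v2 joints x 3 coordinates as input, 4 hypothetical
# posture classes (e.g. stand, sit, bend, lie); real joint data would come
# from a skeleton tracker, here replaced by synthetic stand-ins.
import numpy as np

rng = np.random.default_rng(0)

N_JOINTS, N_CLASSES, HIDDEN = 25, 4, 32
IN_DIM = N_JOINTS * 3  # x, y, z per joint

# One-hidden-layer MLP weights
W1 = rng.normal(0, 0.1, (IN_DIM, HIDDEN))
b1 = np.zeros(HIDDEN)
W2 = rng.normal(0, 0.1, (HIDDEN, N_CLASSES))
b2 = np.zeros(N_CLASSES)

def forward(X):
    h = np.tanh(X @ W1 + b1)                     # hidden activations
    logits = h @ W2 + b2
    p = np.exp(logits - logits.max(axis=1, keepdims=True))
    return h, p / p.sum(axis=1, keepdims=True)   # softmax probabilities

def train_step(X, Y, lr=0.05):
    """One backpropagation step on cross-entropy loss."""
    global W1, b1, W2, b2
    h, p = forward(X)
    d_logits = (p - Y) / len(X)                  # gradient w.r.t. logits
    dW2 = h.T @ d_logits
    db2 = d_logits.sum(axis=0)
    d_h = d_logits @ W2.T * (1 - h ** 2)         # tanh derivative
    dW1 = X.T @ d_h
    db1 = d_h.sum(axis=0)
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

# Synthetic stand-in for joint vectors produced by a skeleton tracker
X = rng.normal(size=(200, IN_DIM))
y = rng.integers(0, N_CLASSES, 200)
Y = np.eye(N_CLASSES)[y]
for _ in range(500):
    train_step(X, Y)
pred = forward(X)[1].argmax(axis=1)
print("training accuracy:", (pred == y).mean())

In the paper's setting, falls would then be flagged from the sequence of predicted postures (for example, a transition into a lying posture within a short time window), rather than from a single frame.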
