Fall Recovery Subactivity Recognition With RGB-D Cameras

Accidental falls have been identified as a cause of mortality among older adults who live alone worldwide. Following a fall, additional injury can be sustained if proper fall recovery techniques are not followed. These secondary complications can be reduced if the person has access to safe recovery procedures or is assisted, either by another person or by a robot. We propose a framework for in situ robotic assistance in post-fall recovery scenarios. To assist autonomously, a robot needs to recognize an individual's posture and subactivities (e.g., falling, rolling, moving to hands and knees, crawling, pushing up through the legs, sitting, or standing). Skeleton tracking with RGB-D pose estimation methods fails to identify body parts during key phases of fall recovery because fallen and recovering postures are heavily occluded. To address this issue, we investigated how low-level image features can be leveraged to recognize an individual's subactivities. We improved the depth cuboid similarity feature (DCSF) approach with M-partitioned histograms of depth cuboid prototypes, integration of the direction of activity progression, and removal of outlier spatiotemporal interest points. Our modified DCSF algorithm was evaluated on a unique multiview RGB-D dataset, achieving 87.43 ± 1.74% accuracy over all 3003 (C(15,10)) training/test combinations of the 15 subjects across 10 trials. This accuracy was significantly higher than that of the nearest competing method, and training was faster. This work could lead to more accurate in situ robotic assistance for fall recovery, potentially saving the lives of fall victims.
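To make the encoding concrete, the sketch below illustrates the general idea of a bag-of-prototypes pipeline with M temporal partitions and simple outlier interest-point removal; it is not the authors' exact implementation. The function names, the median-distance outlier test, and the use of scikit-learn's KMeans codebook and SVM classifier are illustrative assumptions; inputs are assumed to be per-clip spatiotemporal interest points (x, y, t) with one depth-cuboid descriptor per point.

# Illustrative sketch (assumptions noted above), not the published pipeline.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVC

def remove_outlier_stips(points, descriptors, thresh=2.5):
    """Drop interest points far from the spatial median (simple robust outlier test)."""
    xy = points[:, :2].astype(float)
    dist = np.linalg.norm(xy - np.median(xy, axis=0), axis=1)
    mad = np.median(np.abs(dist - np.median(dist))) + 1e-6
    keep = (dist - np.median(dist)) / mad < thresh
    return points[keep], descriptors[keep]

def m_partitioned_histogram(points, descriptors, codebook, M=4):
    """Assign descriptors to cuboid prototypes, then histogram them in M partitions
    ordered along the activity progression direction (here approximated by time)."""
    labels = codebook.predict(descriptors)
    t = points[:, 2].astype(float)
    t_norm = (t - t.min()) / (t.max() - t.min() + 1e-6)   # normalize time to [0, 1)
    parts = np.minimum((t_norm * M).astype(int), M - 1)   # partition index per STIP
    K = codebook.n_clusters
    hist = np.zeros(M * K)
    for p, l in zip(parts, labels):
        hist[p * K + l] += 1
    return hist / (hist.sum() + 1e-6)

def train(clips, y, K=64, M=4):
    """clips: list of (points, descriptors) pairs per video clip; y: subactivity labels."""
    cleaned = [remove_outlier_stips(p, d) for p, d in clips]
    all_desc = np.vstack([d for _, d in cleaned])
    codebook = KMeans(n_clusters=K, n_init=10).fit(all_desc)      # depth cuboid prototypes
    X = np.array([m_partitioned_histogram(p, d, codebook, M) for p, d in cleaned])
    clf = SVC(kernel="rbf").fit(X, y)                             # subactivity classifier
    return codebook, clf

In this sketch the M temporal partitions preserve the order in which subactivities unfold within a clip, which is what distinguishes the representation from a single unordered histogram of prototypes.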
