Real-time Application for Monitoring Human Daily Activity and Risk Situations in Robot-Assisted Living

In this work, we present a real-time application for human daily activity recognition in robot-assisted living, extending our previous work [1]. We implemented our approach in the Robot Operating System (ROS) environment, combining different modules that enable a robot to perceive its environment through multiple sensor modalities. The robot can thus move around, detect, track, and follow a person in order to monitor daily activities wherever the person is. We focus mainly on the robotic application, integrating ROS modules for navigation, activity recognition, and decision making. Reported results show that our framework accurately recognizes human activities in real time, triggering proper robot (re)actions, including spoken feedback for warnings and/or appropriate robot navigation tasks. The results evidence the potential of our approach for robot-assisted living applications.
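
To make the module integration concrete, the sketch below shows how a decision-making node of this kind could be wired in ROS. It is a minimal illustration in Python (rospy), not the paper's actual implementation: the topic names /activity/label and /speech/say, the use of std_msgs/String for activity labels, and the set of risk-related activities are all assumptions made for the example.

```python
#!/usr/bin/env python
# Minimal ROS node sketch: listen for recognized activity labels and
# publish a spoken warning when a risk-related activity is detected.
# Topic names, message type, and the risk set are illustrative assumptions.
import rospy
from std_msgs.msg import String

# Hypothetical set of activities treated as risk situations.
RISK_ACTIVITIES = {"falling", "fainting"}

def on_activity(msg):
    # msg.data holds the activity label produced by the recognition module.
    if msg.data in RISK_ACTIVITIES:
        # Forward a warning to a text-to-speech node listening on /speech/say;
        # a full system could also send a navigation goal toward the person.
        warning_pub.publish(String(data="Warning: %s detected" % msg.data))

if __name__ == "__main__":
    rospy.init_node("risk_monitor")
    warning_pub = rospy.Publisher("/speech/say", String, queue_size=1)
    rospy.Subscriber("/activity/label", String, on_activity)
    rospy.spin()  # process incoming labels until the node is shut down
```

In a setup like this, the activity recognition module publishes one label per classified time window, and the navigation stack (e.g. ROS move_base) can be given a goal near the tracked person when a warning is raised.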

[1] J. Konrad et al., "Action recognition using log-covariance matrices of silhouette and optical-flow features," 2012.

[2] C. Premebida et al., "A probabilistic approach for human everyday activities recognition using body motion from RGB-D images," in The 23rd IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN), 2014.

[3] N. Ayache et al., "Log-Euclidean metrics for fast and simple calculus on diffusion tensors," Magnetic Resonance in Medicine, 2006.

[4] B. Selman et al., "Unstructured human activity detection from RGBD images," in 2012 IEEE International Conference on Robotics and Automation (ICRA), 2012.

[5] J. K. Aggarwal et al., "Spatio-temporal Depth Cuboid Similarity Feature for Activity Recognition Using Depth Camera," in 2013 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2013.

[6] W. Sheng et al., "Human daily activity recognition in robot-assisted living using multi-sensor fusion," in 2009 IEEE International Conference on Robotics and Automation (ICRA), 2009.

[7] G. Guo et al., "Evaluating spatiotemporal interest point features for depth-based action recognition," Image and Vision Computing, 2014.

[8] K. Berns et al., "Methodology for robot mapping and navigation in assisted living environments," in PETRA '09, 2009.

[9] C.-J. Lin et al., "LIBSVM: A library for support vector machines," ACM Transactions on Intelligent Systems and Technology (TIST), 2011.

[10] P. Daras et al., "Real-Time Skeleton-Tracking-Based Human Action Recognition Using Kinect Data," in MMM, 2014.

[11] W. Sheng et al., "Realtime human daily activity recognition through fusion of motion and location data," in The 2010 IEEE International Conference on Information and Automation, 2010.

[12] H. S. Koppula et al., "Learning human activities and object affordances from RGB-D videos," International Journal of Robotics Research, 2012.

[13] C. Premebida et al., "Probabilistic human daily activity recognition towards robot-assisted living," in 2015 24th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN), 2015.

[14] N. Kehtarnavaz et al., "Real-time human action recognition based on depth motion maps," Journal of Real-Time Image Processing, 2013.

[15] H.-M. Groß et al., "Real-time activity recognition on a mobile companion robot," 2010.