Egocentric Real-time Workspace Monitoring using an RGB-D camera

We describe an integrated system for personal workspace monitoring built around an RGB-D sensor. The approach is egocentric: the sensor is worn by the user, so monitoring moves with them and is not tied to a fixed installation. The system operates in real time, providing object detection and recognition together with 3D trajectory estimation whilst the user undertakes tasks in the workspace. A prototype on-body system, developed in the context of workflow analysis for industrial manipulation and assembly tasks, is described. The system is evaluated on two tasks with multiple users, and the results indicate that the method is effective, achieving good detection and recognition accuracy.