HIGS: Hand Interaction Guidance System

In this paper, we introduce the Hand Interaction Guidance System (HIGS), which automatically authors video guidance from a 'one-shot' expert demonstration recording, decomposes it into step instructions, and delivers feedback according to the user's operation. The system observes hand-object interactions with an RGB-D camera from an egocentric view. In the recording stage, HIGS detects the global positions and moments of hand-object interactions and recovers the 3D hand poses during the operation. In the guidance stage, the recorded guidance video progresses automatically as the user performs the task, and the detected steps are used to assess task progress and completion. The system operates in real time, both when authoring guidance and when monitoring the user and delivering it. We see this work as a step towards fully automated learning and guidance systems in mixed reality settings.
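To make the guidance-stage behavior concrete, the following is a minimal sketch, not the authors' implementation, of how automatic step progression could work: each recorded step is represented as a 3D interaction hotspot, and the guidance video advances when the user's estimated hand position dwells near the active hotspot. The hotspot representation, the proximity radius, and the dwell threshold are all illustrative assumptions; a real system would feed `update` with per-frame 3D hand positions recovered from the RGB-D stream.

```python
# Minimal sketch of guidance-stage step progression (assumed design, not the
# paper's implementation). Recorded steps are 3D interaction hotspots; a step
# is considered performed once the hand dwells within `radius_m` of the
# active hotspot for `dwell_frames` consecutive frames.

import numpy as np

class StepProgressMonitor:
    def __init__(self, hotspots, radius_m=0.05, dwell_frames=15):
        self.hotspots = [np.asarray(h, dtype=float) for h in hotspots]
        self.radius = radius_m        # proximity threshold in metres (assumed)
        self.dwell = dwell_frames     # frames the hand must stay near the hotspot
        self.step = 0                 # index of the active step
        self._count = 0               # consecutive near-hotspot frames

    def update(self, hand_pos):
        """Feed one per-frame 3D hand position; return the active step index."""
        if self.step >= len(self.hotspots):
            return self.step          # all steps done: task complete
        dist = np.linalg.norm(np.asarray(hand_pos, dtype=float) - self.hotspots[self.step])
        if dist < self.radius:
            self._count += 1
            if self._count >= self.dwell:
                self.step += 1        # advance the guidance video to the next step
                self._count = 0
        else:
            self._count = 0           # hand left the hotspot: reset the dwell timer
        return self.step

# Usage with synthetic data: two recorded hotspots; the hand rests at the first.
monitor = StepProgressMonitor(hotspots=[(0.1, 0.0, 0.5), (0.3, 0.1, 0.6)])
for _ in range(20):
    step = monitor.update((0.1, 0.0, 0.5))
print("active step:", step)  # -> 1 after dwelling at the first hotspot
```

A dwell count is used here rather than a single-frame proximity test so that momentary pose-estimation noise does not falsely advance the guidance; the paper's own progression criterion may differ.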
