Action recognition for human-marionette interaction

In this paper, we propose a human-marionette interaction system based on a human action recognition approach, with applications to interactive artistic puppetry and a mimicking-marionette game. We developed an intelligent marionette, the "i-marionette," driven by a sophisticated control device that can perform various human actions. Moreover, we employ an action recognition approach that enables the i-marionette to learn and recognize complex dance movements. The artistic puppetry presents a conflict between two cultural worlds: the performer is active, represents the culture of modern technology, and inhabits the real world, whereas the i-marionette is passive, represents traditional culture, and inhabits a virtual world. The active performer guides the passive i-marionette, forming a space-time connection between the real and virtual worlds. The i-marionette mimics the performer's actions, while the performer in turn mimics the i-marionette's actions. The performance embodies an artistic conception in which humans invent technology and the i-marionette moves under human control; yet within this interaction loop, the human is also implicitly influenced by the i-marionette. In the mimicking-marionette game, a player imitates the i-marionette's action; our human action recognition system then measures the similarity between the player's action and the i-marionette's action and reports a similarity score.
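The abstract does not specify how the similarity score is computed. As a minimal sketch only, one common approach for comparing two pose sequences of different lengths is dynamic time warping (DTW) over per-frame skeleton features, mapped onto a bounded score; the feature choice, scoring rule, and `scale` parameter below are illustrative assumptions, not the authors' method:

```python
import math

def dtw_distance(seq_a, seq_b):
    """Dynamic time warping distance between two pose sequences.

    Each sequence is a list of per-frame feature vectors (e.g., joint
    coordinates or joint angles). DTW aligns the sequences non-linearly
    in time, so the player need not match the marionette's tempo exactly.
    """
    n, m = len(seq_a), len(seq_b)
    inf = float("inf")
    # dp[i][j] = minimal cost of aligning seq_a[:i] with seq_b[:j]
    dp = [[inf] * (m + 1) for _ in range(n + 1)]
    dp[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = math.dist(seq_a[i - 1], seq_b[j - 1])  # per-frame distance
            dp[i][j] = cost + min(dp[i - 1][j],      # skip a player frame
                                  dp[i][j - 1],      # skip a marionette frame
                                  dp[i - 1][j - 1])  # match both frames
    return dp[n][m]

def similarity_score(player_seq, marionette_seq, scale=1.0):
    """Map the length-normalized DTW distance to a (0, 100] score.

    The 100 / (1 + scale * d) mapping is a hypothetical scoring rule:
    identical sequences score 100, and the score decays toward 0 as the
    motions diverge.
    """
    d = dtw_distance(player_seq, marionette_seq)
    d /= max(len(player_seq), len(marionette_seq))
    return 100.0 / (1.0 + scale * d)
```

In practice the per-frame feature vectors would come from a skeleton tracker (e.g., joint positions normalized for body size and position), and `scale` would be tuned so that scores spread usefully across typical player performances.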
