Innovative technologies for the creative industries: Advanced human-machine interfaces for dynamic performance effects

Technical advances in the performing arts and television production have focused on technologies such as sound, special effects, and projection. Recent developments in computer vision have made gesture and facial expression recognition possible. What is missing is a means of linking these technologies and placing them under the direct control of the performer. Furthermore, performers have limited capacity to interpret and respond to their audience. This research aims to develop new gesture and facial expression recognition technologies that enable completely new forms of performance, in which performers directly influence the physical environment and respond to audience reaction. In this article, we focus on the gesture recognition system developed for this purpose and present a novel recognition approach. We discuss the gesture modelling technique used and its main features. Experiments show an accuracy of over 98.71%, making the gesture recognizer suitable for our innovative stage interface.
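The abstract does not spell out the recognition pipeline itself. Purely as an illustrative sketch of the kind of vision-based loop such a stage interface might build on, and not the authors' published method, the listing below combines hue-based skin segmentation, centroid tracking of the largest skin-coloured blob, and a toy trajectory classifier. The hue band, the 30-frame window, and the gesture labels are all assumptions introduced here for demonstration, and OpenCV's Python bindings are assumed to be available.

# Illustrative sketch only: a generic vision-based gesture loop, not the
# method reported in the article. Thresholds and labels are assumptions.
import cv2
import numpy as np

HUE_LOW, HUE_HIGH = 0, 20  # assumed skin-hue band in HSV space

def skin_mask(frame_bgr):
    """Segment candidate skin pixels by hue thresholding."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    lower = np.array((HUE_LOW, 40, 60), dtype=np.uint8)
    upper = np.array((HUE_HIGH, 255, 255), dtype=np.uint8)
    return cv2.inRange(hsv, lower, upper)

def hand_centroid(mask):
    """Return the centroid of the largest skin-coloured blob, if any."""
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)
    m = cv2.moments(largest)
    if m["m00"] == 0:
        return None
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])

def classify(trajectory):
    """Toy classifier: label the dominant direction of hand motion."""
    if len(trajectory) < 2:
        return "none"
    dx = trajectory[-1][0] - trajectory[0][0]
    dy = trajectory[-1][1] - trajectory[0][1]
    if abs(dx) > abs(dy):
        return "swipe-right" if dx > 0 else "swipe-left"
    return "swipe-down" if dy > 0 else "swipe-up"

cap = cv2.VideoCapture(0)  # default camera
trajectory = []
while True:
    ok, frame = cap.read()
    if not ok:
        break
    centroid = hand_centroid(skin_mask(frame))
    if centroid is not None:
        trajectory.append(centroid)
    if len(trajectory) == 30:  # classify every 30 tracked frames (assumed window)
        print(classify(trajectory))
        trajectory.clear()
cap.release()

The sketch is meant only to make the segmentation-tracking-classification flow concrete; the modelling technique and the reported accuracy of over 98.71% refer to the article's own, more elaborate recognizer.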
