Learning Movement through Human-Computer Co-Creative Improvisation

Computers that can collaboratively improvise movement with humans could impact a variety of application domains, ranging from improving procedural animation in game environments to fostering human-computer co-creativity. Enabling real-time movement improvisation requires equipping computers with strategies for learning and understanding movement. Most existing research focuses on gesture classification, which does not support the learning of new gestures and thereby limits the creative capacity of computers. In this paper, we explore how to develop a gesture clustering pipeline that facilitates reasoning about arbitrary novel movements in real time. We describe the implementation of this pipeline within the context of LuminAI, a system in which humans improvise movement together with an AI agent. A preliminary evaluation indicates that our pipeline can efficiently cluster similar gestures together, but further work is necessary to fully assess its ability to meaningfully cluster complex movements.
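The abstract does not specify the clustering algorithm used in the LuminAI pipeline, so the sketch below is only an illustration of the general idea it describes: unlike a fixed-class gesture classifier, an online clusterer can create a new group whenever an incoming gesture is unlike anything seen before. The class name, the distance threshold, and the toy 2-D feature vectors are all hypothetical, standing in for whatever pose-derived descriptors a real pipeline would compute.

```python
import numpy as np


class OnlineGestureClusterer:
    """Incrementally groups gesture feature vectors into clusters.

    A new cluster is created whenever an incoming gesture is far from
    every existing centroid, so previously unseen movements can still
    be organized without retraining a fixed-class model.
    """

    def __init__(self, distance_threshold=1.0):
        self.distance_threshold = distance_threshold
        self.centroids = []  # running mean feature vector per cluster
        self.counts = []     # number of gestures assigned to each cluster

    def add_gesture(self, features):
        """Assign one gesture feature vector to an existing or new cluster.

        Returns the index of the cluster the gesture was assigned to.
        """
        features = np.asarray(features, dtype=float)
        if self.centroids:
            distances = [np.linalg.norm(features - c) for c in self.centroids]
            best = int(np.argmin(distances))
            if distances[best] <= self.distance_threshold:
                # Update the matched centroid with a running mean.
                self.counts[best] += 1
                self.centroids[best] += (features - self.centroids[best]) / self.counts[best]
                return best
        # No sufficiently close cluster: start a new one for this novel gesture.
        self.centroids.append(features.copy())
        self.counts.append(1)
        return len(self.centroids) - 1


if __name__ == "__main__":
    clusterer = OnlineGestureClusterer(distance_threshold=0.5)
    # Toy 2-D "gesture features"; a real system would use richer descriptors
    # extracted from skeletal tracking data.
    for gesture in [[0.1, 0.1], [0.15, 0.05], [0.9, 0.9], [0.12, 0.08]]:
        print(clusterer.add_gesture(gesture))
```

Running the toy example assigns the first, second, and fourth gestures to cluster 0 and opens cluster 1 for the third, showing how novel movements can be accommodated incrementally rather than forced into predefined classes.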
