An automatic system for humanoid dance creation

Abstract The paper describes a novel approach that allows a robot to dance by following the musical rhythm. The proposed system generates a dance for a humanoid robot by combining basic movements synchronized with the music. The system is made up of three parts: the extraction of features from the audio file, the estimation of movements through Hidden Markov Models and, finally, the generation of the dance. Starting from a set of given movements, the robot chooses a sequence of movements with a suitable Hidden Markov Model and synchronizes them by processing the musical input. The proposed approach has the advantage that the movement execution probabilities can be changed according to the evaluation of the dance execution, in order to obtain an artificial creative system. In the same way, a choreographer can give more importance to some movements and/or exclude others, using the system as a co-creation tool. The approach has been tested on an Aldebaran NAO humanoid using different genres of music, and the experiments were conducted in the presence of real human dancers to obtain feedback on the quality of the robot's execution. Three professional judges expressed their evaluations on the following points: the appropriateness of the movements for a given musical genre; the precision in tracking the rhythm; the aesthetic impact of the whole sequence of movements; and the overall judgment of the robot's performance. All the evaluations are very satisfactory, and confirm that the robot dance is realistic and aesthetically acceptable. The robustness and flexibility of the system will allow us to embed it in an artificial creative system in future work. In the discussion we introduce some issues related to pursuing this aim, using a previously proposed cognitive architecture based on needs and motivations.
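To illustrate the kind of pipeline the abstract outlines, the following is a minimal sketch, not the authors' implementation: it assumes beat times have already been extracted from the audio with an analysis tool (e.g. Essentia), and it uses a hypothetical movement repertoire and a simple Markov transition matrix whose probabilities could be re-weighted after a performance evaluation or biased by a choreographer.

```python
import numpy as np

# Hypothetical movement repertoire: each movement has a duration in beats.
MOVEMENTS = ["arm_wave", "side_step", "head_nod", "torso_twist"]
DURATION_BEATS = {"arm_wave": 2, "side_step": 1, "head_nod": 1, "torso_twist": 2}

# Transition probabilities of a simple Markov model over movements (rows sum to 1).
# In a creative or co-creation setting these weights would be updated from
# evaluations of past dances or from a choreographer's preferences.
TRANSITIONS = np.array([
    [0.10, 0.40, 0.30, 0.20],
    [0.35, 0.10, 0.35, 0.20],
    [0.30, 0.30, 0.10, 0.30],
    [0.25, 0.35, 0.30, 0.10],
])

def generate_dance(beat_times, rng=np.random.default_rng(0)):
    """Map a sequence of beat timestamps (seconds) to timed movements."""
    schedule = []
    state = int(rng.integers(len(MOVEMENTS)))     # random starting movement
    beat_idx = 0
    while beat_idx < len(beat_times):
        name = MOVEMENTS[state]
        schedule.append((beat_times[beat_idx], name))
        beat_idx += DURATION_BEATS[name]          # a movement spans one or more beats
        state = int(rng.choice(len(MOVEMENTS), p=TRANSITIONS[state]))
    return schedule

# Usage: beats assumed to come from an audio analysis step (here, 120 BPM).
beats = np.arange(0.0, 10.0, 0.5)
for t, move in generate_dance(beats):
    print(f"{t:5.2f}s  {move}")
```

Each scheduled movement would then be sent to the robot's motion interface so that its onset coincides with the corresponding beat; the sketch only shows how a probabilistic model can turn a beat grid into a movement sequence.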
