Recognizing emotion from dance movement: comparison of spectator recognition and automated techniques

This paper presents our recent work on the analysis and classification of expressive gesture in human full-body movement, and in dance performances in particular. We describe an experiment carried out jointly at the DIST-InfoMus Lab, University of Genova, Italy, and at the Department of Psychology of the University of Uppsala, Sweden, in the framework of the EU-IST project MEGA (Multisensory Expressive Gesture Applications, www.megaproject.org). The experiment aims at (i) identifying which motion cues are most involved in conveying the dancer's expressive intentions to the audience during a dance performance, (ii) measuring and analyzing these cues in order to classify dance gestures in terms of basic emotions, and (iii) testing a collection of models and algorithms developed for the analysis of such expressive content by comparing their performance with spectators' ratings of the same dance fragments. The paper discusses the experiment in detail with reference to the related conceptual issues, the developed techniques, and the obtained results.
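To make aims (i) and (ii) concrete, the sketch below shows one simple way measured motion cues could be mapped to basic-emotion labels: a nearest-centroid rule over a low-dimensional cue vector. It is an illustrative assumption, not the paper's actual technique; the cue names (quantity of motion, contraction index) and all centroid values are hypothetical placeholders.

```python
# Illustrative sketch (NOT the paper's pipeline): classify a dance fragment
# into a basic emotion from hand-crafted motion cues with a nearest-centroid
# rule. Cue names and centroid values below are hypothetical.
from math import dist

# Hypothetical per-emotion centroids in a 2-D cue space:
# (quantity of motion, contraction index), each normalized to [0, 1].
CENTROIDS = {
    "anger": (0.9, 0.3),
    "joy":   (0.8, 0.2),
    "grief": (0.2, 0.8),
    "fear":  (0.4, 0.7),
}

def classify(cues):
    """Return the emotion whose centroid is closest to the cue vector."""
    return min(CENTROIDS, key=lambda emotion: dist(cues, CENTROIDS[emotion]))

# Example: a slow, contracted fragment falls nearest the "grief" centroid.
print(classify((0.15, 0.85)))
```

In a real system the centroids (or a richer classifier) would be fitted to cue measurements extracted from recorded performances and validated against the spectators' ratings, as in aim (iii).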