Exploring Visualizations in Real-time Motion Capture for Dance Education

In this paper, we describe ongoing work toward developing a whole-body interaction interface for exploring different visualizations of movement, using real-time motion capture and 3D models, to be applied in dance learning and improvisation within a creative, gamified context. The performer wears a full inertial motion capture system, while a simple user interface lets the user experiment with different avatars and visualizations, e.g., motion traces from different parts of the body, and interact with virtual objects. The 3D simulation provides real-time visual feedback on the movement. The interaction follows the paradigm of moving from mimicking kinetic material toward a self-reflective teaching approach. The interactive avatar is a reflection of the performer, but at the same time it depicts a character, a dance partner that can inspire the user to explore different ways of moving. Whether in the framework of artistic experimentation and creativity or in the context of education, visual metaphors of movement shape and quality constitute a powerful tool and raise many scientific and research questions.
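
As a concrete illustration of one such visualization, the sketch below shows how a motion-trace overlay for a selected joint could be buffered on the application side: recent mocap samples are kept in a fixed-length window and faded with age before being handed to the renderer. This is a minimal, hypothetical sketch and not the paper's implementation; the names (`MotionTrace`, `JointSample`, `max_samples`) are assumptions, as is the premise of a per-frame stream of joint positions arriving from the inertial capture suit.

```python
from collections import deque
from dataclasses import dataclass


@dataclass
class JointSample:
    """One motion-capture sample for a single joint (position in metres)."""
    x: float
    y: float
    z: float


class MotionTrace:
    """Sliding window of recent joint positions, drawn as a fading trail.

    `max_samples` bounds the trail length; older samples drop off automatically.
    """

    def __init__(self, max_samples: int = 120):
        self._samples = deque(maxlen=max_samples)

    def push(self, sample: JointSample) -> None:
        """Append the latest mocap sample (call once per rendered frame)."""
        self._samples.append(sample)

    def points_with_alpha(self):
        """Return (sample, alpha) pairs, fading older points toward transparency."""
        n = len(self._samples)
        if n == 0:
            return []
        return [(s, (i + 1) / n) for i, s in enumerate(self._samples)]


if __name__ == "__main__":
    # Simulate a short stream of wrist positions and print the faded trail.
    trace = MotionTrace(max_samples=5)
    for frame in range(8):
        trace.push(JointSample(x=0.1 * frame, y=1.0, z=0.0))
    for sample, alpha in trace.points_with_alpha():
        print(f"({sample.x:.1f}, {sample.y:.1f}, {sample.z:.1f}) alpha={alpha:.2f}")
```

In an actual system the alpha values would drive the transparency of a line strip or particle trail in the 3D engine; the fixed-length buffer keeps the trail bounded regardless of the capture frame rate.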
