Low-intrusive recognition of expressive movement qualities

In this paper we present a low-intrusive approach to the detection of expressive full-body movement qualities. We focus on two qualities, Lightness and Fragility, and detect them from data captured by four wearable devices placed on the forearms: two inertial measurement units (IMUs) and two electromyographs (EMGs). The work presented in the paper stems from a close collaboration with expressive movement experts (e.g., contemporary dance choreographers) to define a vocabulary of basic movement qualities. We recorded 13 dancers performing movements expressing the qualities under investigation. The recordings were then segmented, and the perceived level of each quality in each segment was rated by 5 experts on a 5-point Likert scale, yielding a dataset of 150 movement segments expressing Fragility and/or Lightness. In the second part of the paper, we define a set of features on the IMU and EMG data and extract them from the recorded corpus. Finally, we apply a set of supervised machine learning techniques to classify the segments. The best results on the whole dataset were obtained with a Naive Bayes classifier for Lightness (F-score 0.77) and with a Support Vector Machine classifier for Fragility (F-score 0.77). Our approach can be used in ecological contexts, e.g., during artistic performances.
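
A minimal sketch of the classification step described above, assuming a scikit-learn setup; the feature matrix and labels here are synthetic placeholders (the paper's actual IMU/EMG features and the 150 annotated segments are not reproduced), so this illustrates the Naive Bayes vs. SVM pairing rather than the authors' implementation:

```python
# Hypothetical sketch: classify movement-quality segments from pre-extracted
# IMU/EMG features with a Naive Bayes (Lightness) and an SVM (Fragility)
# classifier, evaluated by F-score. Data below is random placeholder data.
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(150, 12))           # 150 segments x 12 hypothetical IMU/EMG features
y_lightness = rng.integers(0, 2, 150)    # placeholder binary labels for Lightness
y_fragility = rng.integers(0, 2, 150)    # placeholder binary labels for Fragility

# Pairings reported as best-performing in the abstract
nb_pipeline = make_pipeline(StandardScaler(), GaussianNB())
svm_pipeline = make_pipeline(StandardScaler(), SVC(kernel="rbf"))

print("Lightness / Naive Bayes F1:",
      cross_val_score(nb_pipeline, X, y_lightness, cv=5, scoring="f1").mean())
print("Fragility / SVM F1:",
      cross_val_score(svm_pipeline, X, y_fragility, cv=5, scoring="f1").mean())
```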
