Keep on dancing: Effects of expressive motion mimicry

Expressive motion refers to movements that convey an agent's attitude toward its task or environment. People frequently use expressive motion to indicate internal states such as emotion, confidence, and engagement. Robots can also exhibit expressive motion, and studies have shown that people can legibly interpret it. Mimicry, the imitation of others' behaviors, has been shown to increase rapport between people. The research question addressed in this study is how a robot's mimicry of children's expressive motion affects the children's interaction with a dancing robot. The paper presents our approach to generating and characterizing expressive motion, based on the Laban Effort System, and the results of a study that provides both significant and suggestive evidence that such mimicry has positive effects on the children's behavior.
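To make the idea of generating Laban-Effort-based expressive motion concrete, here is a minimal Python sketch of one possible mapping for a low degree-of-freedom dancing robot. It is not the authors' implementation: the function name, the four Effort parameters on a [-1, 1] scale, and the specific mappings (Time to tempo, Weight to amplitude, Space to side-to-side wander, Flow to sharpness of the beat) are illustrative assumptions.

```python
# Hypothetical sketch: modulating a simple bounce-and-pan dance motion with
# Laban Effort parameters. The mappings below are assumptions for illustration,
# not the paper's model.
import math


def effort_modulated_bounce(t, time_effort=0.0, weight_effort=0.0,
                            space_effort=0.0, flow_effort=0.0,
                            base_freq_hz=1.0, base_amp=1.0):
    """Return (pan, bounce) targets at time t (seconds).

    Each Effort parameter lies in [-1, 1]:
      time_effort:   sustained (-1) to sudden (+1)  -> tempo scaling
      weight_effort: light (-1) to strong (+1)      -> amplitude scaling
      space_effort:  indirect (-1) to direct (+1)   -> amount of wandering pan
      flow_effort:   free (-1) to bound (+1)        -> sharpness of each beat
    """
    freq = base_freq_hz * (1.0 + 0.5 * time_effort)   # sudden -> faster tempo
    amp = base_amp * (1.0 + 0.5 * weight_effort)      # strong -> larger bounce
    phase = 2.0 * math.pi * freq * t

    # Bound flow sharpens the waveform toward a clipped, staccato bounce.
    sharpness = 1.0 + 2.0 * max(0.0, flow_effort)
    s = math.sin(phase)
    bounce = amp * math.copysign(abs(s) ** (1.0 / sharpness), s)

    # Indirect space adds a slow wandering pan; direct space stays centered.
    wander = (1.0 - space_effort) * 0.5
    pan = wander * math.sin(0.3 * phase)

    return pan, bounce


if __name__ == "__main__":
    # Sample a "sudden, strong" bounce over one second.
    for step in range(5):
        t = step * 0.25
        print(t, effort_modulated_bounce(t, time_effort=0.8, weight_effort=0.6))
```

In a mimicry condition, the same Effort parameters could in principle be estimated from the child's movement and fed back into a generator like this one, so that the robot's dance reflects the child's own motion qualities.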
