Using Socially Expressive Mixed Reality Arms for Enhancing Low-Expressivity Robots

Expressivity, the use of multiple modalities to convey a robot's internal state and intent, is critical for interaction. Yet, due to cost, safety, and other constraints, many robots lack a high degree of physical expressivity. This paper explores using mixed reality to enhance a low-expressivity robot by adding virtual arms that extend its expressiveness. The arms, capable of a range of gestures unconstrained by physical hardware, were evaluated in a between-subjects study (n = 34) in which participants completed a mixed reality mathematics task with a socially assistive robot. The results indicate that the virtual arms increased the robot's perceived emotion, helpfulness, and physical presence. Participants who reported higher perceived physical presence also rated the robot higher in social presence, ease of use, and usefulness, and expressed a more positive attitude toward using the robot with mixed reality. The results also show that participants could distinguish the valence and intent of the virtual gestures.
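To make the idea of gestures unconstrained by physical hardware concrete, the sketch below shows one way virtual arm gestures could be represented as valence-tagged joint-angle keyframes and interpolated over time for a mixed reality renderer. This is a minimal illustration, not the authors' implementation; all names (Gesture, Keyframe, interpolate, the "cheer" gesture) are hypothetical.

```python
# Illustrative sketch: valence-tagged virtual arm gestures as joint-angle
# keyframes, linearly interpolated for streaming to an MR renderer.
# Hypothetical representation; not the system described in the paper.

from dataclasses import dataclass
from typing import List, Tuple


@dataclass
class Keyframe:
    time: float                      # seconds from gesture start
    joint_angles: Tuple[float, ...]  # e.g., (shoulder, elbow, wrist) in radians


@dataclass
class Gesture:
    name: str
    valence: str                     # "positive", "negative", or "neutral"
    keyframes: List[Keyframe]


def interpolate(gesture: Gesture, t: float) -> Tuple[float, ...]:
    """Linearly interpolate joint angles at time t, clamped to the gesture span."""
    frames = gesture.keyframes
    if t <= frames[0].time:
        return frames[0].joint_angles
    if t >= frames[-1].time:
        return frames[-1].joint_angles
    for a, b in zip(frames, frames[1:]):
        if a.time <= t <= b.time:
            alpha = (t - a.time) / (b.time - a.time)
            return tuple(x + alpha * (y - x)
                         for x, y in zip(a.joint_angles, b.joint_angles))
    return frames[-1].joint_angles


# Example: a celebratory "cheer" gesture tagged with positive valence.
cheer = Gesture(
    name="cheer",
    valence="positive",
    keyframes=[
        Keyframe(0.0, (0.0, 0.0, 0.0)),
        Keyframe(0.5, (1.2, 0.6, 0.1)),
        Keyframe(1.0, (1.6, 0.2, 0.0)),
    ],
)

if __name__ == "__main__":
    # Sample poses at 10 Hz; a real system would stream these to the MR headset.
    for step in range(11):
        t = step / 10.0
        print(f"t={t:.1f}s pose={interpolate(cheer, t)}")
```

Because the arms exist only in mixed reality, such keyframes need not respect torque limits, collision constraints, or joint ranges of a physical arm, which is what allows the wider gesture repertoire the study evaluates.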
