"Hmm, Did You Hear What I Just Said?": Development of a Re-Engagement System for Socially Interactive Robots

Maintaining engagement is challenging even in human–human interaction. When disengagement occurs, people adapt their behavior in the expectation that engagement will be regained. In human–robot interaction, although socially interactive robots can be engaging, users easily disengage while interacting with them. This paper proposes a multi-layer re-engagement system that applies different strategies, expressed through human-like verbal and non-verbal behaviors, to regain user engagement, taking the user's attention level and affective state into account. We conducted a usability test in a robot storytelling scenario to demonstrate the technical operation of the system and to investigate how people react to a robot with re-engagement ability. The results indicate that the system has the potential to maintain user engagement. In response to open-ended questions, participants commented positively on the robot with this ability, and they rated it higher on several dimensions, namely animacy, likeability, and perceived intelligence.
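The abstract does not give implementation details of the multi-layer system; purely as an illustration, the minimal Python sketch below shows how a layered policy could map an estimated attention level and affective valence to escalating re-engagement strategies. All names and thresholds (UserState, Strategy, select_strategy, the 0.7/0.4 cut-offs) are assumptions for this sketch, not the authors' actual design.

```python
# Minimal sketch of a layered re-engagement policy (illustrative only).
# Assumes the perception layer provides an attention score in [0, 1]
# and a coarse affective valence in [-1, 1].

from dataclasses import dataclass
from enum import Enum, auto


class Strategy(Enum):
    CONTINUE = auto()        # user engaged: keep telling the story
    NONVERBAL_CUE = auto()   # mild drop: gaze shift or gesture toward the user
    VERBAL_PROMPT = auto()   # clear drop: e.g., "Hmm, did you hear what I just said?"
    ADAPT_CONTENT = auto()   # sustained drop with negative affect: change pace or topic


@dataclass
class UserState:
    attention: float  # 0.0 (ignoring robot) .. 1.0 (fully attentive)
    valence: float    # -1.0 (negative affect) .. 1.0 (positive affect)


def select_strategy(state: UserState) -> Strategy:
    """Pick a re-engagement strategy, escalating as attention drops."""
    if state.attention >= 0.7:
        return Strategy.CONTINUE
    if state.attention >= 0.4:
        return Strategy.NONVERBAL_CUE
    # Low attention: escalate. Negative affect suggests adapting the content
    # itself rather than only prompting the user verbally.
    if state.valence < 0.0:
        return Strategy.ADAPT_CONTENT
    return Strategy.VERBAL_PROMPT


if __name__ == "__main__":
    for attn, val in [(0.9, 0.2), (0.5, 0.1), (0.2, 0.3), (0.1, -0.5)]:
        print(f"attention={attn}, valence={val} -> {select_strategy(UserState(attn, val)).name}")
```

In a full system the thresholds would be replaced by classifiers over gaze, posture, and speech cues, but the layered escalation structure conveys the idea of strategy selection conditioned on attention and affect.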
