Prototyping realistic long-term human-robot interaction for the study of agent migration

This paper examines participants' experiences of interacting with a robotic companion (agent) that can move its “mind” between different robotic embodiments, a process called agent migration, in order to take advantage of the features and functionalities associated with each embodiment. In particular, we focus on identifying factors that help the companion retain its identity across different embodiments, including the clarity of the migration behaviour and how this behaviour may contribute to identity retention. Nine participants took part in a long-term study, interacting with the robotic companion in a smart house twice weekly over a period of five weeks. We used the Narrative-based Integrated Episodic Scenario (NIES) framework to design long-term interaction scenarios that provided habituation and intervention phases while conveying the impression of continuous long-term interaction. The results show that NIES allows us to explore complex intervention scenarios and to maintain a sense of continuity of context across the long-term study. The results also suggest that as participants became habituated to the companion, they found the realisation of the migration signalling clearer and felt more certain of the identity of the companion in later sessions, and that the most important factor for this was the agent's continuation of tasks across embodiments. This paper is both empirical and methodological in nature.
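To make the migration concept concrete, below is a minimal, hypothetical Python sketch, not taken from the paper: the class names, the signalling cues, and the task handover are illustrative assumptions about how an agent's “mind” might be transferred between two embodiments while signalling the handover to the user and continuing its current task.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class AgentState:
    """Portable 'mind' of the companion: the state that defines its identity."""
    name: str
    memories: list = field(default_factory=list)      # shared interaction history
    current_task: Optional[str] = None                # task to resume after migration

class Embodiment:
    """A robot body the agent can inhabit (e.g. a mobile robot or a tabletop robot)."""
    def __init__(self, label: str):
        self.label = label
        self.state: Optional[AgentState] = None

    def signal_departure(self):
        # Cue (speech, lights, sound) telling the user the agent is leaving this body.
        print(f"[{self.label}] 'I am moving to another body now.'")

    def signal_arrival(self, state: AgentState):
        # Cue confirming the same agent has arrived and is resuming its previous task.
        task = state.current_task or "what we were doing"
        print(f"[{self.label}] 'Hello again, it is {state.name}. Let us carry on with {task}.'")

def migrate(source: Embodiment, target: Embodiment):
    """Move the agent's state from one embodiment to another, with explicit signalling."""
    assert source.state is not None, "source embodiment is not hosting the agent"
    source.signal_departure()
    state, source.state = source.state, None          # only one active embodiment at a time
    target.state = state
    target.signal_arrival(state)

# Illustrative run: the companion starts on a mobile robot, then migrates to a tabletop robot.
mobile = Embodiment("mobile robot")
tabletop = Embodiment("tabletop robot")
mobile.state = AgentState(name="the companion", current_task="the shopping list")
migrate(mobile, tabletop)
```

In this sketch, identity retention corresponds to carrying the same state object (memories, name, current task) into the new body, and the signalling cues mirror the kind of migration behaviour whose clarity the study evaluates.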
