A proxemic-based HRI testbed

This paper describes a novel, low-cost HRI testbed for evaluating robot movement, gaze, audio style, and media content as a function of proximity. Numerous human-robot interaction studies have established the importance of proxemics in establishing trust and social consonance, but each has used a robot capable of expressing only some of these components, for example gaze but not audio style. The Survivor Buddy proxemics testbed is expected to serve as a blueprint for duplication or to inspire the creation of other robots, enabling researchers to rapidly develop and test new schemes of proxemic-based control. It is a small, four-degree-of-freedom, multi-media "head" costing approximately $2,000 USD to build, and it can be mounted on other robots or used independently. To support proxemics, Survivor Buddy can be coupled with a dedicated range sensor, or distance can be extracted from its embedded camera using computer vision. The paper presents a sample demonstration of proxemic competence for Survivor Buddy mounted on a search and rescue robot, following the victim management scenario developed by Bethel and Murphy.
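The vision-based distance extraction mentioned above can be sketched with a standard pinhole-camera model: the apparent pixel width of a detected face shrinks in proportion to distance, and the resulting estimate can be binned into Hall's proxemic zones. This is a minimal illustrative sketch, not the testbed's actual implementation; the focal length, assumed face width, and zone thresholds (from Hall's The Hidden Dimension) are assumptions.

```python
# Hedged sketch: estimate human distance from a detected face's pixel width
# via the pinhole-camera model, then map it to Hall's proxemic zones.
# FOCAL_LENGTH_PX and FACE_WIDTH_M are illustrative assumptions, not
# calibrated values from the Survivor Buddy hardware.

FOCAL_LENGTH_PX = 600.0   # assumed camera focal length, in pixels
FACE_WIDTH_M = 0.16       # assumed average human face width, in meters


def estimate_distance(face_width_px: float) -> float:
    """Pinhole model: distance = focal_length * real_width / pixel_width."""
    return FOCAL_LENGTH_PX * FACE_WIDTH_M / face_width_px


def proxemic_zone(distance_m: float) -> str:
    """Classify a distance into Hall's proxemic zones."""
    if distance_m < 0.45:
        return "intimate"
    if distance_m < 1.2:
        return "personal"
    if distance_m < 3.6:
        return "social"
    return "public"
```

A proxemic controller could poll `proxemic_zone(estimate_distance(w))` each frame and switch gaze, audio style, and media behaviors when the zone changes; a dedicated range sensor would simply replace `estimate_distance`.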

[1] Hiroshi Ishiguro et al., "'Could I have a word?': Effects of robot's whisper," 2010 IEEE/RSJ International Conference on Intelligent Robots and Systems, 2010.

[2] Hiroshi G. Okuno et al., "Dynamic communication of humanoid robot with multiple people based on interaction distance," 2004.

[3] Veikko Surakka et al., "Affective effects of agent proximity in conversational systems," NordiCHI '04, 2004.

[4] Robin R. Murphy et al., "A survey of social gaze," 2011 6th ACM/IEEE International Conference on Human-Robot Interaction (HRI), 2011.

[5] Cynthia Breazeal et al., "MeBot: A robotic platform for socially embodied telepresence," 2010 5th ACM/IEEE International Conference on Human-Robot Interaction (HRI), 2010.

[6] Leila Takayama et al., "Influences on proxemic behaviors in human-robot interaction," 2009 IEEE/RSJ International Conference on Intelligent Robots and Systems, 2009.

[7] Chrystopher L. Nehaniv et al., "An empirical framework for human-robot proxemics," 2009.

[8] Anders Green et al., "Investigating Spatial Relationships in Human-Robot Interaction," 2006 IEEE/RSJ International Conference on Intelligent Robots and Systems, 2006.

[9] Robin R. Murphy et al., "Non-facial/non-verbal methods of affective expression as applied to robot-assisted victim assessment," 2007 2nd ACM/IEEE International Conference on Human-Robot Interaction (HRI), 2007.

[10] T. V. Oosterhout et al., "A visual method for robot proxemics measurements," 2008.

[11] Takayuki Kanda et al., "Robot behavior adaptation for human-robot interaction based on policy gradient reinforcement learning," 2005 IEEE/RSJ International Conference on Intelligent Robots and Systems, 2005.

[12] Kerstin Dautenhahn et al., "Robotic etiquette: Results from user studies involving a fetch and carry task," 2007 2nd ACM/IEEE International Conference on Human-Robot Interaction (HRI), 2007.

[13] Takayuki Kanda et al., "Adapting Robot Behavior for Human-Robot Interaction," IEEE Transactions on Robotics, 2008.

[14] Kerstin Dautenhahn et al., "A personalized robot companion? The role of individual differences on spatial preferences in HRI scenarios," RO-MAN 2007 - The 16th IEEE International Symposium on Robot and Human Interactive Communication, 2007.

[15] Young-Min Kim et al., "A fuzzy intimacy space model to develop human-robot affective relationship," 2010 World Automation Congress, 2010.

[16] E. Hall, The Hidden Dimension, 1970.

[17] Ehud Sharlin et al., "Exploring emotive actuation and its role in human-robot interaction," HRI 2010, 2010.

[18] Bilge Mutlu et al., "Human-robot proxemics: Physical and psychological distancing in human-robot interaction," 2011 6th ACM/IEEE International Conference on Human-Robot Interaction (HRI), 2011.

[19] Clifford Nass et al., The Media Equation: How People Treat Computers, Television, and New Media Like Real People and Places, 1996.

[20] Robin R. Murphy et al., "Non-facial and non-verbal affective expression for appearance-constrained robots used in victim management," Paladyn Journal of Behavioral Robotics, 2010.

[21] David Lee et al., "Exploratory studies on social spaces between humans and a mechanical-looking robot," Connection Science, 2006.

[22] Robin R. Murphy, Introduction to AI Robotics, 2000.

[23] Robin R. Murphy et al., "A multi-disciplinary design process for affective robots: Case study of Survivor Buddy 2.0," 2011 IEEE International Conference on Robotics and Automation, 2011.

[24] Fakhri Karray et al., "A testbed platform for assessing human-robot verbal interaction," 2010 International Conference on Autonomous and Intelligent Systems (AIS), 2010.

[25] I. René J. A. te Boekhorst et al., "Human approach distances to a mechanical-looking robot with different robot voice styles," RO-MAN 2008 - The 17th IEEE International Symposium on Robot and Human Interactive Communication, 2008.

[26] Hiroshi Mizoguchi et al., "Realization of Expressive Mobile Robot," Proceedings of the International Conference on Robotics and Automation, 1997.

[27] Kerstin Dautenhahn et al., "An Autonomous Proxemic System for a Mobile Companion Robot," 2010.

[28] C. Bartneck et al., "Perception of affect elicited by robot motion," HRI 2010, 2010.