Probabilistic Models of Proxemics for Spatially Situated Communication in HRI

To enable socially situated human-robot interaction, a robot must both understand and control proxemics (the social use of space) in order to employ communication mechanisms analogous to those used by humans. In this work, we focus on social speech and gesture production and recognition during both human-human and human-robot interactions. We conducted a data collection study to model these factors as functions of the relative distance and orientation between sociable agents. The resulting models were used to implement a spatially situated autonomous proxemic robot controller. The controller utilizes a sampling-based approach, wherein each sample represents interagent distance and orientation, as well as estimates of agent speech and gesture production and recognition; a particle filter (with resampling) uses these estimates to maximize the performance of both the robot and the human during the interaction. This functional approach yields pose, speech, and gesture estimates consistent with related work on human-human and human-robot interactions. This work contributes to the understanding of the underlying pre-cultural processes that govern proxemic behavior, and has implications for the development of robust proxemic controllers for robots situated in complex interactions (e.g., with more than two agents, or with individuals with hearing or visual impairments) and environments (e.g., with loud noises, low light, or visual occlusions).
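The sampling-based control loop described above can be sketched as a standard particle filter over candidate interaction poses. The following is a minimal illustrative sketch, not the authors' implementation: the `performance` function is a hypothetical stand-in for the paper's learned speech and gesture production/recognition models, and its preferred distance (1.5 m) and face-to-face orientation are assumptions chosen only for the example.

```python
import math
import random

def performance(distance, orientation):
    """Hypothetical interaction-performance estimate: peaks near an
    assumed 'social' distance of 1.5 m and a face-to-face (zero)
    relative orientation. A stand-in for learned proxemic models."""
    return (math.exp(-((distance - 1.5) ** 2) / 0.5)
            * math.exp(-(orientation ** 2) / 0.5))

def proxemic_particle_filter(n=500, steps=20):
    """Estimate a goal pose (distance, orientation) that maximizes
    the performance model, via importance resampling."""
    # Each particle is a candidate (distance in meters, orientation in radians).
    particles = [(random.uniform(0.5, 4.0), random.uniform(-math.pi, math.pi))
                 for _ in range(n)]
    for _ in range(steps):
        # Weight each particle by its estimated interaction performance.
        weights = [performance(d, o) for d, o in particles]
        total = sum(weights)
        weights = [w / total for w in weights]
        # Resample proportionally to weight.
        particles = random.choices(particles, weights=weights, k=n)
        # Diffuse particles slightly to keep exploring nearby poses.
        particles = [(d + random.gauss(0, 0.05), o + random.gauss(0, 0.05))
                     for d, o in particles]
    # Return the mean pose as the controller's goal.
    mean_d = sum(d for d, _ in particles) / n
    mean_o = sum(o for _, o in particles) / n
    return mean_d, mean_o
```

Under this toy model, the particle set concentrates around the assumed optimum, so the returned goal pose approaches 1.5 m at roughly zero relative orientation; in the actual controller, the weighting would instead come from the empirically derived speech and gesture models.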
