Infant-like Social Interactions between a Robot and a Human Caregiver

From birth, human infants are immersed in a social environment that allows them to learn by leveraging the skills and capabilities of their caregivers. A critical precursor to this type of social learning is the ability to maintain interaction levels that are neither overwhelming nor under-stimulating. In this paper, we present a mechanism for an autonomous robot to regulate the intensity of its social interactions with a human. Similar to the feedback from infant to caregiver, the robot uses expressive displays to modulate the interaction intensity. This mechanism is integrated within a general framework that combines perception, attention, drives, emotions, behavior selection, and motor acts. We present a specific implementation of this architecture that enables the robot to react appropriately to both social stimuli (faces) and non-social stimuli (moving toys) while maintaining a suitable interaction intensity. We present results from both face-to-face interactions and interactions mediated through a toy. Note: This paper was submitted in June, 1998.
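The regulation mechanism described above can be illustrated with a minimal sketch. This is not the paper's implementation; all names (`SocialDrive`, `step`, the display labels) and the specific numeric thresholds are illustrative assumptions. The idea sketched is a drive with a homeostatic regime: incoming stimulation pushes the drive level up, its absence lets it drift down, and when the level leaves the regime the robot emits an expressive display that cues the human to adjust the interaction intensity.

```python
# Hypothetical sketch of a homeostatic interaction-intensity regulator.
# Names and thresholds are illustrative, not taken from the paper.

class SocialDrive:
    """A drive that drifts toward under-stimulation and is satiated by stimuli."""

    def __init__(self, low=-1.0, high=1.0, drift=-0.1):
        self.low, self.high, self.drift = low, high, drift
        self.level = 0.0  # the homeostatic regime is [low, high]

    def step(self, stimulus_intensity):
        # Stimulation raises the drive level; without it, the level decays.
        self.level += self.drift + stimulus_intensity
        return self.expressive_display()

    def expressive_display(self):
        # Feedback to the caregiver, analogous to an infant's cues.
        if self.level > self.high:
            return "withdraw"  # overwhelmed: signal the human to ease off
        if self.level < self.low:
            return "solicit"   # under-stimulated: call for attention
        return "engage"        # intensity is within the homeostatic regime
```

For example, a sustained intense stimulus (`step(2.0)`) immediately drives the level past the upper bound and yields `"withdraw"`, while repeated steps with no stimulus eventually yield `"solicit"`; either display invites the human to restore a suitable interaction intensity.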
