Cooperative gestures: effective signaling for humanoid robots

Cooperative gestures are a key aspect of human-human pro-social interaction, so it is reasonable to expect that endowing humanoid robots with the ability to use such gestures when interacting with humans would be useful. However, while people are accustomed to responding to such gestures from other humans, it is unclear how they react when a robot makes them. To explore this question, we conducted a within-subjects, video-based laboratory experiment measuring the time participants took to cooperate with a humanoid robot making interactional gestures. We manipulated gesture type (beckon, give, shake hands), gesture style (smooth, abrupt), and gesture orientation (front, side). We also employed two measures of individual differences: negative attitudes toward robots (NARS) and human gesture decoding ability (DANVA2-POS). Our results show that people cooperate more quickly with abrupt gestures than with smooth ones and with front-oriented gestures than with those made to the side; that people's speed at decoding robot gestures is correlated with their ability to decode human gestures; and that negative attitudes toward robots are strongly correlated with decreased ability to decode human gestures.
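The three manipulated factors form a 3 × 2 × 2 within-subjects factorial design, yielding twelve gesture conditions that each participant experiences. A minimal sketch of generating and counterbalancing such a condition set (the function names and randomization scheme are illustrative assumptions, not taken from the paper):

```python
from itertools import product
import random

# Factor levels as described in the abstract.
GESTURE_TYPES = ["beckon", "give", "shake hands"]
GESTURE_STYLES = ["smooth", "abrupt"]
GESTURE_ORIENTATIONS = ["front", "side"]


def trial_conditions():
    """Return all 3 x 2 x 2 = 12 type/style/orientation combinations."""
    return list(product(GESTURE_TYPES, GESTURE_STYLES, GESTURE_ORIENTATIONS))


def randomized_order(seed=None):
    """Shuffle the full condition list for one participant.

    In a within-subjects design every participant sees every condition;
    randomizing the presentation order per participant helps control for
    order effects (hypothetical scheme, for illustration only).
    """
    conditions = trial_conditions()
    random.Random(seed).shuffle(conditions)
    return conditions


if __name__ == "__main__":
    for condition in randomized_order(seed=42):
        print(condition)
```

For each presented condition, the dependent measure described in the abstract would be the participant's time to cooperate with the robot's gesture.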
