Anticipatory Perceptual Simulation for Human-Robot Joint Practice: Theory and Application Study

With the aim of fluency and efficiency in human-robot teams, we have developed a cognitive architecture based on the neuro-psychological principles of anticipation and perceptual simulation through top-down biasing. An instantiation of this architecture was implemented on a non-anthropomorphic robotic lamp performing in a human-robot collaborative task. In a human-subject study, in which the robot works on a joint task with untrained subjects, we find our approach to be significantly more efficient and fluent than a comparable system without anticipatory perceptual simulation. We also show that the robot and the human increasingly contribute at a similar rate. Through self-report, we find significant differences between the two conditions in the perceived team fluency, the team's improvement over time, and the robot's contribution to the team's efficiency and fluency. We also find differences in verbal attitudes towards the robot: most notably, subjects working with the anticipatory robot attribute more positive and more human qualities to the robot, but display increased self-blame and self-deprecation.
