Achieving fluency through perceptual-symbol practice in human-robot collaboration

We have developed a cognitive architecture for robotic teammates based on the neuropsychological principles of perceptual symbols and simulation, with the aim of increasing fluency in human-robot teams. An instantiation of this architecture was implemented on a robotic desk lamp performing a human-robot collaborative task. This paper describes initial results from a human-subject study measuring team efficiency and team fluency, in which the robot works on a joint task with untrained subjects. We find significant differences in a number of efficiency and fluency metrics when comparing our architecture to a purely reactive robot with similar capabilities.
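To make the comparison concrete, the following is a minimal conceptual sketch, in Python, of the difference between a purely reactive policy and an anticipatory one that acts on a simulated prediction of the human's next step. All task-step and function names here are hypothetical illustrations, not the paper's implementation.

```python
# Minimal conceptual sketch (all names hypothetical): a purely reactive policy
# responds only after observing the human's action, while an anticipatory
# policy simulates the likely next human action from a task model and acts
# pre-emptively. Illustrative only; not the architecture described in the paper.

TASK_SEQUENCE = ["fetch_part", "hold_part", "fasten", "inspect"]  # assumed joint-task steps


def reactive_policy(observed_human_action: str) -> str:
    """Choose a robot action only after the human's action has been observed."""
    responses = {
        "fetch_part": "illuminate_parts_bin",
        "hold_part": "illuminate_workpiece",
        "fasten": "illuminate_fastener",
        "inspect": "illuminate_inspection_area",
    }
    return responses.get(observed_human_action, "wait")


def anticipatory_policy(history: list[str]) -> str:
    """Simulate the likely next human action from the task model and pre-empt it."""
    if not history:
        predicted = TASK_SEQUENCE[0]
    else:
        last = history[-1]
        idx = TASK_SEQUENCE.index(last) if last in TASK_SEQUENCE else -1
        predicted = TASK_SEQUENCE[min(idx + 1, len(TASK_SEQUENCE) - 1)]
    # Act on the prediction before the human's next action is observed.
    return reactive_policy(predicted)


if __name__ == "__main__":
    history: list[str] = []
    for human_action in TASK_SEQUENCE:
        pre_emptive = anticipatory_policy(history)   # issued before observing the human
        reactive = reactive_policy(human_action)     # issued only after observing the human
        history.append(human_action)
        print(f"human: {human_action:12s} anticipatory: {pre_emptive:28s} reactive: {reactive}")
```

The point of the sketch is the timing difference: the anticipatory policy commits to an action before the human's step is observed, which is the kind of behavior the fluency metrics in the study are intended to capture.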
