Human-Robot Collaboration: From Psychology to Social Robotics

With the advances in robotic technology, research in human-robot collaboration (HRC) has gained importance. For robots to interact with humans autonomously, they need active decision making that takes their human partners into account. However, state-of-the-art research in HRC often assumes a leader-follower division, in which one agent leads the interaction. We believe this is caused by the lack of reliable representations of the human and the environment that would allow autonomous decision making. This problem can be overcome by an embodied approach to HRC that is inspired by psychological studies of human-human interaction (HHI). In this survey, we review neuroscientific and psychological findings on the sensorimotor patterns that govern HHI and place them in a robotics context. Additionally, we survey the advances the robotics community has made toward embodied HRC. We focus on the mechanisms required for active, physical human-robot collaboration. Finally, we discuss the similarities and differences between the two fields of study, which point toward directions for future research.
