Human-Robot Collaborative Navigation Search Using Social Reward Sources

This paper proposes a Social Reward Sources (SRS) design for a Human-Robot Collaborative Navigation (HRCN) task: human-robot collaborative search. The approach is flexible enough to handle the collaborative task, human-robot interaction, and environment constraints, all integrated into a common framework. We model task rewards based on the observability and isolation of unexplored areas and evaluate the model under different levels of human-robot communication. The model is validated through quantitative comparison against each agent's individual performance and qualitative surveying of participants' perceptions. The three proposed communication levels are then compared against each other using the same metrics.
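
The abstract describes combining task rewards (unexplored-area observability and isolation) with social reward sources over a shared environment. The sketch below is only an illustrative reading of that idea, not the paper's actual formulation: it builds a reward map over a 2D occupancy grid where each cell's reward sums an observability term, an isolation term, and a Gaussian social source placed at a human-suggested goal. All function names, weights, and kernel sizes are assumptions introduced for illustration.

```python
# Minimal SRS-style sketch, assuming a 2D occupancy grid where
# 0 = explored free space and 1 = unexplored space. All weights and
# kernel sizes are illustrative assumptions, not the paper's parameters.
import numpy as np


def observability(unexplored: np.ndarray, radius: int = 3) -> np.ndarray:
    """Fraction of unexplored cells within a square window around each cell."""
    h, w = unexplored.shape
    padded = np.pad(unexplored, radius, mode="constant")
    counts = np.zeros((h, w))
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            counts += padded[radius + dy:radius + dy + h,
                             radius + dx:radius + dx + w]
    return counts / (2 * radius + 1) ** 2


def isolation(shape: tuple[int, int],
              agents: list[tuple[int, int]]) -> np.ndarray:
    """Distance from each cell to the nearest agent (robot or human)."""
    ys, xs = np.mgrid[0:shape[0], 0:shape[1]]
    dists = [np.hypot(ys - ay, xs - ax) for ay, ax in agents]
    return np.min(dists, axis=0)


def social_source(shape: tuple[int, int], goal: tuple[int, int],
                  strength: float = 1.0, sigma: float = 4.0) -> np.ndarray:
    """Gaussian attractive reward source, e.g. a human-suggested search goal."""
    ys, xs = np.mgrid[0:shape[0], 0:shape[1]]
    return strength * np.exp(-((ys - goal[0]) ** 2 + (xs - goal[1]) ** 2)
                             / (2 * sigma ** 2))


def reward_map(unexplored, agents, human_goal,
               w_obs=1.0, w_iso=0.5, w_social=2.0):
    """Combine task and social reward sources into one navigation reward map."""
    return (w_obs * observability(unexplored)
            + w_iso * isolation(unexplored.shape, agents) * unexplored
            + w_social * social_source(unexplored.shape, human_goal))


if __name__ == "__main__":
    grid = np.ones((40, 40))      # everything unexplored at the start
    grid[:20, :20] = 0            # upper-left quadrant already searched
    robot, human = (5, 5), (10, 15)
    r = reward_map(grid, [robot, human], human_goal=(30, 30))
    print("highest-reward cell:", np.unravel_index(np.argmax(r), r.shape))
```

In this reading, the robot would repeatedly navigate toward high-reward cells, with the social weight controlling how strongly human input (e.g. a pointed or spoken goal) biases the search over purely task-driven exploration.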
