Transparency in Multi-Human Multi-Robot Interaction

Transparency is a key factor in improving the performance of human-robot interaction. A transparent interface allows humans to be aware of the state of a robot and to assess the progress of the tasks at hand. When multi-robot systems are involved, transparency is an even greater challenge, due to the larger number of variables affecting the behavior of the robots as a whole. Significant effort has been devoted to studying transparency when a single operator interacts with multiple robots. However, studies on transparency that focus on multiple human operators interacting with a multi-robot system are limited. This paper aims to fill this gap by presenting a human-swarm interaction interface with graphical elements that can be enabled and disabled. Through this interface, we study which graphical elements contribute to transparency by comparing four “transparency modes”: (i) no transparency (no operator receives information from the robots), (ii) central transparency (the operators receive only information relevant to their personal task), (iii) peripheral transparency (the operators share information on each other's tasks), and (iv) mixed transparency (both central and peripheral). We report the results, in terms of awareness, trust, and workload, of a user study involving 18 participants engaged in a complex multi-robot task.
