Usability evaluation with different viewpoints of a Human-Swarm interface for UAVs control in formation

A common way to organize a large number of robots, whether they move autonomously or are controlled by a human operator, is to have them move in formation. This principle takes inspiration from nature, where formations maximize the ability to monitor the environment and therefore to anticipate risks and find targets. In robotics, beyond these reasons, organizing a robot team into a formation allows a human operator to manage a large number of agents more simply, moving the swarm as a single entity. In this context, the type of visual feedback is fundamental for correct situational awareness, but in common practice an optimal camera configuration is not always available. Human operators usually rely on cameras mounted on board the multirotors, which provide an egocentric point of view, whereas it is known that in mobile robotics overall awareness and pattern recognition are optimized by exocentric views. In this article we present an analysis of the performance achieved by human operators controlling a swarm of UAVs in formation, accomplishing different tasks and using different points of view. The control architecture is implemented in a ROS framework and interfaced with a 3D simulation environment. Experimental tests show a degradation of performance when using egocentric on-board cameras with respect to an exocentric point of view, although the on-board cameras still allow simple tasks to be accomplished satisfactorily.
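As a minimal illustration of the "swarm as a single entity" idea described above, one common approach is to define each UAV's position as a fixed body-frame offset from a single virtual leader pose that the operator commands. The sketch below is not the authors' architecture; the function name, the wedge offsets, and the 2-D simplification are assumptions made for clarity.

```python
import math

def formation_targets(leader_xy, leader_yaw, offsets):
    """Rotate each body-frame offset by the leader's yaw and translate it
    by the leader's position, yielding one world-frame waypoint per UAV."""
    c, s = math.cos(leader_yaw), math.sin(leader_yaw)
    return [(leader_xy[0] + c * ox - s * oy,
             leader_xy[1] + s * ox + c * oy)
            for ox, oy in offsets]

# Hypothetical wedge formation: one UAV on the virtual leader,
# two trailing 2 m behind and 2 m to either side.
wedge = [(0.0, 0.0), (-2.0, 2.0), (-2.0, -2.0)]
targets = formation_targets((10.0, 5.0), math.pi / 2, wedge)
```

With this abstraction the operator only steers `leader_xy` and `leader_yaw`; the formation geometry is preserved automatically, which is precisely why a single human can handle many agents at once.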
