Scaling Effects for Synchronous vs. Asynchronous Video in Multi-robot Search

Camera-guided teleoperation has long been the preferred mode for controlling remote robots, with other modes such as asynchronous control used only when unavoidable. Because controlling multiple robots places additional demands on the operator, we hypothesized that removing the forced pace for reviewing camera video might reduce workload and improve performance. In an earlier experiment, participants operated four-robot teams performing a simulated urban search and rescue (USAR) task using either a conventional streaming-video-plus-map interface or an experimental interface without streaming video but with the ability to store panoramic images on the map for viewing at leisure. Search performance was somewhat better with the conventional interface; however, ancillary measures suggested that the asynchronous interface succeeded in reducing the temporal demands of switching between robots. This raised the possibility that the asynchronous interface might perform better with larger teams. In the present experiment we evaluate the usefulness of asynchronous video for teams of 4, 8, or 12 robots. As in our earlier study, we found a slight accuracy advantage for streaming video in marking victim locations, but overall performance was very similar.
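The core idea of the asynchronous interface can be illustrated with a minimal sketch. This is not code from the study; all names (`Panorama`, `AsyncPanoramaStore`, and their fields) are hypothetical, and the sketch only shows the assumed interaction pattern: robots deposit panoramic images tagged with map coordinates, and the operator reviews them whenever convenient rather than attending to each live video stream at a forced pace.

```python
from dataclasses import dataclass, field
from typing import List, Tuple


@dataclass
class Panorama:
    """A panoramic image a robot stored at a map location (hypothetical structure)."""
    robot_id: int
    location: Tuple[float, float]  # map coordinates where the panorama was captured
    image: bytes                   # encoded panoramic image data


@dataclass
class AsyncPanoramaStore:
    """Holds panoramas on a shared map for self-paced operator review,
    in contrast to forced-pace monitoring of per-robot streaming video."""
    _panoramas: List[Panorama] = field(default_factory=list)

    def capture(self, pano: Panorama) -> None:
        # A robot deposits a panorama at its current location; no operator
        # attention is required at capture time.
        self._panoramas.append(pano)

    def pending(self) -> int:
        # How many panoramas await review.
        return len(self._panoramas)

    def review_next(self) -> Panorama:
        # The operator pops the oldest unreviewed panorama at leisure.
        return self._panoramas.pop(0)


# Usage: two robots drop panoramas; the operator reviews one later.
store = AsyncPanoramaStore()
store.capture(Panorama(robot_id=1, location=(3.0, 4.5), image=b"..."))
store.capture(Panorama(robot_id=2, location=(7.2, 1.1), image=b"..."))
first = store.review_next()
```

The design point, under the paper's hypothesis, is that the store decouples capture from review, removing the temporal pressure of switching between live feeds as the number of robots grows.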
