Virtually Adapted Reality and Algorithm Visualization for Autonomous Robots

Autonomous mobile robots are often videotaped during operation, whether for later evaluation by their developers or for demonstration of the robots to others. Watching such videos is engaging and interesting. However, plain videos clearly do not show detailed information about the algorithms running on the moving robots, leading to a rather limited visual understanding of the underlying autonomy. Researchers have resorted to following the autonomous robots' algorithms through a variety of methods, most commonly graphical user interfaces running on offboard screens, separate from the captured videos. Such methods enable considerable debugging, but they remain of limited effectiveness, as there is an inevitable visual mismatch with the video capture. In this work, we aim to break this disconnect, and we contribute the ability to overlay onto the video visualizations that expose the robot's algorithms, in particular its route planning and execution. We further provide mechanisms to create and visualize virtual adaptations of the real environment, enabling exploration of the behavior of the algorithms in new situations. We demonstrate the complete implementation with an autonomous quadrotor navigating in a lab environment using the rapidly-exploring random tree (RRT) algorithm. We briefly motivate and discuss our follow-up visualization work for our complex small-size robot soccer team.
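The abstract names two concrete ingredients: route planning with a rapidly-exploring random tree (RRT) and overlaying the resulting plan onto video from a calibrated stationary camera. The Python sketch below illustrates both under assumptions that are not taken from the paper: a 2D ground-plane planner, a caller-supplied collision check is_free, and known camera intrinsics K and pose rvec/tvec in OpenCV's convention. It is a minimal illustration of the general techniques, not the authors' implementation.

```python
import numpy as np
import cv2

def rrt_plan(start, goal, is_free, bounds, step=0.2, iters=2000, goal_tol=0.3):
    """Grow a tree from start toward random samples; return a path to goal, or None."""
    rng = np.random.default_rng(0)
    start = np.asarray(start, dtype=float)
    goal = np.asarray(goal, dtype=float)
    nodes, parents = [start], [-1]
    for _ in range(iters):
        sample = rng.uniform(bounds[0], bounds[1])
        # Extend the nearest existing node a bounded step toward the sample.
        i = int(np.argmin([np.linalg.norm(n - sample) for n in nodes]))
        direction = sample - nodes[i]
        dist = np.linalg.norm(direction)
        if dist < 1e-9:
            continue
        new = nodes[i] + direction / dist * min(step, dist)
        if not is_free(new):
            continue
        nodes.append(new)
        parents.append(i)
        if np.linalg.norm(new - goal) < goal_tol:
            # Walk parent pointers back to the root to recover the route.
            path, j = [], len(nodes) - 1
            while j != -1:
                path.append(nodes[j])
                j = parents[j]
            return path[::-1]
    return None

def draw_path_overlay(frame, path_xy, K, rvec, tvec):
    """Project ground-plane waypoints (z = 0) into the image and draw the route."""
    pts3d = np.array([[p[0], p[1], 0.0] for p in path_xy], dtype=np.float32)
    pts2d, _ = cv2.projectPoints(pts3d, rvec, tvec, K, np.zeros(5))
    pts2d = pts2d.reshape(-1, 2)
    for a, b in zip(pts2d[:-1], pts2d[1:]):
        cv2.line(frame, tuple(map(int, a)), tuple(map(int, b)), (0, 255, 0), 2)
    return frame
```

Under these assumptions, a caller would compute path = rrt_plan((0, 0), (4, 3), is_free, (np.zeros(2), np.full(2, 5.0))) in the lab's world frame and then apply draw_path_overlay to each captured frame. The virtual adaptations described in the abstract would enter through is_free, which can report collisions against obstacles that exist only in the adapted environment rather than in the physical lab.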
