Optical flow for self-supervised learning of obstacle appearance

We introduce a novel setup of self-supervised learning (SSL) in which optical flow provides the supervised outputs. Optical flow only yields obstacle information when the camera moves sufficiently; the main advantage of the introduced method is that, after learning, a robot can detect obstacles without moving, reducing the risk of collisions in narrow spaces. We investigate this novel setup of SSL in the context of a Micro Air Vehicle (MAV) that needs to select a suitable landing place. Initially, when the MAV flies over a potential landing area, the optical flow processing estimates a 'surface roughness' measure, capturing whether obstacles stick out of the landing surface. This measure allows the MAV to select a safe landing place and then land using other optical flow measures, such as the divergence. During flight, SSL takes place: for each image, a texton distribution is extracted (capturing the visual appearance of the landing surface in sight) and mapped to the current roughness value by a linear regression function. We first demonstrate that this principle works in offline tests on images captured on board an MAV, and then demonstrate it in flight. The experiments show that the MAV can land safely on the basis of optical flow, and that after learning it can also successfully select safe landing spots while hovering. The learned appearance mapping even allows pixel-wise segmentation of obstacles.
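The learning scheme described above can be sketched in a few lines: texton histograms summarize image appearance, and a linear regression maps them to the flow-based roughness label. The sketch below uses synthetic images, a random texton dictionary, and random roughness labels purely as placeholders (in the actual system, textons come from a learned dictionary and roughness from optical flow during flight); the function and variable names are illustrative, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(42)

def texton_histogram(image, textons, patch=5, samples=200):
    """Sample small patches, assign each to its nearest texton,
    and return the normalized texton distribution."""
    h, w = image.shape
    hist = np.zeros(len(textons))
    for _ in range(samples):
        y = rng.integers(0, h - patch)
        x = rng.integers(0, w - patch)
        p = image[y:y + patch, x:x + patch].ravel()
        hist[np.argmin(((textons - p) ** 2).sum(axis=1))] += 1
    return hist / samples

# Placeholder data: a texton dictionary of 5x5 patches and
# (image, roughness) pairs; in flight, the roughness labels
# would come from the optical-flow roughness estimate.
textons = rng.random((10, 25))
images = [rng.random((64, 64)) for _ in range(30)]
roughness = rng.random(30)

# Linear regression from appearance (texton histograms) to roughness.
X = np.stack([texton_histogram(im, textons) for im in images])
X = np.hstack([X, np.ones((len(X), 1))])           # bias term
w, *_ = np.linalg.lstsq(X, roughness, rcond=None)  # least squares fit

# After learning, roughness is predicted from appearance alone,
# i.e. from a single still image while the MAV hovers.
pred = X @ w
```

The key design point is that the regression input is a compact appearance statistic rather than the raw image, which keeps the mapping cheap enough to learn and evaluate on board a small MAV.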
