Efficient velocity estimation for MAVs by fusing motion from two frontally parallel cameras

Abstract Efficient velocity estimation is crucial for the robust operation of the navigation control loops of micro aerial vehicles (MAVs). Motivated by research on how animals exploit their visual topographies for rapid locomotion, we propose a bio-inspired method that applies the quasi-parallax technique to estimate the velocity of an MAV equipped with a forward-looking stereo camera, without GPS. Unlike existing optical-flow-based methods, our method achieves efficient metric velocity estimation without requiring depth information from either additional distance sensors or stereopsis. In particular, the quasi-parallax technique, which extracts maximal benefit from the configuration of two frontally parallel cameras, leverages pairs of parallel visual rays to eliminate rotational flow for translational velocity estimation, and then refines the rotational and translational velocity estimates iteratively and alternately. Our method fuses the motion information from the two frontally parallel cameras without performing correspondence matching, achieving enhanced robustness and efficiency. Extensive experiments on synthetic and real scenes demonstrate the effectiveness and efficiency of our method.
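The rotation-cancellation idea behind quasi-parallax can be illustrated numerically. Under the standard instantaneous pinhole flow model, the rotational component of optical flow depends only on the image coordinates and the angular velocity, not on scene depth, so two rigidly mounted cameras with parallel optical axes see identical rotational flow at the same normalized pixel. Subtracting the two flows at that pixel therefore leaves a purely translational (depth-dependent) residual. The sketch below is an illustrative toy model, not the paper's implementation; the specific motion values, baseline, and depths are invented for the demonstration.

```python
import numpy as np

def flow(x, y, Z, V, w):
    """Instantaneous optical flow at normalized image point (x, y)
    for a pinhole camera with linear velocity V, angular velocity w,
    and scene depth Z (standard Longuet-Higgins/Prazdny model)."""
    Vx, Vy, Vz = V
    wx, wy, wz = w
    # translational component: scales with inverse depth
    u_t = (-Vx + x * Vz) / Z
    v_t = (-Vy + y * Vz) / Z
    # rotational component: independent of depth
    u_r = x * y * wx - (1 + x**2) * wy + y * wz
    v_r = (1 + y**2) * wx - x * y * wy - x * wz
    return np.array([u_t + u_r, v_t + v_r])

# One rigid body carrying both cameras (hypothetical motion values).
w = np.array([0.02, -0.01, 0.03])        # angular velocity (rad/s)
V_left = np.array([0.5, 0.1, 1.0])       # left-camera linear velocity (m/s)
b = np.array([0.12, 0.0, 0.0])           # baseline offset of right camera (m)
V_right = V_left + np.cross(w, b)        # rigid-body velocity transfer

# Same normalized pixel in both images (a pair of parallel visual rays),
# but each camera sees a different depth along its ray.
x, y = 0.3, -0.2
Z_left, Z_right = 4.0, 4.3

f_left = flow(x, y, Z_left, V_left, w)
f_right = flow(x, y, Z_right, V_right, w)
diff = f_right - f_left                  # quasi-parallax flow difference

# The difference equals the difference of the purely translational flows:
# the rotational term has cancelled exactly.
pure_trans = (flow(x, y, Z_right, V_right, np.zeros(3))
              - flow(x, y, Z_left, V_left, np.zeros(3)))
assert np.allclose(diff, pure_trans)
```

Because the residual `diff` no longer contains the rotational field, it constrains only the translational velocity (up to the known baseline and the unknown depths), which is what makes a rotation-free translational estimate, followed by alternating refinement of rotation and translation, possible.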
