Optimum Camera Angle for Optic Flow-Based Centering Response

We present analytical and empirical investigations into the optimum camera angle for the optic flow-based centering response. This technique is commonly used to guide both ground-based and aerial robots between obstacles. A variety of camera angles have been implemented by researchers in the past, but surprisingly little mention is made of the motivation for these choices, nor has an investigation into the optimum camera angle been conducted. Our investigation shows that camera angle plays a key role in the performance of control strategies for the centering response, and both empirical and analytical results show the optimum camera angle to be 45 degrees when traveling between parallel obstacles.
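The centering response described above balances the optic flow magnitudes seen by a pair of cameras angled symmetrically to either side of the heading. A minimal sketch of the idea is below; the function names, the normalized-difference control law, and the specific gain are illustrative assumptions, not the paper's implementation. The `wall_flow_magnitude` helper uses the standard translational optic flow relation (flow = v·sin θ / r for a point at bearing θ and range r), applied to a wall parallel to the heading.

```python
import math

def wall_flow_magnitude(v, d_perp, theta):
    """Translational optic flow (rad/s) along the optical axis of a
    camera angled theta (rad) from the heading, viewing a wall parallel
    to the heading at perpendicular distance d_perp, at speed v.

    The ray at bearing theta meets the wall at range r = d_perp / sin(theta);
    flow for a point at bearing theta is (v / r) * sin(theta), so
    omega = v * sin(theta)**2 / d_perp.
    """
    return v * math.sin(theta) ** 2 / d_perp

def centering_command(flow_left, flow_right, gain=1.0):
    """Yaw-rate command that steers toward the side with lower flow.

    flow_left / flow_right: mean optic-flow magnitudes from the two
    side-angled cameras. Normalizing by the total makes the command
    largely independent of forward speed. Sign convention (assumed):
    higher flow on the left (closer left wall) gives a negative
    command, i.e. steer right.
    """
    total = flow_left + flow_right
    if total == 0.0:
        return 0.0
    return gain * (flow_right - flow_left) / total
```

For example, a robot displaced toward the left wall sees larger left-side flow and receives a rightward command; when centered, the flows balance and the command is zero.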
