Quantifying the Impact of the Physical Setup of Stereo Camera Systems on Distance Estimations

The ability to perceive the environment accurately is a core requirement for autonomous navigation. In the past, researchers and practitioners have explored a broad spectrum of sensors that can be used to detect obstacles or to recognize navigation targets. Due to their low hardware cost and high fidelity, stereo camera systems are often considered a particularly versatile sensing technology, and there has been substantial work on integrating them into mobile robots. However, the existing literature focuses on the concepts and algorithms used to implement the desired robot functions on top of a given camera setup; the rationale for choosing that setup, and its impact on the results, are usually neither discussed nor described. Thus, when designing the stereo camera system for a mobile robot, there is little general guidance beyond isolated setups that worked for a specific robot. To close this gap, this paper studies the impact of the physical setup of a stereo camera system in indoor environments. We present the results of an experimental analysis in which we use a fixed software setup to estimate the distance to an object while systematically varying the three main parameters of the physical camera setup: the angle between the cameras, the distance (baseline) between them, and the field of view. Based on the results, we derive several guidelines on how to choose these parameters for a particular application.
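To make the role of these parameters concrete, the following sketch shows the textbook relation for a rectified, parallel stereo pair: distance is proportional to the baseline and the focal length (which the field of view determines) and inversely proportional to disparity. This is the standard pinhole-camera model, not necessarily the exact pipeline used in the paper, and the numeric values (image width, FOV, baseline, disparity) are illustrative assumptions only.

```python
import math

def focal_length_px(image_width_px: int, horizontal_fov_deg: float) -> float:
    """Focal length in pixels of a pinhole camera with the given horizontal FOV.

    A wider FOV yields a shorter focal length, which reduces disparity for a
    given distance and thus degrades depth resolution.
    """
    return (image_width_px / 2) / math.tan(math.radians(horizontal_fov_deg) / 2)

def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Distance Z = f * b / d for a rectified, parallel stereo pair.

    A larger baseline b increases disparity d at a given distance, improving
    the achievable depth resolution (at the cost of a larger blind zone).
    """
    return focal_px * baseline_m / disparity_px

# Illustrative example: 1280 px wide images, 60 degree horizontal FOV,
# 12 cm baseline, and a measured disparity of 40 px.
f = focal_length_px(1280, 60.0)           # about 1108.5 px
z = depth_from_disparity(f, 0.12, 40.0)   # about 3.33 m
```

The sketch covers only the parallel case; when the cameras are verged (angled toward each other), the images must first be rectified before this relation applies, which is one reason the camera angle studied in the paper matters.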