A novel approach for a double-check of passable vegetation detection in autonomous ground vehicles

The paper introduces an active method for detecting vegetation in front of the vehicle in order to support better navigation decision-making. Blowing devices are used to create a strong airflow that agitates the vegetation. Motion compensation and motion detection techniques are then applied to detect moving foreground objects, which are presumed to be vegetation. The approach enables a double-check of vegetation detection previously performed by a multi-spectral method, with particular emphasis on detecting passable vegetation. In all real-world experiments we carried out, the approach yields a detection accuracy of over 98%. We furthermore illustrate how this active approach can improve the navigation capabilities of autonomous ground vehicles.
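
The sketch below illustrates the general motion-compensation plus motion-detection idea described in the abstract: estimate the camera's ego-motion between two frames, warp one frame onto the other, and flag pixels with large residual motion as wind-agitated foreground (presumed vegetation). It is a minimal illustration only, assuming OpenCV, ORB/RANSAC homography estimation, and Farnebäck dense optical flow; the paper's actual pipeline and parameters are not specified here, and the threshold value is a hypothetical placeholder.

```python
import cv2
import numpy as np

def residual_motion_mask(prev_gray, curr_gray, flow_thresh=2.0):
    """Flag wind-agitated foreground (presumed vegetation) between two grayscale frames.

    Steps (illustrative, not the paper's exact method):
      1. Estimate a global homography from sparse feature matches to
         approximate the vehicle's ego-motion.
      2. Warp the previous frame into the current frame's coordinates
         (motion compensation).
      3. Compute dense optical flow on the compensated pair; pixels whose
         residual flow magnitude exceeds `flow_thresh` are treated as
         independently moving, i.e. vegetation stirred by the blower.
    """
    # 1. Sparse matches for global (ego-)motion estimation.
    orb = cv2.ORB_create(1000)
    kp1, des1 = orb.detectAndCompute(prev_gray, None)
    kp2, des2 = orb.detectAndCompute(curr_gray, None)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(des1, des2)
    src = np.float32([kp1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 3.0)

    # 2. Motion compensation: warp the previous frame onto the current one.
    h, w = curr_gray.shape
    prev_warped = cv2.warpPerspective(prev_gray, H, (w, h))

    # 3. Dense residual flow (Farneback) on the compensated pair.
    flow = cv2.calcOpticalFlowFarneback(prev_warped, curr_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    mag = np.linalg.norm(flow, axis=2)

    # Pixels with residual motion above the threshold form the vegetation mask.
    return (mag > flow_thresh).astype(np.uint8) * 255
```

In practice such a mask would be intersected with the output of a multi-spectral vegetation classifier to realize the double-check described above.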
