RBF-Based Monocular Vision Navigation for Small Vehicles in Narrow Space below Maize Canopy

Maize is one of the major food crops in China. Field operations are traditionally performed by manual labor, exposing farmers to a harsh working environment and to pesticides. At the same time, large machinery struggles to maneuver in the field because of the limited space between rows, particularly during the middle and late growth stages of maize. Unmanned, compact agricultural machines are therefore well suited to such field work. This paper describes a monocular visual recognition method for navigating small vehicles between narrow crop rows. Edge detection and noise elimination were used to segment the images and extract the stalks. The stalk coordinates define the passable boundaries, and a simplified radial basis function (RBF)-based algorithm was adapted for path planning to improve tolerance to errors in stalk coordinate extraction. The average image processing time, including network latency, is 220 ms, and the average path planning time is 30 ms. This fast processing supports a top speed of 2 m/s for our prototype vehicle. When operating at the normal speed of 0.7 m/s, the rate of collision with stalks is under 6.4%. Additional simulations and field tests further confirmed the feasibility and fault tolerance of the method.
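The abstract only outlines the pipeline, so the following is a minimal sketch of how such a stalk-extraction and RBF path-planning chain could be assembled with standard tools (OpenCV and SciPy). It is not the authors' implementation: the Canny thresholds, morphological kernel, contour area filter, Gaussian RBF with smoothing, the left/right split at the image midline, and the input file name `maize_row.jpg` are all illustrative assumptions, and a plain RBF boundary fit stands in for the paper's simplified RBF-based algorithm.

```python
"""Illustrative sketch of edge-based stalk extraction followed by an RBF fit
of the row boundaries to obtain a centerline path. Parameters are assumed."""
import cv2
import numpy as np
from scipy.interpolate import Rbf


def extract_stalk_points(bgr_image):
    """Segment stalk edges and return (x, y) centroids of candidate stalks."""
    gray = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)                    # assumed thresholds
    # Noise elimination: close small gaps, then drop tiny blobs by area.
    kernel = np.ones((5, 5), np.uint8)
    cleaned = cv2.morphologyEx(edges, cv2.MORPH_CLOSE, kernel)
    contours, _ = cv2.findContours(cleaned, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    points = []
    for c in contours:
        if cv2.contourArea(c) < 100:                    # assumed area filter
            continue
        m = cv2.moments(c)
        points.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
    return np.array(points)


def plan_center_path(stalk_points, image_width, n_samples=20):
    """Fit smooth left/right boundaries with RBFs and return centerline points."""
    xs, ys = stalk_points[:, 0], stalk_points[:, 1]
    left = stalk_points[xs < image_width / 2]           # assumed midline split
    right = stalk_points[xs >= image_width / 2]
    # One RBF per boundary: column x as a smooth function of image row y.
    rbf_left = Rbf(left[:, 1], left[:, 0], function="gaussian", smooth=5)
    rbf_right = Rbf(right[:, 1], right[:, 0], function="gaussian", smooth=5)
    y_samples = np.linspace(ys.min(), ys.max(), n_samples)
    center_x = 0.5 * (rbf_left(y_samples) + rbf_right(y_samples))
    return np.column_stack([center_x, y_samples])


if __name__ == "__main__":
    frame = cv2.imread("maize_row.jpg")                 # hypothetical input frame
    stalks = extract_stalk_points(frame)
    path = plan_center_path(stalks, frame.shape[1])
    print(path)
```

Averaging the two boundary fits keeps the planned path centered between the rows even when individual stalk centroids are missed or misplaced, which is the kind of fault tolerance the abstract attributes to the RBF-based planner.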
