Autonomous navigation for a wolfberry picking robot using visual cues and fuzzy control

Abstract Lycium barbarum, commonly known as wolfberry or goji, is considered an important ingredient in Japanese, Korean, Vietnamese, and Chinese food and medicine. It is cultivated extensively in these countries and is usually harvested manually, which is a labor-intensive and tedious task. To improve harvesting efficiency and reduce manual labor, automatic harvesting technology has been investigated by many researchers in recent years. In this paper, an autonomous navigation algorithm using visual cues and fuzzy control is proposed for wolfberry orchards. First, we propose a new weighted grayscale transformation (2.4B - 0.9G - R) to convert the color image into a grayscale image for better identification of the trunk of Lycium barbarum, and the minimum circumscribed rectangle is used to describe the trunk contours. Then, the least-squares method is applied to the contour points to fit the navigation line, and a region of interest (ROI) is computed that improves the real-time performance of the system. Finally, a set of fuzzy controllers for the pneumatic steering system is designed to achieve real-time autonomous navigation in the wolfberry orchard. Static image experiments show that the average accuracy of the algorithm is above 90% and the average processing time is approximately 162 ms, indicating good robustness and real-time performance. Field experiments show that at a speed of 1 km/h the maximum lateral deviation is less than 6.2 cm and the average lateral deviation is 2.9 cm, which meets the requirements of automatic picking by a wolfberry picking robot in real-world environments.
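The two image-processing steps named in the abstract, the weighted grayscale transformation 2.4B - 0.9G - R and the least-squares fit of the navigation line to contour points, can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes OpenCV-style BGR image arrays, and the function names are hypothetical. Fitting x as a function of y is assumed here because trunk navigation lines are near-vertical in the image.

```python
import numpy as np

def trunk_grayscale(img_bgr):
    """Weighted grayscale transform (2.4B - 0.9G - R) to emphasize trunks."""
    b = img_bgr[..., 0].astype(np.float32)
    g = img_bgr[..., 1].astype(np.float32)
    r = img_bgr[..., 2].astype(np.float32)
    gray = 2.4 * b - 0.9 * g - r
    # Negative responses (non-trunk pixels) are clipped to the 8-bit range.
    return np.clip(gray, 0, 255).astype(np.uint8)

def fit_navigation_line(points):
    """Least-squares fit x = a*y + b over contour points (x_i, y_i)."""
    pts = np.asarray(points, dtype=np.float64)
    x, y = pts[:, 0], pts[:, 1]
    a, b = np.polyfit(y, x, 1)
    return a, b
```

The fitted (a, b) pair gives the navigation line in image coordinates; its offset and angle relative to the image center would then serve as the lateral-deviation and heading inputs to the fuzzy steering controllers.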
