Research on Vision-Based Navigation for Plant Protection UAV under the Near Color Background

GPS (Global Positioning System) navigation in agriculture faces many challenges, such as weak signals in orchards and high cost for small plots of farmland. With falling camera costs and the emergence of powerful visual algorithms, visual navigation can solve these problems. Visual navigation is a navigation technology that uses cameras to sense environmental information as the basis for aircraft flight. It consists of five main parts: image acquisition, landmark recognition, route planning, flight control, and obstacle avoidance. Here, landmarks are plant canopies, buildings, mountains, and rivers with unique geographical characteristics in a given place. During visual navigation, landmark location and route tracking are the key links. When there are significant color differences (for example, among red, green, and blue) between a landmark and the background, the landmark can be recognized with classical visual algorithms. However, when the color differences are non-significant (for example, between dark green and vivid green), there is no robust, high-precision method for landmark identification. To address this problem, visual navigation in a maize field is studied. First, a block recognition method based on a fine-tuned Inception-V3 network is developed; then, the maize canopy landmark is recognized with this method; finally, local navigation lines are extracted from the landmarks based on the grayscale gradient law of the maize canopy. The results show that the block recognition accuracy is 0.9501. With 256 blocks, the method achieves the best segmentation: the average segmentation quality is 0.87, and the processing time is 0.251 s. This study suggests that stable visual semantic navigation can be achieved under a near color background, providing an important reference for the navigation of plant protection UAVs (Unmanned Aerial Vehicles).
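The abstract does not give implementation details, so the block-classification step can only be illustrated as a minimal sketch. The example below assumes a TensorFlow/Keras environment, a hypothetical directory of labelled image blocks ("canopy" vs. "background") cropped from maize-field photos, and arbitrary hyperparameters; it is not the authors' code.

```python
# Minimal sketch: fine-tuning an ImageNet-pretrained Inception-V3 to classify
# image blocks as maize canopy vs. background. Paths, labels, and settings are
# assumptions for illustration only.
import tensorflow as tf

IMG_SIZE = (299, 299)   # Inception-V3's native input size
NUM_CLASSES = 2         # canopy block vs. background block (assumed labels)

# Load Inception-V3 pre-trained on ImageNet, without its classification head.
base = tf.keras.applications.InceptionV3(
    include_top=False, weights="imagenet",
    input_shape=IMG_SIZE + (3,), pooling="avg",
)
base.trainable = False  # freeze the convolutional base; train only the new head

# Attach a small classification head for the two block classes.
model = tf.keras.Sequential([
    tf.keras.layers.Rescaling(1.0 / 127.5, offset=-1),  # scale pixels to [-1, 1] as Inception expects
    base,
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Hypothetical dataset of labelled blocks (directory name is an assumption).
train_ds = tf.keras.utils.image_dataset_from_directory(
    "blocks/train", image_size=IMG_SIZE, batch_size=32
)
model.fit(train_ds, epochs=5)
```

Freezing the pretrained base and training only the new head is a common first stage of fine-tuning; the upper Inception blocks can later be unfrozen with a small learning rate, which is closer to what "fine-tuned Inception-V3" usually means in practice.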
