Transfer learning between crop types for semantic segmentation of crops versus weeds in precision agriculture

Agricultural robots rely on semantic segmentation to distinguish crops from weeds so that they can apply selective treatments, increasing yield and crop health while reducing the amount of chemicals used. Deep learning approaches have recently achieved both excellent classification performance and real-time execution. However, these techniques rely on large amounts of labelled training data, which are scarce in precision agriculture and require a substantial labelling effort to produce. Additional design effort is required to achieve commercially viable performance levels under varying environmental conditions and crop growth stages. In this paper, we explore the role of knowledge transfer between deep-learning-based classifiers for different crop types, with the goal of reducing the retraining time and labelling effort required for a new crop. We examine classification performance on three datasets with different crop types and a variety of weeds, and compare the performance and retraining effort required when using data labelled at pixel level against partially labelled data obtained through a less time-consuming procedure of annotating the segmentation output. We show that transfer learning between different crop types is possible and reduces training times by up to $80\%$. Furthermore, we show that even when the data used for retraining are imperfectly annotated, classification performance is within $2\%$ of that of networks trained with laboriously annotated pixel-precision data.
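The core idea, warm-starting a classifier for a new crop from weights trained on another crop, can be illustrated with a deliberately small sketch. Everything here is hypothetical: synthetic two-feature "pixels" stand in for image data, and a logistic-regression model stands in for the segmentation network; none of it reflects the paper's actual architectures or datasets.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_task(angle, n=400):
    # Hypothetical 2-feature "pixels"; the crop/weed boundary is a
    # line through the origin rotated by `angle` radians.
    X = rng.normal(size=(n, 2))
    w_true = np.array([np.cos(angle), np.sin(angle)])
    y = (X @ w_true > 0).astype(float)
    return X, y

def loss_and_grad(w, X, y):
    # Logistic (cross-entropy) loss and its gradient.
    p = 1.0 / (1.0 + np.exp(-(X @ w)))
    loss = -np.mean(y * np.log(p + 1e-12) + (1.0 - y) * np.log(1.0 - p + 1e-12))
    grad = X.T @ (p - y) / len(y)
    return loss, grad

def steps_until(w, X, y, lr=0.5, tol=0.3, max_steps=2000):
    # Number of gradient steps needed to push the loss below `tol`.
    for step in range(max_steps):
        loss, grad = loss_and_grad(w, X, y)
        if loss < tol:
            return step
        w = w - lr * grad
    return max_steps

# "Source crop": train from scratch on one decision boundary.
Xa, ya = make_task(0.0)
w_pre = np.zeros(2)
for _ in range(200):
    _, g = loss_and_grad(w_pre, Xa, ya)
    w_pre -= 0.5 * g

# "Target crop": a nearby but different boundary (0.15 rad rotation).
Xb, yb = make_task(0.15)
l0_scratch, _ = loss_and_grad(np.zeros(2), Xb, yb)   # cold start
l0_transfer, _ = loss_and_grad(w_pre, Xb, yb)        # warm start
steps_scratch = steps_until(np.zeros(2), Xb, yb)
steps_transfer = steps_until(w_pre.copy(), Xb, yb)
print(f"initial loss: scratch={l0_scratch:.3f}, transfer={l0_transfer:.3f}")
print(f"steps to converge: scratch={steps_scratch}, transfer={steps_transfer}")
```

In the paper's setting the warm start is a full segmentation network rather than a linear model, but the mechanism is the same one credited with the reported reduction in training time: weights learned on one crop start close to a good solution for a related crop, so fewer optimisation steps, and less labelled data, are needed for the new one.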
