Towards weeds identification assistance through transfer learning

Abstract Reducing the use of pesticides through selective spraying is an important component of more sustainable, computer-assisted agriculture. Weed identification at an early growth stage contributes to reduced herbicide rates. However, while computer vision combined with deep learning has surpassed approaches based on hand-crafted features, several challenges remain open in the development of a reliable automatic plant identification system. Such systems have to account for different sources of variability, such as growth stages and soil conditions, under the added constraint of the limited size of typical datasets. This study proposes a novel crop/weed identification system that combines fine-tuning of pre-trained convolutional networks (Xception, Inception-ResNet, VGG networks, MobileNet and DenseNet) with “traditional” machine learning classifiers (Support Vector Machines, XGBoost and Logistic Regression) trained on the deep features extracted beforehand. The aim of this approach was to avoid overfitting and to obtain robust and consistent performance. To evaluate the approach, an open-access dataset was generated comprising two crops [tomato (Solanum lycopersicum L.) and cotton (Gossypium hirsutum L.)] and two weed species [black nightshade (Solanum nigrum L.) and velvetleaf (Abutilon theophrasti Medik.)]. The pictures were taken with RGB cameras at different production sites across Greece under naturally variable light conditions. The results revealed that the combination of a fine-tuned DenseNet and a Support Vector Machine achieved a micro F1 score of 99.29%, with a very small performance gap between the training and test sets. The other evaluated approaches also consistently achieved F1 scores above 95%. Additionally, the analysis of the results provides heuristics for designing transfer-learning-based systems that avoid overfitting without sacrificing performance.
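
To make the described pipeline concrete, the following is a minimal sketch (in Python with TensorFlow/Keras and scikit-learn) of how such a system could be assembled: an ImageNet-pretrained DenseNet backbone is fine-tuned on the crop/weed images, its penultimate pooled layer is reused as a feature extractor, and an SVM is trained on the resulting deep features. The input resolution, training settings and classifier hyperparameters below are illustrative assumptions, not values taken from the study.

```python
# Sketch of the fine-tune-then-classify pipeline (not the authors' exact code):
# 1) fine-tune a pretrained CNN on the crop/weed images,
# 2) reuse its penultimate layer as a deep feature extractor,
# 3) train a classical classifier (here an SVM) on those features.
import tensorflow as tf
from sklearn.svm import SVC
from sklearn.metrics import f1_score

NUM_CLASSES = 4            # tomato, cotton, black nightshade, velvetleaf
IMG_SHAPE = (224, 224, 3)  # assumed input resolution

def build_finetune_model():
    base = tf.keras.applications.DenseNet121(
        weights="imagenet", include_top=False, input_shape=IMG_SHAPE)
    base.trainable = True  # fine-tune all layers (early blocks could also be frozen)
    x = tf.keras.layers.GlobalAveragePooling2D()(base.output)
    out = tf.keras.layers.Dense(NUM_CLASSES, activation="softmax")(x)
    return tf.keras.Model(base.input, out)

def finetune_and_extract(x_train, y_train, x_test):
    """Fine-tune the CNN, then return deep features for both splits."""
    model = build_finetune_model()
    model.compile(optimizer=tf.keras.optimizers.Adam(1e-4),
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    model.fit(x_train, y_train, epochs=5, batch_size=32, verbose=0)
    # Feature extractor = everything up to the pooled penultimate layer.
    extractor = tf.keras.Model(model.input, model.layers[-2].output)
    return extractor.predict(x_train), extractor.predict(x_test)

def svm_on_deep_features(f_train, y_train, f_test, y_test):
    """Train an SVM on the deep features and report the micro F1 score."""
    svm = SVC(kernel="rbf", C=1.0)
    svm.fit(f_train, y_train)
    return f1_score(y_test, svm.predict(f_test), average="micro")

# Hypothetical usage with pre-loaded arrays of preprocessed RGB images:
# f_tr, f_te = finetune_and_extract(x_train, y_train, x_test)
# print("micro F1:", svm_on_deep_features(f_tr, y_train, f_te, y_test))
```

The same feature arrays could equally be fed to an XGBoost or Logistic Regression classifier, which is how the other combinations mentioned in the abstract would be evaluated.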
