Transfer learning for the classification of sugar beet and volunteer potato under field conditions

Classification of weeds amongst cash crops is a core task in automated weed control. Addressing volunteer potato control in sugar beet, the EU SmartBot project aimed to control more than 95% of volunteer potatoes while keeping undesired control of sugar beet plants below 5%. Deep learning is a promising way to meet these requirements, but training an entire network from scratch requires a large dataset and a substantial amount of time; in this situation, transfer learning can be a promising solution. This study first evaluates a transfer-learning procedure with three different implementations of AlexNet and then compares the performance of six network architectures: AlexNet, VGG-19, GoogLeNet, ResNet-50, ResNet-101 and Inception-v3. All networks were pre-trained on the ImageNet dataset and were used to classify images of sugar beet and volunteer potato taken under varying ambient light conditions in agricultural environments. Among the AlexNet implementations, the highest classification accuracy was 98.0%, obtained with an AlexNet architecture modified to generate binary output. Across the six architectures, the highest classification accuracy was 98.7%, obtained with VGG-19 modified to generate binary output. Transfer learning proved effective and showed robust performance on plant images acquired in different periods of various years on two types of soil. All scenarios and pre-trained networks were feasible for real-time applications (classification time
