A Unified Model for Real-Time Crop Recognition and Stem Localization Exploiting Cross-Task Feature Fusion

Robotic mechanical weed control is a promising way to reduce herbicide use. Efficient and accurate detection of crop stems is a prerequisite for most robotic mechanical weeding machines. This paper proposes a unified convolutional neural network, called UniStemNet, for real-time crop recognition and stem detection. UniStemNet consists of a backbone network and two subnets that perform the two tasks simultaneously. Because the targets of the two tasks differ in scale, a varied-span feature fusion structure is built into the subnets. To improve stem detection performance, a cross-task feature fusion strategy is devised that introduces top-down guidance from the crop recognition subnet into the stem detection subnet. Experimental results demonstrate that UniStemNet significantly outperforms the state-of-the-art crop stem detection method and performs comparably with leading-edge crop recognition methods. The results also validate the remarkable effect of the cross-task feature fusion strategy on stem detection performance. UniStemNet processes a 400×300 image within 6 ms. The code and dataset are available at https://github.com/ZhangXG001/Real-Time-Crop-Recognition-and-Stem-Localization.git.
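The cross-task fusion idea described above, top-down guidance flowing from the recognition subnet into the stem detection subnet, can be sketched in a simplified form: upsample the coarser recognition features to the stem-subnet resolution, concatenate along the channel axis, and mix with a 1×1 convolution. All shapes, names, and the nearest-neighbor upsampling here are illustrative assumptions for a minimal numpy sketch, not the paper's actual implementation.

```python
import numpy as np

def upsample2x(feat):
    """Nearest-neighbor 2x upsampling of a (C, H, W) feature map."""
    return feat.repeat(2, axis=1).repeat(2, axis=2)

def cross_task_fusion(stem_feat, recog_feat, w):
    """Fuse top-down guidance from the recognition subnet into the
    stem-detection subnet.

    stem_feat:  (C_s, H, W)       stem-subnet features
    recog_feat: (C_r, H//2, W//2) coarser recognition-subnet features
    w:          (C_out, C_s+C_r)  1x1-conv weights (per-pixel mixing)
    """
    guidance = upsample2x(recog_feat)                       # (C_r, H, W)
    fused = np.concatenate([stem_feat, guidance], axis=0)   # (C_s+C_r, H, W)
    c, h, width = fused.shape
    out = w @ fused.reshape(c, h * width)                   # 1x1 conv as matmul
    return out.reshape(w.shape[0], h, width)

# Illustrative shapes only
stem = np.random.rand(16, 8, 8)      # fine-resolution stem features
recog = np.random.rand(32, 4, 4)     # coarse recognition features
weights = np.random.rand(8, 16 + 32)
out = cross_task_fusion(stem, recog, weights)
print(out.shape)  # (8, 8, 8)
```

In a real network the 1×1 mixing would be a learned convolution and the upsampling bilinear or learned, but the data flow, recognition features guiding the stem branch at matched resolution, is the same.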
