In-field tea shoot detection and 3D localization using an RGB-D camera

Abstract Tea shoot detection and localization are highly challenging tasks because of varying illumination, unavoidable occlusion, tiny targets, and dense growth. To enable the automatic plucking of tea shoots in a tea garden, a reliable algorithm based on red, green, blue and depth (RGB-D) camera images was developed to detect and locate tea shoots in the field for tea-harvesting robots. In this study, labeling criteria were first established for images collected across multiple periods and varieties in the tea garden. A “you only look once” (YOLO) network was then used to detect tea shoot (one bud with one leaf) regions in the RGB images captured by an RGB-D camera, achieving a detection precision of 93.1% and a recall of 89.3%. To achieve three-dimensional (3D) localization of the plucking position, 3D point clouds of the detected target regions were acquired by fusing the depth and RGB images from the camera. Noise was then removed by point cloud pre-processing, and the tea shoot point cloud was extracted using Euclidean clustering and a target point cloud extraction algorithm. Finally, the 3D plucking position was determined by combining tea growth characteristics, point cloud features, and a sleeve-plucking scheme, which addresses the problem that the plucking point may be invisible in the field. To verify the effectiveness of the proposed algorithm, localization and plucking experiments were conducted in the tea garden: the plucking success rate was 83.18% and the average localization time per target was about 24 ms. These results demonstrate that the proposed method can support robotic tea plucking.
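The localization stage described above (back-project depth pixels of a detected region to 3D, then isolate the shoot with Euclidean clustering) can be sketched as follows. This is a minimal illustration under stated assumptions, not the paper's implementation: the camera intrinsics (`fx`, `fy`, `cx`, `cy`), the clustering radius, and the greedy clustering routine are all placeholders, and a real system would typically use a library such as PCL or Open3D for these steps.

```python
import numpy as np

def deproject(depth, fx, fy, cx, cy):
    """Back-project a depth image (metres) to an N x 3 point cloud
    with the pinhole camera model; zero-depth pixels are discarded."""
    h, w = depth.shape
    us, vs = np.meshgrid(np.arange(w), np.arange(h))  # pixel grid
    z = depth.ravel()
    valid = z > 0
    x = (us.ravel() - cx) * z / fx
    y = (vs.ravel() - cy) * z / fy
    return np.column_stack([x, y, z])[valid]

def euclidean_clusters(points, radius=0.02):
    """Greedy Euclidean clustering (a simple stand-in for PCL-style
    EuclideanClusterExtraction): grow each cluster by repeatedly adding
    every point within `radius` of a point already in the cluster."""
    unvisited = set(range(len(points)))
    clusters = []
    while unvisited:
        seed = unvisited.pop()
        cluster, frontier = [seed], [seed]
        while frontier:
            idx = frontier.pop()
            rest = list(unvisited)  # snapshot for stable indexing
            if not rest:
                continue
            dist = np.linalg.norm(points[rest] - points[idx], axis=1)
            near = [j for j, d in zip(rest, dist) if d <= radius]
            unvisited.difference_update(near)
            cluster.extend(near)
            frontier.extend(near)
        clusters.append(np.array(cluster))
    return clusters

# Usage on a toy 4x4 depth crop containing two surfaces at different depths:
depth = np.zeros((4, 4))
depth[:2, :2] = 0.5   # near surface (e.g. the shoot)
depth[2:, 2:] = 0.8   # far surface (background foliage)
pts = deproject(depth, fx=100.0, fy=100.0, cx=2.0, cy=2.0)
clusters = euclidean_clusters(pts, radius=0.01)
target = max(clusters, key=len)            # keep the largest cluster
centroid = pts[target].mean(axis=0)        # candidate 3D reference point
```

In the paper, the final plucking position is not simply the cluster centroid: it is refined from tea growth characteristics and the sleeve-plucking scheme, which is what makes localization possible even when the stem itself is occluded.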
