High-Accuracy Adaptive Low-Cost Location Sensing Subsystems for Autonomous Rover in Precision Agriculture

With the advancement of artificial intelligence, more and more tasks will be taken over by robots. The future of precision agriculture (PA) will rely on autonomous robots to perform various agricultural operations. Real-time kinematic (RTK)-assisted global positioning systems (GPS) can provide very accurate localization information, with a detection error of less than $\pm 2$ cm under ideal conditions. Autonomously driving a robotic vehicle within a furrow, however, requires relative localization of the vehicle with respect to the furrow centerline. Acquiring this relative location requires the coordinates of the vehicle as well as those of all the crop stalks in the rows on both sides of the furrow, and acquiring coordinates for every stalk demands an onerous geographical survey of the entire field in advance. Additionally, real-time RTK-GPS localization of a moving vehicle may suffer from satellite occlusion, so the above-mentioned $\pm 2$ cm accuracy is often significantly compromised in practice. Against this background, we propose a set of computer vision algorithms that work with a low-cost camera (50 US dollars) and a LiDAR sensor (1500 US dollars) to detect the relative location of the vehicle in the furrow during the early and late growth seasons, respectively. Our solution package is superior to most computer vision algorithms currently used for PA thanks to improved features such as a machine-learning-enabled dynamic crop-recognition threshold, which adaptively adjusts its value according to environmental changes such as ambient light and crop size. Our in-field tests show that the proposed algorithms approach the accuracy of an ideal RTK-GPS in cross-track detection and exceed it in heading detection. Moreover, our solution package relies on neither satellite communication nor an advance geographical survey. Therefore, our low-complexity, low-cost solution package is a promising localization strategy: it provides the same level of accuracy as an ideal RTK-GPS, yet more consistently and more reliably, because it requires none of the external conditions or survey work that RTK-GPS demands.
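
The machine-learning-enabled dynamic crop-recognition threshold described above is the adaptive core of the camera-based pipeline. The sketch below is a minimal illustration, in Python with OpenCV, of how such a threshold could adapt to ambient light: it combines an excess-green index with an Otsu baseline and a brightness-dependent correction term. The function names, the `brightness_gain` parameter, and the excess-green/Otsu formulation are assumptions for illustration only, not the paper's actual implementation.

```python
# Hypothetical sketch of an adaptive crop-segmentation threshold.
# Assumes an RGB frame from a low-cost camera; all names and parameters
# are illustrative and do not reproduce the paper's implementation.
import cv2
import numpy as np


def excess_green(bgr: np.ndarray) -> np.ndarray:
    """Excess-green index (2G - R - B), a common crop/soil separator."""
    b, g, r = cv2.split(bgr.astype(np.float32) / 255.0)
    exg = 2.0 * g - r - b
    # Rescale to 0-255 so it can be thresholded as an 8-bit image.
    exg = cv2.normalize(exg, None, 0, 255, cv2.NORM_MINMAX)
    return exg.astype(np.uint8)


def adaptive_crop_mask(bgr: np.ndarray, brightness_gain: float = 0.1) -> np.ndarray:
    """Segment crop pixels with a threshold that adapts to ambient light.

    Otsu's method supplies a per-frame baseline threshold, and a small
    correction term shifts it with scene brightness so the mask stays
    stable as lighting changes between frames.
    """
    exg = excess_green(bgr)
    otsu_thresh, _ = cv2.threshold(exg, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

    # Mean scene brightness (0-255) from the grayscale image.
    brightness = cv2.cvtColor(bgr, cv2.COLOR_BGR2GRAY).mean()

    # Shift the Otsu baseline with brightness (gain value is assumed).
    adjusted = np.clip(otsu_thresh + brightness_gain * (brightness - 128.0), 0, 255)

    _, mask = cv2.threshold(exg, adjusted, 255, cv2.THRESH_BINARY)
    return mask


if __name__ == "__main__":
    frame = cv2.imread("furrow_frame.jpg")  # placeholder path
    if frame is not None:
        crop_mask = adaptive_crop_mask(frame)
        cv2.imwrite("crop_mask.png", crop_mask)
```

In practice, the correction gain (and any crop-size-dependent term) would be learned from labeled field imagery rather than hand-tuned, which is where the machine-learning component would enter such a pipeline.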
