Vision-Aided Outdoor Navigation of an Autonomous Horticultural Vehicle

An autonomous outdoor vehicle has been developed at the Silsoe Research Institute as a testbed to investigate precise crop protection. The vehicle is able to navigate along rows of crop by using a Kalman filter to fuse information from proprioceptive sensing (odometry and inertial sensors) with data from an on-board computer vision system to generate estimates of its position and orientation. This paper describes a novel implementation of a previously proposed vision algorithm which uses a model of the crop planting pattern to extract vehicle position and orientation information from observations of many plants in each image. It is demonstrated that by implementing the vision system to compress the multiple plant observations into a single "pseudo observation" of vehicle position and orientation, it is possible to separate the vision system from the main body of the vehicle navigation Kalman filter, thus simplifying the task of fusing data from different sources. The algorithm is also used to segment the image sequences into areas of crop and weed, thus providing potential for targeting treatment. The implementation is tested on the vehicle, and results are shown from trials both in an indoor test area and outdoors on a field of real crop. Segmentation results are given for images captured from the vehicle.
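The fusion step described above can be sketched as a standard Kalman measurement update. This is a minimal illustration, not the paper's implementation: it assumes a 3-state vehicle pose [x, y, heading] and assumes the vision system has already compressed its many per-plant observations into one pose "pseudo observation" `z` with covariance `R`; the function name and state layout are hypothetical.

```python
import numpy as np

def kalman_pose_update(x_prior, P_prior, z, R):
    """Kalman measurement update with a direct pose pseudo observation.

    x_prior : (3,) prior pose estimate [x, y, heading]
    P_prior : (3, 3) prior covariance
    z       : (3,) pseudo observation of pose from the vision system
    R       : (3, 3) covariance of the pseudo observation
    """
    H = np.eye(3)                          # pose is observed directly
    S = H @ P_prior @ H.T + R              # innovation covariance
    K = P_prior @ H.T @ np.linalg.inv(S)   # Kalman gain
    x_post = x_prior + K @ (z - H @ x_prior)
    P_post = (np.eye(3) - K @ H) @ P_prior
    return x_post, P_post

# Example: prior and observation equally uncertain, so the posterior
# pose lands midway between them.
x_post, P_post = kalman_pose_update(
    np.zeros(3), 0.5 * np.eye(3),
    np.array([1.0, 0.0, 0.0]), 0.5 * np.eye(3))
```

Because the pseudo observation enters the filter like any other sensor reading, the vision system stays decoupled from the main navigation filter, which is the separation the paper argues for.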
