This paper describes a vision-guided tractor for crop row management, suitable for tasks such as weeding and the application of pesticide and herbicide. The vehicle straddles the crop row and captures an image from which the heading and offset errors are obtained to correct the steering angle. The vision algorithm first separates the furrow and crop areas according to their color differences using an HST color transformation; the least squares method is then applied to find the boundary between the two areas. From these boundary lines, a perspective view transformation is used to determine the heading and offset errors relative to the desired line. The steering command is computed from a combination of the offset error, the heading error, and the current steering angle. Results show that the vehicle follows the desired line along the crop row to within an acceptable tolerance.

[Keywords] image analysis, autonomous navigation, perspective view transformation, row crop tracking
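
The following is a minimal, illustrative sketch of the processing chain summarized above, not the paper's implementation: a simple green-dominance threshold stands in for the HST color transformation, a flat-ground pixel-to-metre scaling stands in for the full perspective view transformation, and every threshold, gain, and function name is an assumption introduced here for illustration.

```python
import numpy as np

def segment_crop(rgb):
    """Label pixels as crop (True) or furrow (False).

    Stand-in for the paper's HST color transformation: a simple
    green-dominance test with a placeholder threshold.
    """
    r = rgb[..., 0].astype(float)
    g = rgb[..., 1].astype(float)
    b = rgb[..., 2].astype(float)
    return (2.0 * g - r - b) > 20.0  # threshold is an assumed value

def fit_boundary(mask):
    """Least-squares fit of the crop/furrow boundary, column = a*row + b.

    For each image row, the left-most crop pixel is taken as a boundary
    point; a straight line is then fitted through those points.
    """
    rows, cols = [], []
    for y in range(mask.shape[0]):
        xs = np.flatnonzero(mask[y])
        if xs.size:
            rows.append(y)
            cols.append(xs[0])
    a, b = np.polyfit(rows, cols, 1)
    return a, b

def heading_and_offset(a, b, image_height, image_width, metres_per_pixel):
    """Convert the fitted image-plane line into heading and offset errors.

    The paper recovers these through a perspective view transformation;
    this sketch assumes a roughly nadir view so a flat scaling suffices.
    """
    heading_error = np.arctan(a)                       # rad, w.r.t. row direction
    col_at_bottom = a * (image_height - 1) + b         # boundary at the image bottom
    offset_error = (col_at_bottom - image_width / 2.0) * metres_per_pixel
    return heading_error, offset_error

def steering_command(offset, heading, current_steer,
                     k_off=0.8, k_head=1.2, k_steer=0.3):
    """Combine offset error, heading error and the current steering angle
    into a new steering angle. Gains are illustrative, not from the paper."""
    return -(k_off * offset + k_head * heading) + k_steer * current_steer
```

In use, each captured frame would pass through `segment_crop`, `fit_boundary`, and `heading_and_offset` in turn, with `steering_command` closing the loop; the sign conventions and gain values would have to be tuned to the actual vehicle geometry.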