Testing different color spaces based on hue for the environmentally adaptive segmentation algorithm (EASA)

Research was conducted on crop segmentation with the aim of achieving real-time processing in real farm fields. The environmentally adaptive segmentation algorithm (EASA) for crop recognition was analyzed in depth to exploit its strengths and to identify potential improvements. The EASA was modified to test color spaces other than rgb: hue-saturation (HS) and hue (H) were proposed as new color spaces with the objective of improving algorithm performance by reducing computation time. In these modified algorithms, the clustering process and the Bayesian classifier require only two and one variables, respectively, instead of the three variables needed in the rgb algorithm. Segmentation effectiveness was also analyzed to evaluate the trade-off between increased errors and decreased computation time. An image bank containing sunflower crop images taken in real farm fields was used to develop and evaluate the algorithms. As expected, the segmentation results for the HS and H color spaces differed slightly from those obtained by the EASA on rgb. Specifically, the average rates of false negatives for the EASA on HS and H were 24% and 20%, respectively, while the rates of false positives were 22% and 26%, respectively, slightly worse than for the EASA on rgb. This is because the HS and H color spaces, like the rgb space normalized from RGB (red, green, and blue), do not include an intensity component. As a result, variations in illumination, typical of real farm fields, did not significantly affect the segmentation results. The time spent on the global segmentation process was 25 and 46 times lower for HS and H, respectively, than for the original EASA. Therefore, compared to the EASA on rgb, the EASA on HS and H color spaces achieved a significant reduction in processing time without a significant loss in effectiveness.
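The dimensionality reduction described above, keeping only the hue (and optionally saturation) components so that illumination changes do not shift the feature values, can be sketched as follows. This is a minimal illustration using Python's standard `colorsys` conversion, not the authors' implementation; the function name `rgb_to_hs` is chosen here for clarity.

```python
import colorsys

def rgb_to_hs(r, g, b):
    """Reduce an 8-bit RGB pixel to (hue, saturation) features in [0, 1].

    The value (intensity) component returned by the HSV conversion is
    discarded, which is what makes the features largely invariant to
    illumination changes in outdoor field images.
    """
    h, s, _v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
    return h, s
```

For example, a bright green pixel (0, 255, 0) and a darker green pixel (0, 128, 0) map to the same hue of 1/3, so a clustering step or Bayesian classifier operating on hue alone sees them as the same class despite the difference in brightness.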
