LE2i, UMR 5158 uB-CNRS, 12 rue de la Fonderie, F-71200 Le Creusot, France
c.gee@enesad.fr

Abstract

To measure and compare the effectiveness of algorithms that estimate weed infestation, we propose a new method based on modelling photographs taken by a virtual camera placed in a virtual field. As an example, we tested a crop/inter-row weed discrimination algorithm based on the detection of crop rows (Hough transform) and on the discrimination of plant areas by a region-based segmentation analysis. The test compares the weed infestation density detected by this algorithm with the true, known density. The evaluation was completed with a crop/weed pixel classification, which demonstrates that a theoretical accuracy better than 90% is achievable on simulated images. An extension of the method to real images is discussed.

Keywords: simulated images, spatial statistics, weed infestation, Hough transform, vanishing point

Introduction

For site-specific weed management, many online systems using different optical sensors have been developed to spray only the weed-infested areas (Felton & McCloy, 1992; Felton, 1995; Tian et al., 1999). In this context, an efficient image processing procedure for crop/weed discrimination is required in order to quantify weed infestation rates. A manual evaluation of the weed infestation rate (WIR), however, is a tricky task: both manual segmentation of the image and manual counting of weed plants in the field take a very long time, so in practice an evaluation of a method's accuracy can only be approximate, based on statistical tests over a few ground samples. Very few articles have reported on the robustness of crop/weed discrimination algorithms actually validated on real images with natural weed patterns, taken by a camera under natural outdoor lighting conditions (Andreasen et al., 1997; Tang et al., 1999; Onyango & Marchant, 2005).
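The crop-row detection step mentioned above relies on the standard Hough transform: each plant pixel votes for all (theta, rho) line parameters passing through it, and accumulator peaks reveal the rows. The following is a minimal sketch of that voting scheme, not the paper's implementation; the synthetic point set and all parameter values are illustrative assumptions.

```python
import math

def hough_lines(points, n_theta=180, rho_res=1.0, size=100):
    """Vote each foreground pixel into a (theta, rho) accumulator;
    peaks correspond to straight lines such as crop rows."""
    diag = math.hypot(size, size)          # largest possible |rho|
    acc = {}                               # sparse accumulator
    for x, y in points:
        for t in range(n_theta):
            theta = math.pi * t / n_theta
            rho = x * math.cos(theta) + y * math.sin(theta)
            r = int(round((rho + diag) / rho_res))   # shift rho to a non-negative bin
            acc[(t, r)] = acc.get((t, r), 0) + 1
    return acc, diag

# Synthetic vertical "crop row" at x = 20, plus a few scattered "weed" pixels.
row_pixels = [(20, y) for y in range(0, 100, 2)]
weed_pixels = [(5, 33), (47, 61), (72, 12)]
acc, diag = hough_lines(row_pixels + weed_pixels)

# The accumulator peak recovers the row: theta ~ 0 (vertical line), rho ~ 20.
(t_peak, r_peak), votes = max(acc.items(), key=lambda kv: kv[1])
theta = math.pi * t_peak / 180
rho = r_peak - diag
print(f"peak: theta={theta:.3f} rad, rho={rho:.1f}, votes={votes}")
```

Isolated weed pixels contribute only scattered votes, which is precisely why aligned crop pixels dominate the accumulator and why, conversely, weeds growing on the row itself are the hard case for inter-row methods.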
Some algorithms developed in our lab have been tested on real data under in-field conditions, but assessing and comparing them proved difficult and uncertain (Vioix et al., 2002; Bossu et al., 2006). In this context, we therefore developed a new, controlled and original method to test and validate the effectiveness of any algorithm that estimates the weed infestation rate. We propose to model photographs taken by a virtual camera placed in a virtual crop field with different, precisely known weed infestation rates. Indeed, a simulated image generated under various conditions, with every parameter known (weed and crop density and localization), is a perfect tool for evaluating the accuracy of any crop/weed discrimination algorithm.
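The evaluation principle described above can be sketched in a few lines: generate a virtual field whose true WIR is known by construction, classify every plant with a simple inter-row rule, and compare the estimated WIR and pixel accuracy against the ground truth. The row layout, plant counts, and tolerance below are hypothetical values for illustration, not the paper's simulation parameters.

```python
import random

random.seed(42)
ROWS = [10, 30, 50, 70, 90]   # assumed crop-row x-positions in a 100 x 100 field
TOL = 2.0                     # assumed half-width of a crop row

# Virtual field: crop plants sit exactly on the rows, weeds fall anywhere.
crop = [(x, random.uniform(0, 100)) for x in ROWS for _ in range(40)]
weeds = [(random.uniform(0, 100), random.uniform(0, 100)) for _ in range(60)]

def classify(pt):
    """Inter-row rule: label 'crop' if the point lies within TOL of a known row."""
    return 'crop' if min(abs(pt[0] - r) for r in ROWS) <= TOL else 'weed'

n_total = len(crop) + len(weeds)
true_wir = len(weeds) / n_total                      # known by construction
est_wir = sum(classify(p) == 'weed' for p in crop + weeds) / n_total
correct = sum(classify(p) == 'crop' for p in crop) + \
          sum(classify(p) == 'weed' for p in weeds)
print(f"true WIR={true_wir:.2f}  estimated WIR={est_wir:.2f}  "
      f"accuracy={correct / n_total:.2%}")
```

Because ground truth is exact, the comparison isolates the algorithm's systematic error: here every misclassification is a weed that happens to fall on a row, so the estimated WIR can only undershoot the true one, a bias that real-field validation could never quantify so cleanly.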
References

[1] J. F. Reid, et al. Texture-Based Weed Classification Using Gabor Wavelets and Neural Network for Real-time Selective Herbicide Applications, 2000.
[2] François Goreaud, et al. Apports de l'analyse de la structure spatiale en forêt tempérée à l'étude et la modélisation des peuplements complexes [Contributions of spatial structure analysis in temperate forests to the study and modelling of complex stands], 2000.
[3] D. J. Mulla, et al. Weed classification based on spectral properties, 2004.
[4] Richard O. Duda, et al. Use of the Hough transformation to detect lines and curves in pictures, Communications of the ACM, 1972.
[5] John F. Reid, et al. Development of a precision sprayer for site-specific weed management, 1999.
[6] B. Ripley. Computer generation of random variables: a tutorial, 1983.
[7] R. B. Brown, et al. Prescription Maps for Spatially Variable Herbicide Application in No-till Corn, 1995.
[8] Frédéric Truchetet, et al. Spatial and Spectral Methods for Weed Detection and Localization, EURASIP Journal on Advances in Signal Processing, 2002.
[9] Christine Onyango, et al. Image Processing Performance Assessment Using Crop Weed Competition Models, Precision Agriculture, 2005.
[10] Frederic Truchetet, et al. Development of a machine vision system for a real-time precision sprayer, International Conference on Quality Control by Artificial Vision, 2007.
[11] Mats Rudemo, et al. Assessment of weed density at an early stage by use of image processing, 1997.