DETECTION OF SOIL MOISTURE BY REMOTE SURVEILLANCE

Three general regions of the electromagnetic spectrum are being used in feasibility studies of remote sensing of soil moisture: the short-wave region, the thermal or long-wave region, and the radar or microwave region. Sketches illustrating each measurement principle are given at the end of this section.

In the short-wave region, simple solarimeters have been used to correlate albedo values (the ratio of reflected to incoming solar radiation, which depends on the amount of water in the surface soil) with gravimetrically measured water-content values of soil layers. Albedo values can also be used to delineate the three classical stages of soil drying.

In the thermal region, studies have suggested that a thermal-inertia concept could be used to monitor soil moisture: soil temperatures are measured at their daily maximum and minimum, and the difference between them is related to water content, because wetter soil has higher thermal inertia and therefore a smaller diurnal temperature swing.

The governing principle of microwave techniques derives from the fact that the dielectric constant of water at microwave frequencies is quite large (about 80), whereas that of dry soil is typically less than 5. A wide range of wavelengths has been used in feasibility studies of soil-moisture detection by microwave techniques, from fractions of a centimeter to tens of centimeters, and the depth of the soil layer monitored has varied from a few millimeters to a few centimeters.

Each technique has limitations. Microwave techniques suffer from the need to know surface temperatures in order to resolve moisture-induced emittance changes, and albedo techniques always require specific knowledge of the soil type viewed. Cloud cover (encountered in satellite-based programs), salinity, and the presence of vegetation also create problems for these techniques.

Among the primary benefits of remote detection of soil moisture are the ability to predict crop yields, pest outbreaks, and plant disease, and improved management of crops and rangelands.
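
The short-wave approach reduces to a calibration problem: albedo falls as the surface soil wets, so paired solarimeter and gravimetric measurements can be fit to a simple regression. The following Python sketch illustrates the idea; all readings and the fitted relation are hypothetical, not measured data.

    # Sketch: correlating solarimeter-derived albedo with gravimetric
    # soil water content. All numbers are hypothetical illustrations.
    import numpy as np

    # Paired solarimeter readings (W/m^2): reflected and incoming
    # shortwave radiation over the same plots.
    reflected = np.array([210.0, 185.0, 150.0, 120.0, 95.0])
    incoming  = np.array([800.0, 810.0, 795.0, 805.0, 790.0])

    # Albedo is the ratio of reflected to incoming solar radiation;
    # wet soil is darker, so albedo falls as surface moisture rises.
    albedo = reflected / incoming

    # Gravimetric water content (g water per g dry soil) of the surface
    # layer, from oven-dried samples taken at the same times.
    water_content = np.array([0.05, 0.10, 0.18, 0.25, 0.32])

    # Fit a linear calibration: water content as a function of albedo.
    slope, intercept = np.polyfit(albedo, water_content, 1)
    print(f"water_content ~= {slope:.3f} * albedo + {intercept:.3f}")

    # Apply the calibration to a new albedo observation.
    new_albedo = 0.17
    print(f"estimated water content: {slope * new_albedo + intercept:.3f}")

Such a calibration is soil-specific, which is why the text notes that albedo techniques require knowledge of the soil type viewed.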
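The thermal-inertia concept can be sketched the same way: compute the diurnal temperature amplitude at each site and map it to water content through an empirical relation. The calibration constants below are invented for illustration only.

    # Sketch of the thermal-inertia idea: wetter soil has higher thermal
    # inertia, so its day-night temperature swing is smaller. The
    # calibration constants are hypothetical.
    import numpy as np

    # One day of surface-temperature readings (deg C) per site, e.g.
    # from a thermal radiometer; rows are sites, columns are times.
    site_temps = np.array([
        [12.0, 18.0, 31.0, 26.0, 14.0],   # dry site: large swing
        [14.0, 17.0, 24.0, 21.0, 15.0],   # moist site: damped swing
    ])

    # Diurnal amplitude: daily maximum minus daily minimum.
    delta_t = site_temps.max(axis=1) - site_temps.min(axis=1)

    # Hypothetical linear calibration, theta = a - b * delta_T
    # (wetter soil implies a smaller temperature swing).
    a, b = 0.45, 0.015
    theta = a - b * delta_t
    for dt, th in zip(delta_t, theta):
        print(f"delta_T = {dt:4.1f} C  ->  estimated theta = {th:.2f}")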
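Finally, the microwave principle can be made concrete with smooth-surface Fresnel emissivity at normal incidence and a crude linear dielectric mixing model (a deliberate simplification; real retrievals use more elaborate mixing models and viewing geometries). The sketch also shows why surface temperature must be known: the radiometer measures brightness temperature, the product of emissivity and physical temperature.

    # Sketch of microwave sensitivity to soil moisture: the dielectric
    # constant of water (~80) dwarfs that of dry soil (<5), so wetter
    # soil reflects more and emits less. Linear mixing is a
    # simplification used here for illustration only.
    import numpy as np

    EPS_DRY_SOIL = 4.0    # typical dry-soil dielectric constant
    EPS_WATER = 80.0      # water at microwave frequencies

    def soil_dielectric(theta_v):
        """Crude linear mix of dry-soil and water dielectric constants;
        theta_v is volumetric water content (roughly 0 to 0.4)."""
        return (1.0 - theta_v) * EPS_DRY_SOIL + theta_v * EPS_WATER

    def emissivity(eps):
        """Smooth-surface, normal-incidence Fresnel emissivity."""
        r = abs((np.sqrt(eps) - 1.0) / (np.sqrt(eps) + 1.0)) ** 2
        return 1.0 - r

    # Brightness temperature is emissivity times physical surface
    # temperature, so the surface temperature must be known to separate
    # moisture effects from thermal effects.
    t_surface = 300.0  # K, assumed known
    for theta_v in (0.05, 0.15, 0.25, 0.35):
        e = emissivity(soil_dielectric(theta_v))
        print(f"theta_v={theta_v:.2f}  emissivity={e:.3f}  "
              f"T_B={e * t_surface:.1f} K")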