Water Level Monitoring—Achievable Accuracy and Precision

Measurement of the depth to ground water is a basic element of all hydrogeologic investigations, providing data for gradient, flow direction, seepage velocity, and aquifer constant calculations. The U.S. Environmental Protection Agency (EPA) Technical Enforcement Guidance Document (TEGD) specifies a measurement accuracy goal of ±0.01 ft for Resource Conservation and Recovery Act (RCRA) facilities. This accuracy goal may be unrealistic, since measurements are limited by their precision, and both accuracy and precision are affected by random and systematic sources of error and uncertainty. Random precision uncertainties include instrument sensitivity, measuring point location, and operator technique. Random accuracy problems include short-term climatic effects (precipitation, temperature, barometric pressure) and instrument calibration. Experience has demonstrated that these accumulated uncertainties range from ±0.02 to ±0.20 ft. Systematic errors are both anthropogenic and site related and include surveying accuracy, well deviation from vertical, instrument deterioration (e.g., cable stretching), and special site conditions (multiphasic liquids, high gas pressures, foaming, and others). These errors may increase inaccuracy or make readings highly variable. The cumulative uncertainty from both random and systematic error sources is ±0.10 to ±0.30 ft for a "pristine" shallow, unconfined aquifer; for difficult installations, or where anthropogenic factors are not well controlled, the accumulated error may be several feet. This paper describes these sources of error and uncertainty and reports on several practical experiments to quantify the uncertainty in water table measurements. The importance of understanding these sources when setting accuracy goals is stressed.
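
As a rough illustration of how such error components accumulate, the sketch below combines hypothetical per-source uncertainty estimates into a single water-level measurement uncertainty. The individual values, and the convention of summing independent random terms in quadrature while adding systematic terms linearly, are assumptions for illustration only and are not taken from the paper.

```python
import math

# Hypothetical per-source uncertainty estimates (ft); the values below are
# illustrative assumptions, not figures reported in the paper.
random_components_ft = {
    "instrument sensitivity": 0.01,
    "measuring point location": 0.02,
    "operator technique": 0.02,
    "short-term climatic effects": 0.05,
}

systematic_components_ft = {
    "survey of measuring point elevation": 0.05,
    "well deviation from vertical": 0.03,
    "cable stretch / instrument deterioration": 0.02,
}

def accumulated_uncertainty(random_terms, systematic_terms):
    """Combine independent random terms in quadrature and add systematic
    terms linearly (a conservative convention, assumed here)."""
    random_part = math.sqrt(sum(u ** 2 for u in random_terms.values()))
    systematic_part = sum(abs(u) for u in systematic_terms.values())
    return random_part, systematic_part, random_part + systematic_part

rand_u, sys_u, total_u = accumulated_uncertainty(
    random_components_ft, systematic_components_ft
)
print(f"random (quadrature): +/- {rand_u:.2f} ft")
print(f"systematic (linear): +/- {sys_u:.2f} ft")
print(f"accumulated:         +/- {total_u:.2f} ft")
```

With these assumed values the accumulated uncertainty works out to roughly ±0.16 ft, which falls within the ±0.10 to ±0.30 ft range cited above for a shallow, unconfined aquifer.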