Time vs. Money: A Quantitative Evaluation of Monitoring Frequency vs. Monitoring Duration

The National Research Council has estimated that over 126,000 contaminated groundwater sites are unlikely to achieve low µg/L cleanup goals in the foreseeable future. At these sites, cost-effective long-term monitoring schemes are needed to track long-term changes in contaminant concentrations. Current monitoring optimization schemes rely on site-specific evaluations to optimize groundwater monitoring frequency. However, when linear regression is used to estimate the long-term zero-order or first-order contaminant attenuation rate, the effect of monitoring frequency and monitoring duration on the accuracy of, and confidence in, the estimated attenuation rate is not site-specific: the standard error of the fitted slope depends on the spacing and number of monitoring events in the same way at every site. For a fixed number of monitoring events, doubling the time between events (e.g., changing from quarterly to semi-annual monitoring) will double the accuracy (i.e., halve the standard error) of the estimated attenuation rate. For a fixed monitoring frequency (e.g., semi-annual monitoring), increasing the number of monitoring events by 60% will double the accuracy of the estimated attenuation rate. Combining these two factors, doubling the time between monitoring events (e.g., quarterly to semi-annual monitoring) while decreasing the total number of monitoring events by 38% results in no change in the accuracy of the estimated attenuation rate; however, the time required to collect this dataset increases by 25%. Recognizing that the trade-off between monitoring frequency and monitoring duration is not site-specific should simplify the process of optimizing groundwater monitoring frequency at contaminated groundwater sites.
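
These scaling relationships follow directly from the standard error of an ordinary least-squares slope fit to evenly spaced samples. The minimal Python sketch below checks the three claims numerically under stated assumptions: evenly spaced sampling times and independent, identically distributed residuals. The slope_se helper and the 10-year quarterly baseline are illustrative choices for this sketch, not values taken from the study.

    import math

    def slope_se(n_events: int, dt: float, sigma: float = 1.0) -> float:
        """Standard error of an OLS slope for n evenly spaced samples.

        For sampling times t_i = 0, dt, 2*dt, ..., the sum of squared
        deviations from the mean time is dt**2 * n * (n**2 - 1) / 12, so
        SE(slope) = sigma / sqrt(dt**2 * n * (n**2 - 1) / 12).  Here sigma
        is the (site-specific) residual standard deviation; it cancels out
        of every ratio computed below.
        """
        sxx = dt**2 * n_events * (n_events**2 - 1) / 12.0
        return sigma / math.sqrt(sxx)

    # Hypothetical baseline: 10 years of quarterly monitoring
    # (dt = 0.25 yr, 40 events).
    base = slope_se(40, 0.25)

    # Fixed number of events, doubled spacing: SE halves (accuracy doubles).
    print(base / slope_se(40, 0.50))              # 2.00

    # Fixed spacing, ~60% more events: SE roughly halves, because SE
    # scales approximately as n**-1.5 and 2**(2/3) is about 1.59.
    print(base / slope_se(64, 0.25))              # ~2.02

    # Doubled spacing with ~38% fewer events: SE essentially unchanged...
    print(base / slope_se(25, 0.50))              # ~0.99

    # ...but the elapsed monitoring duration grows by roughly 25%.
    print((25 - 1) * 0.50 / ((40 - 1) * 0.25))    # ~1.23

Because the residual standard deviation cancels from each ratio, the printed values hold regardless of site-specific variability, which is the basis for the claim that the frequency-duration trade-off is not site-specific.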