Temporal Variations in Global Seismic Station Ambient Noise Power Levels

Recent concerns about time-dependent response changes in broadband seismometers have motivated the need for methods to monitor sensor health at Global Seismographic Network (GSN) stations. We present two new methods for monitoring temporal changes in data quality and in instrument response transfer functions; both compare power levels against different baseline values and are therefore independent of Earth seismic velocity and attenuation models. Our methods can resolve changes in both horizontal and vertical components over a broad range of periods (∼0.05 to 1,000 seconds) in near real time. In this report, we compare our methods with existing techniques and demonstrate how to resolve instrument response changes in long-period data (>100 seconds) as well as in the microseism bands (5 to 20 seconds).

High-quality broadband data recorded by the GSN are fundamental to characterizing a wide range of Earth science issues, including the size and rupture of large earthquakes (e.g., Tsai et al. 2005), imaging the interior of the Earth (e.g., Van der Hilst et al. 1997), tracking global climate variation (Aster et al. 2008), and monitoring calving glaciers (Ekström et al. 2003, 2006a). Recent studies based on theoretical Earth models (Ekström et al. 2006b; Davis and Berger 2007) suggest that broadband seismometer gain levels can vary with time; this has also been confirmed experimentally for the STS-1 sensor (Yuki and Ishihara 2002). It has therefore become necessary to systematically check for temporal changes in amplitude at GSN stations. Many of these changes are frequency dependent and not a priori predictable (Ekström et al. 2006b), so robust methods that can be applied to a large number of stations across a broad range of frequency bands are needed. Seismic data from long-running GSN stations allow for good resolution of a broad range of periods for nearly two decades (Figure …
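To make the baseline-comparison idea concrete, a minimal sketch follows. This is not the authors' implementation; it assumes that daily band-limited noise powers (e.g., from power spectral density estimates in the 5 to 20 second microseism band) have already been computed, and it simply flags days whose power departs from a trailing-median baseline by more than a tolerance in decibels. The function names, window length, and tolerance are illustrative choices, not values from the study.

```python
# Hypothetical sketch: flag days whose band-limited noise power departs
# from a long-running baseline. Daily powers would in practice come from
# PSD estimates of a station's records in a fixed period band.
import math
from statistics import median

def db(power_ratio):
    """Convert a linear power ratio to decibels."""
    return 10.0 * math.log10(power_ratio)

def flag_anomalous_days(daily_power, baseline_days=90, tol_db=3.0):
    """Return indices of days whose power deviates from the trailing
    median baseline by more than tol_db decibels."""
    flagged = []
    for i, p in enumerate(daily_power):
        window = daily_power[max(0, i - baseline_days):i]
        if len(window) < baseline_days // 2:
            continue  # not enough history for a stable baseline yet
        baseline = median(window)
        if abs(db(p / baseline)) > tol_db:
            flagged.append(i)
    return flagged

# Example: stable noise power with a step change, as might accompany
# an undocumented gain change (~4 dB step on day 120).
powers = [1.0] * 120 + [2.5] * 30
print(flag_anomalous_days(powers))
```

Using a median rather than a mean for the baseline keeps the comparison robust to transient power excursions such as large earthquakes, which should not be mistaken for instrument response changes.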