A simple and testable model for earthquake clustering

Earthquakes are regarded as the realization of a point process modeled by a generalized Poisson distribution. We assume that the Gutenberg-Richter law, with a constant b value, describes the magnitude distribution of all earthquakes in a sample. We model the occurrence rate density of earthquakes in space and time as the sum of two terms, one representing independent, or spontaneous, activity and the other representing activity induced by previous earthquakes. The first term depends only on space and is modeled by a continuous function of the geometrical coordinates, obtained by smoothing the discrete distribution of past instrumental seismicity. The second term also depends on time and is factorized into two parts that depend, respectively, on the spatial distance from each past earthquake (through an isotropic normal distribution) and on the time elapsed since it (through the generalized Omori law). Given the expected rate density, the likelihood of any realization of the process (represented in practice by an earthquake catalog) can be computed straightforwardly. This algorithm was used in two ways: (1) in the learning phase, for the maximum likelihood estimation of the model's few free parameters, and (2) for hypothesis testing. For the learning phase we used the catalog of Italian seismicity (M ≥ 3.5) from May 1976 to December 1998. The model was then tested on a new and independent data set (January to December 1999). For this short time period we demonstrated that, in the Italian region, this time-dependent model performs significantly better than a stationary Poisson model, even when its likelihood is computed excluding the obvious component of mainshock-aftershock interaction.
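
In schematic form, the model described above can be written as follows. The notation here is assumed for illustration rather than taken from the paper: mu(x, y) is the smoothed background rate, K, c, and p are the triggering and generalized Omori parameters, sigma is the width of the isotropic spatial kernel, and beta = b ln 10 is the Gutenberg-Richter exponent above the cutoff magnitude m_0.

```latex
\lambda(t,x,y,m) \;=\; \beta\, e^{-\beta (m - m_0)}
\left[\, \mu(x,y) \;+\; K \sum_{i\,:\,t_i < t}
  \frac{(p-1)\,c^{\,p-1}}{(t - t_i + c)^{p}}\,
  \frac{1}{2\pi\sigma^{2}}
  \exp\!\left(-\frac{\lVert \mathbf{r}-\mathbf{r}_i \rVert^{2}}{2\sigma^{2}}\right)
\right],
\qquad
\log L \;=\; \sum_{j} \log \lambda(t_j,x_j,y_j,m_j)
\;-\; \int_{0}^{T}\!\!\int_{A}\!\int_{m_0}^{\infty} \lambda \; dm\, dA\, dt .
```

The second expression is the standard point-process log likelihood: the sum of log rate densities at the observed events minus the expected number of events over the space-time-magnitude volume.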
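
To illustrate how the likelihood computation and maximum likelihood fit might proceed, here is a minimal Python sketch under simplifying assumptions that are not in the paper: a spatially uniform background rate (the paper instead smooths past instrumental seismicity), a toy catalog, and a closed-form approximation of the expected-count integral. The parameter names (mu, K, c, p, sigma) are illustrative, not the paper's.

```python
import numpy as np
from scipy.optimize import minimize

# Toy catalog: rows of (t [days], x [km], y [km]) above the magnitude cutoff.
catalog = np.array([
    [  1.0,  10.0,  20.0],
    [  1.5,  10.5,  20.2],
    [ 40.0, 100.0,  80.0],
    [ 41.0, 100.3,  79.8],
    [ 42.5,  99.5,  80.5],
    [200.0,  50.0,  50.0],
])
T = 365.0          # length of the observation window [days]
AREA = 400 * 400   # study region [km^2], for the uniform background term

def rate(t, x, y, mu, K, c, p, sigma, cat):
    """Rate density at (t, x, y): uniform background plus triggered term."""
    lam = mu  # spatially uniform background (simplification of the smoothed map)
    past = cat[cat[:, 0] < t]          # only strictly earlier events trigger
    if len(past):
        dt = t - past[:, 0]
        r2 = (x - past[:, 1])**2 + (y - past[:, 2])**2
        # Generalized Omori law, normalized to integrate to 1 over dt > 0:
        omori = (p - 1) * c**(p - 1) * (dt + c)**(-p)
        # Isotropic normal spatial kernel, normalized over the plane:
        gauss = np.exp(-r2 / (2 * sigma**2)) / (2 * np.pi * sigma**2)
        lam += K * np.sum(omori * gauss)
    return lam

def neg_log_likelihood(params, cat):
    mu, K, c, p, sigma = params
    if min(mu, K, c, sigma) <= 0 or p <= 1:
        return np.inf  # reject non-physical parameters
    # Point-process log likelihood: sum of log rates at the observed events
    # minus the expected number of events over the space-time volume.
    log_sum = sum(np.log(rate(t, x, y, mu, K, c, p, sigma, cat))
                  for t, x, y in cat)
    # Each normalized trigger kernel integrates to ~K over all space and time,
    # so the expected count is approximately mu*AREA*T + K*N (edge effects ignored).
    expected = mu * AREA * T + K * len(cat)
    return -(log_sum - expected)

res = minimize(neg_log_likelihood, x0=[1e-6, 0.5, 0.01, 1.1, 5.0],
               args=(catalog,), method="Nelder-Mead")
print("MLE (mu, K, c, p, sigma):", res.x)
```

Once the parameters are fixed on the learning catalog, the same log likelihood evaluated on an independent test catalog gives the forecast score, and the difference from a stationary Poisson model's score quantifies the performance gain claimed above.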
