Faster Fourier analysis
In seismology, Fourier analyses of long time series are used chiefly in connection with studies of free oscillations and earth tides. In the past, these analyses have required large amounts of time on large computers, and many seismologists who had important data, but did not have access to large computers, were unable to analyze their data. In this letter we point out a recent algorithm [Cooley and Tukey, 1965] which considerably reduces the time required for Fourier analysis.
As an example, we have analyzed a record obtained on a strain seismograph after the large Rat Island earthquake of February 4, 1965 (magnitude 7¾). The record consists of 2048 equally spaced data points extending over a period of approximately 13½ hours. The data were analyzed not only with the ‘fast Fourier transform’ but also with a Fourier analysis program prepared by one of the authors (A.A.N.). This latter program purposely sacrifices speed for accuracy, so it is not surprising that the observed difference in analysis time (2.4 sec for the fast Fourier transform versus 1567.8 sec for the conventional program) is greater than predicted. In general, the Cooley-Tukey algorithm should be faster by a factor of roughly n/log2 n, where n is the number of data points.
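The n/log2 n speedup comes from the recursive halving at the heart of the Cooley-Tukey algorithm: a transform of length n is split into two transforms of length n/2 on the even- and odd-indexed samples. As a minimal sketch (not the program used by the authors), here is a radix-2 FFT in Python alongside the naive O(n²) discrete Fourier transform it replaces:

```python
import cmath

def dft(x):
    # Naive discrete Fourier transform: O(n^2) complex multiplications.
    n = len(x)
    return [sum(x[j] * cmath.exp(-2j * cmath.pi * j * k / n) for j in range(n))
            for k in range(n)]

def fft(x):
    # Radix-2 Cooley-Tukey FFT: O(n log n); n must be a power of two,
    # as with the 2048-point record analyzed in the letter.
    n = len(x)
    if n == 1:
        return list(x)
    even = fft(x[0::2])   # transform of even-indexed samples
    odd = fft(x[1::2])    # transform of odd-indexed samples
    # Combine the half-length transforms using the twiddle factors e^{-2πik/n}.
    tw = [cmath.exp(-2j * cmath.pi * k / n) * odd[k] for k in range(n // 2)]
    return ([even[k] + tw[k] for k in range(n // 2)] +
            [even[k] - tw[k] for k in range(n // 2)])
```

For n = 2048 the predicted ratio is 2048/log2(2048) ≈ 186; the larger ratio reported above (about 650) reflects the extra accuracy-oriented work done by the conventional program.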
[1] J. W. Cooley and J. W. Tukey, An algorithm for the machine calculation of complex Fourier series, 1965.