A theoretical study is made of various aspects of the algorithm, comparing the NBS (US National Bureau of Standards) algorithm with a Kalman filter to address questions of optimality. It is shown that because only time differences between clocks are measured, never the time of a clock itself, a time scale should not attempt to optimize time accuracy, since that has no meaning; time uniformity and frequency stability, however, can be optimized. The authors further study the practice of monitoring the clocks in a time scale for frequency steps and removing a clock from the scale when a step is detected, until its new frequency is learned. They show that the effect of this practice on the algorithm is to translate random-walk behavior in the individual clocks, caused by their frequency steps, into flicker noise for the ensemble. The implication is that careful monitoring of the scale can significantly improve its long-term performance.
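To illustrate the point that a time scale is built only from measured clock differences, the following is a minimal sketch of a weighted clock-ensemble computation. It is not the NBS algorithm or the Kalman filter discussed in the paper; the function name, the choice of reference clock, and the weighting scheme are illustrative assumptions.

```python
# Minimal sketch of a weighted clock-ensemble time scale (illustrative only;
# NOT the NBS algorithm or the paper's Kalman filter). It uses only clock
# *differences* against one arbitrarily chosen reference clock, reflecting
# the point that the absolute time of a clock is never measured.

import numpy as np

def ensemble_offsets(diffs_vs_ref, weights):
    """Estimate each clock's offset from an implicit ensemble time.

    diffs_vs_ref : (n_clocks,) array, x_i = t_i - t_ref in seconds;
                   the reference clock itself has difference 0.
    weights      : (n_clocks,) array of non-negative weights summing to 1
                   (e.g. inverse prediction-error variances).
    Returns x_i - x_ens, i.e. each clock measured against the ensemble.
    """
    x_ens = np.dot(weights, diffs_vs_ref)   # ensemble time relative to the reference clock
    return diffs_vs_ref - x_ens             # clocks relative to the ensemble

# Example: three clocks, the first one used as the measurement reference.
diffs = np.array([0.0, 12e-9, -7e-9])       # measured time differences, seconds
w = np.array([0.5, 0.3, 0.2])               # hypothetical weights
print(ensemble_offsets(diffs, w))
```

Because every quantity above is a difference, adding a constant to all clock readings leaves the output unchanged, which is why the ensemble can be optimized for uniformity and frequency stability but not for absolute time accuracy.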