A study of the NBS time scale algorithm

The various aspects of the NBS (US National Bureau of Standards) time scale algorithm are studied theoretically, and the algorithm is compared with a Kalman filter to address questions of optimality. It is shown that since the time of a clock is not measured, only the time difference between clocks, a time scale should not attempt to optimize time accuracy, which has no meaning; time uniformity and frequency stability, however, can be optimized. The authors further study the practice of monitoring the clocks in a time scale for frequency steps and removing a clock from the scale, when a step has been detected, until its new frequency is learned. They show that the effect of this practice is to translate the random-walk behavior that the frequency steps produce in the individual clocks into flicker noise for the ensemble. The implication is that careful monitoring of the scale can significantly improve its long-term performance.
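The unobservability argument can be illustrated with a minimal sketch. This is not the NBS algorithm itself, only a hypothetical equal-weight ensemble built purely from pairwise time differences: the simulated absolute offsets, weights, and noise levels are all illustrative assumptions. It shows that the ensemble scale and each clock's reading against it are computable from differences alone, and that adding a common offset to every clock changes no observable, so "time accuracy" of the ensemble is indeed undefined.

```python
import numpy as np

rng = np.random.default_rng(0)

n_clocks, n_steps = 5, 1000
# True clock phase offsets vs. ideal time (white FM integrates to
# random-walk phase). These are NOT observable, only differences are.
rates = rng.normal(0.0, 1e-13, n_clocks)            # constant frequency offsets
noise = rng.normal(0.0, 1e-12, (n_steps, n_clocks))  # white frequency noise
x = np.cumsum(rates + noise, axis=0)                 # phase offsets x_i(t)

# Observables: time differences measured against clock 0 only.
d = x - x[:, [0]]                                    # d_i(t) = x_i(t) - x_0(t)

# Hypothetical equal-weight ensemble: the scale's offset from clock 0
# is the weighted mean of the measured differences.
w = np.full(n_clocks, 1.0 / n_clocks)
scale_vs_clock0 = d @ w                              # TA(t) - x_0(t)

# Each clock read against the ensemble, from differences alone:
clock_vs_scale = d - scale_vs_clock0[:, None]        # x_i(t) - TA(t)

# Shifting every clock by a common constant leaves every observable
# unchanged, so absolute time accuracy of the scale has no meaning.
c = 3.14e-6
d_shifted = (x + c) - (x + c)[:, [0]]
assert np.allclose(d, d_shifted)
```

By construction the weighted sum of the clock-versus-scale readings is zero at every step, which is the sense in which the ensemble is a self-consistent uniform scale even though its absolute offset is unknowable.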