A Novel Definition and Measurement Method of Group Delay and Its Application

The time synchronization accuracy demanded in satellite navigation systems, deep space exploration, and spread spectrum radar has reached the sub-nanosecond range. Phase distortion introduced by RF cables, amplifiers, and mixers alters the spread spectrum signal delay and is a dominant error source in time synchronization. To measure and calibrate the influence of this phase distortion, a high-accuracy group delay measurement method is needed, together with a study of its relationship to the signal delay. The traditional definition and measurement formula of group delay are based on the derivative of phase with respect to frequency, which has inherent shortcomings: inconsistency between resolution and accuracy, inability to describe the global phase characteristics over the signal bandwidth, and difficulty in establishing a relationship with the spread spectrum signal delay. To overcome these shortcomings, a novel group delay definition and measurement formula based on Taylor series expansion is proposed and applied to group delay measurement of the RF cable of a Global Positioning System timing receiver. The experiments showed that the accuracy of the new group delay measurement was better than 0.01 ns and that the result was highly consistent with the signal delay; the maximum bias between the signal delay and the zeroth-order group delay was less than 0.3 ns.
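For reference, the derivative-based definition discussed above is the classical one; the equations below recall it and sketch, as an illustrative assumption, how a Taylor expansion of the measured phase response can yield delay terms order by order. The symbol \tau_0 for the zeroth-order group delay is an assumed notation, and the paper's exact formulation is not reproduced in the abstract.

% A minimal sketch, assuming the phase response \varphi(\omega) is measured over the
% signal bandwidth and expanded about the carrier frequency \omega_0.
\[
  \tau_g(\omega) \;=\; -\,\frac{d\varphi(\omega)}{d\omega}
  \qquad \text{(classical derivative-based group delay)}
\]
\[
  \varphi(\omega) \;\approx\; \varphi(\omega_0) + \varphi'(\omega_0)\,(\omega-\omega_0)
  + \tfrac{1}{2}\,\varphi''(\omega_0)\,(\omega-\omega_0)^2 + \cdots
\]
\[
  \tau_0 \;=\; -\,\varphi'(\omega_0)
  \qquad \text{(zeroth-order group delay; notation assumed for illustration)}
\]

In this reading, the higher-order Taylor coefficients would capture the global phase behavior across the signal bandwidth, whereas the classical definition reflects only the local slope of the phase at each frequency.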