Phase dispersion characteristics during fade in a microwave line-of-sight radio channel

Measurements of phase and amplitude dispersion over a 20-MHz band have been made on a 42-km, 6-GHz, line-of-sight microwave link. A novel technique is introduced for measuring the phase dispersion induced by the propagation path. Specifically, the amplitudes and relative phases of four equally spaced tones, 6.6 MHz apart, have been continuously monitored over a period of four months. The data show that there is usually measurable (0.02 degree/(MHz)²) phase distortion over the 20-MHz band during those fades whose depth exceeds about 20 dB. These dispersive fades, which usually last a few seconds, typically occur along with shallow and essentially nondispersive fades that last several minutes; only the dispersive fades exhibit a phase nonlinearity. Analysis of 16 events measured in the autumn of 1970 yields the following results. (i) The fraction of time during which the quadratic phase nonlinearity exceeds a given value follows a lognormal distribution. (ii) The quadratic phase nonlinear coefficient exceeds an average value of 0.1 degree/(MHz)² for fades deeper than 34 dB below the nominal level; this corresponds to a time delay distortion of 0.55 nanosecond over a 1-MHz band. (iii) The correlation between the log-amplitude and the phase nonlinear coefficient yields a correlation coefficient whose magnitude depends on fade depth and whose sign varies from event to event. The technique for measuring phase dispersion reported here may be of interest not only for propagation studies but also in other applications, such as the characterization of electrical networks. The statistical results obtained on the phase characteristics may prove useful in formulating an analytical model and may be of significance in the design of existing and future microwave systems.
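As a check on the quoted numbers, the conversion from a quadratic phase coefficient to group-delay distortion can be sketched as below. The function name and unit bookkeeping are ours, not from the paper; we assume phase φ(f) = c·f² with φ in degrees and f in MHz, and group delay τ(f) = (1/360)·dφ/df cycles per unit frequency.

```python
def delay_distortion_ns(c_deg_per_mhz2: float, band_mhz: float = 1.0) -> float:
    """Group-delay distortion across a band for quadratic phase.

    Assumes phi(f) = c * f**2, phi in degrees, f in MHz (our model,
    consistent with the units quoted in the abstract).  Group delay is
    tau = (1/360) * dphi/df in cycles/MHz, i.e. microseconds; the delay
    therefore varies linearly with frequency, changing by
    2 * c * band_mhz / 360 microseconds across the band.
    """
    slope_change_deg_per_mhz = 2.0 * c_deg_per_mhz2 * band_mhz
    delta_tau_us = slope_change_deg_per_mhz / 360.0  # cycles/MHz = microseconds
    return delta_tau_us * 1e3                        # convert to nanoseconds

# c = 0.1 degree/(MHz)^2 over a 1-MHz band gives roughly 0.56 ns,
# consistent with the 0.55-ns figure quoted for the deep-fade events.
print(delay_distortion_ns(0.1))
```

This confirms that the 0.1 degree/(MHz)² coefficient and the 0.55-ns delay distortion quoted in the abstract are two expressions of the same quantity.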