On the Detection Mode of Spectrum Analyzers in the Measurement of OFDM Out-of-Band Distortion

In the wireless industry, the out-of-band (OOB) leakage power of transmitted signals is widely used as a simple yet important metric for the interference caused to signals in adjacent frequency bands. A common way of measuring OOB emission is with a spectrum analyzer (SA), which lets us read the OOB level directly from the displayed trace of the signal spectrum. Although this task appears straightforward and seemingly leaves no room for error, we argue that the detection mode of the SA deserves careful consideration in light of its intricate operating principle. Because of several internal signal processing stages used for detection and visualization, the trace finally displayed on the screen does not necessarily equal the true power of the RF signal fed to the SA, and it varies with the detection mode. In this paper, we demonstrate by simulation that SA measurements can underestimate the actual OOB leakage power unless a root mean square (RMS) detector is employed; other types of detectors, such as peak and sample detectors, are subject to a non-trivial mismatch. The underlying reason is the difference in envelope amplitude statistics between the in-band and out-of-band components of the signal, a fact that appears to have received little attention so far.
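
To make the detector dependence concrete, the following minimal Python sketch (not the paper's actual simulation; all parameter values and the soft-clipping amplifier model are illustrative assumptions) generates a noise-like OFDM baseband signal, creates spectral regrowth with a nonlinearity, and emulates per-bin RMS, sample, and peak detection across repeated "sweeps". The sample-detector trace is averaged in the log domain, as a user reading a dB-scaled, trace-averaged display would do.

import numpy as np

rng = np.random.default_rng(0)

n_fft, n_used, n_sym = 256, 64, 200              # FFT size, active subcarriers, OFDM symbols (assumed values)
X = np.zeros((n_sym, n_fft), dtype=complex)
used = np.arange(-n_used // 2, n_used // 2) % n_fft
X[:, used] = (rng.standard_normal((n_sym, n_used)) +
              1j * rng.standard_normal((n_sym, n_used))) / np.sqrt(2)   # Gaussian-like subcarrier symbols
x = np.fft.ifft(X, axis=1).ravel()               # time-domain OFDM signal (no cyclic prefix, for brevity)

# Soft-clipping nonlinearity to produce out-of-band regrowth (assumed amplifier model).
a = np.abs(x)
x_nl = x * np.tanh(3 * a) / np.maximum(a, 1e-12)

# Model repeated sweeps: one periodogram per OFDM-symbol-length segment.
seg = x_nl.reshape(-1, n_fft)
P = np.abs(np.fft.fftshift(np.fft.fft(seg, axis=1), axes=1)) ** 2 / n_fft   # power per bin per sweep

# Detector emulation per frequency bin, across sweeps.
trace_rms = 10 * np.log10(P.mean(axis=0) + 1e-20)          # RMS detector: average power, then dB
trace_sample = (10 * np.log10(P + 1e-20)).mean(axis=0)     # sample detector with dB-trace averaging
trace_peak = 10 * np.log10(P.max(axis=0) + 1e-20)          # peak detector: max hold over sweeps

f = np.fft.fftshift(np.fft.fftfreq(n_fft))
oob = np.abs(f) > (n_used / n_fft)               # bins well outside the occupied band
for name, trace in [("RMS", trace_rms), ("sample", trace_sample), ("peak", trace_peak)]:
    print(f"{name:6s} detector, mean OOB level: {trace[oob].mean():6.1f} dB")

Under these assumptions the sample-detector trace sits a few dB below the RMS trace in the noise-like OOB region (the classical log-averaging bias of a Rayleigh-distributed envelope), while the peak detector sits well above it, illustrating why only the RMS detector reports the true OOB power.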