A simulation technique for the evaluation of random error effects in time-domain measurement systems

While many papers deal with time-domain network analyzer calibration procedures for the correction of systematic errors, little work has been published on the treatment of random errors. This paper focuses on the evaluation of random error effects in time-domain measurement systems. As a first step, the random errors of the measurement system are identified experimentally. The random errors addressed are jitter, vertical noise, and fast time drifts. Based on this identification, mathematical models are developed to simulate these random errors. In a second step, time-domain measurements including these random errors are simulated. These simulations are used to predict the repeatability and dynamic range of the measurement system. Then, as an application example, the measurement of the complex propagation coefficient and S-parameters of a lossy, mismatched microstrip line is simulated. By comparison with real measurements, it is shown that random error effects can be accurately predicted by Monte Carlo simulations.
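
To make the simulation idea concrete, the following is a minimal sketch of how a Monte Carlo study of jitter, vertical noise, and drift on a time-domain acquisition might look. All waveform parameters, error magnitudes, and variable names below are illustrative assumptions, not values taken from the paper; the intent is only to show how repeated corrupted acquisitions can be used to estimate spectral repeatability.

```python
import numpy as np

# Hypothetical illustration (all parameter values are assumptions, not from the paper):
# corrupt an ideal time-domain waveform with timing jitter, additive vertical noise,
# and a slow drift, then repeat the acquisition to estimate frequency-domain repeatability.

rng = np.random.default_rng(0)

N = 1024                    # samples per acquisition (assumed)
dt = 1e-12                  # 1 ps sampling step (assumed)
t = np.arange(N) * dt
ideal = np.exp(-((t - 100e-12) / 20e-12) ** 2)   # assumed Gaussian test pulse

sigma_jitter = 0.5e-12      # rms timing jitter (assumed)
sigma_noise = 1e-3          # rms vertical noise, relative units (assumed)
drift_span = 2e-3           # total fractional amplitude drift over the run (assumed)

n_runs = 1000               # Monte Carlo repetitions
spectra = np.empty((n_runs, N // 2 + 1), dtype=complex)
f = np.fft.rfftfreq(N, dt)

for k in range(n_runs):
    # Jitter: random delay of the whole waveform, applied as a phase ramp in frequency.
    tau = rng.normal(0.0, sigma_jitter)
    jittered = np.fft.irfft(np.fft.rfft(ideal) * np.exp(-2j * np.pi * f * tau), N)

    # Drift: small amplitude change accumulating over successive acquisitions.
    drifted = jittered * (1.0 + drift_span * k / n_runs)

    # Vertical noise: additive white Gaussian noise on each sample.
    acquired = drifted + rng.normal(0.0, sigma_noise, N)

    spectra[k] = np.fft.rfft(acquired)

# Repeatability estimate: spread of the spectral magnitude across Monte Carlo runs.
repeatability = np.std(np.abs(spectra), axis=0)
print("worst-case spectral standard deviation:", repeatability.max())
```

In such a sketch, the noise floor of the averaged spectrum would give a rough indication of dynamic range, while the run-to-run spread quantifies repeatability; the paper's actual error models and measurement chain are of course more detailed.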