QUANTIZATION ERROR IN TIME-TO-DIGITAL CONVERTERS

Methods of time-interval measurement can be divided into asynchronous and synchronous approaches. It is well known that in asynchronous time-interval measurement, uncertainty can be reduced by statistical averaging. The motivation of this paper is to investigate averaging in time-interval measurements, particularly synchronous ones. The authors consider averaging as a method of reducing the influence of quantization error on measurement uncertainty in synchronous time-interval measurement systems when noise-induced dispersion of the results is present. A mathematical model of averaging is presented, followed by the results of numerical simulations of the averaging of measurement series. Analysis of the results leads to the conclusion that under certain conditions the influence of quantization error on measurement uncertainty can be minimized by statistical averaging, just as in asynchronous measurements.
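The effect summarized above can be illustrated numerically. The following minimal Python sketch (not the authors' simulation code; all parameter values are illustrative assumptions) models a synchronous measurement in which the quantizer phase is fixed relative to the measured interval, so without noise every repeated measurement returns the same quantized value and averaging gains nothing; once Gaussian noise disperses the results across quantization levels, averaging begins to suppress the quantization error.

import numpy as np

rng = np.random.default_rng(0)

T_Q = 1.0        # quantization step (one clock period), arbitrary units
T_TRUE = 7.3     # true time interval, deliberately off the quantization grid
N_AVG = 1000     # number of averaged measurements per trial
N_TRIALS = 200   # Monte Carlo trials used to estimate the RMS error

def averaged_synchronous_measurement(noise_sigma):
    """Mean of N_AVG quantized measurements of T_TRUE with Gaussian noise."""
    samples = T_TRUE + rng.normal(0.0, noise_sigma, size=(N_TRIALS, N_AVG))
    quantized = np.round(samples / T_Q) * T_Q  # synchronous quantizer
    return quantized.mean(axis=1)              # averaged result, per trial

for sigma in (0.0, 0.1, 0.5, 1.0):
    estimates = averaged_synchronous_measurement(sigma)
    rms = np.sqrt(np.mean((estimates - T_TRUE) ** 2))
    print(f"noise sigma = {sigma:4.1f} * T_Q  ->  RMS error = {rms:.4f} * T_Q")

With sigma = 0 the averaged result stays pinned at the nearest quantization level (RMS error 0.3 T_Q here), while a noise standard deviation on the order of the quantization step lets the average converge toward the true interval, mirroring the paper's conclusion for synchronous measurements.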