Interpolative Digital-to-Analog Converters

Interpolative digital-to-analog (D/A) converters produce a final output via a two-step process. First, each digital input word is used to control a circuit whose output oscillates rapidly (i.e., many times faster than new digital input values are provided) between coarsely spaced analog values (i.e., many times coarser than the resolution specified by the input word). Second, the oscillating analog signal is low-pass filtered to give the final output. The oscillation pattern is chosen to produce an average value that corresponds to the fine resolution specified by the input word and to ensure that the power of the error (the difference between the oscillating signal and the desired fine-resolution output) occurs predominantly out of band. By this means, high-speed operation reduces the need for many finely spaced analog signal amplitudes, a tradeoff that is especially desirable for integrated circuit implementation. In this paper, the basic operation of interpolative D/A converters is described. Three alternative means of generating the oscillation patterns are compared with respect to circuit complexity and the amount of baseband distortion introduced. The relative insensitivity of these converters to circuit value variations is emphasized. Applications of the interpolative technique to decoding digital words in both linear and piecewise-linearly companded formats are given.
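To make the two-step process concrete, the following is a minimal numerical sketch, assuming a first-order error-feedback scheme as the pattern generator (only one of the several alternatives the paper compares) and a simple moving average as a stand-in for the analog low-pass filter. The names fine_value, coarse_levels, interpolate, and lowpass are illustrative, not from the paper.

```python
def interpolate(fine_value, coarse_levels, n_ticks):
    """Oscillate between the two coarse levels bracketing fine_value
    so that the running average of the output equals fine_value."""
    # Adjacent coarse levels that bracket the fine target.
    lo = max(v for v in coarse_levels if v <= fine_value)
    hi = min(v for v in coarse_levels if v >= fine_value)
    pattern = []
    err = 0.0  # accumulated difference between output and target
    for _ in range(n_ticks):
        # Pick whichever coarse level drives the accumulated error
        # toward zero; the residual error power lands out of band.
        out = lo if err > 0 else hi
        err += out - fine_value
        pattern.append(out)
    return pattern

def lowpass(signal, window):
    """Crude moving-average stand-in for the analog low-pass filter."""
    return [sum(signal[max(0, i - window + 1):i + 1]) /
            (i - max(0, i - window + 1) + 1)
            for i in range(len(signal))]

# Coarse levels spaced 0.25 apart; the input word specifies a value
# with much finer resolution than that spacing.
levels = [0.0, 0.25, 0.5, 0.75, 1.0]
target = 0.3125                           # fine value to reproduce
fast = interpolate(target, levels, 64)    # rapidly oscillating output
smooth = lowpass(fast, 16)                # filtered final output
print(f"mean of pattern: {sum(fast)/len(fast):.4f}  (target {target})")
```

For target = 0.3125 the generator settles into a repeating pattern of one tick at 0.5 followed by three ticks at 0.25, whose average is exactly 0.3125; the only circuit precision required is that of the coarse levels themselves, which is consistent with the insensitivity to circuit value variations noted above.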