Fixed-Tap ADPCM System Divergence and a Bound on the Robust Quantizer Overload Point

The divergence of ADPCM systems with fixed, multiple-tap predictors and a Jayant quantizer is investigated. It is shown that system divergence occurs due to excessive quantization noise in the feedback loop coupled with the infinite quantizer memory. Further, divergence may result even with finer quantization if the predictor is poorly matched to the system input. New insight into quantizer/predictor interaction is provided by a demonstration that, for all average speech data available in the literature and predictors with more than one feedback tap, the system that describes the quantization noise evolution is unstable whenever the predictor is stable. It is noted that robust quantizer designs originally proposed for transmission error suppression are also effective in preventing the ADPCM system divergence problem discussed here, and a bound on the robust quantizer overload point is derived which illustrates the effect of the finite quantizer memory. Simulation results that validate the bound are presented.
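
The role of quantizer memory can be illustrated with a short step-size simulation. The Python sketch below is illustrative only: the multiplier values, the leakage factor beta, and the function name adapt_step_sizes are assumptions, not taken from the paper. It compares the classic Jayant recursion, delta(n+1) = delta(n) * M_i (infinite memory), with a leaky "robust" form, delta(n+1) = delta(n)**beta * M_i with beta < 1 (finite memory), when both are driven by the largest multiplier at every step.

def adapt_step_sizes(codes, multipliers, delta0=1.0, beta=1.0, delta_min=1e-6):
    # Step-size recursion of a Jayant-type adaptive quantizer:
    #   beta = 1  -> classic Jayant rule, delta(n+1) = delta(n) * M_i   (infinite memory)
    #   beta < 1  -> leaky "robust" rule, delta(n+1) = delta(n)**beta * M_i (finite memory)
    # 'codes' holds the magnitude level chosen at each step (index into 'multipliers').
    delta = delta0
    history = []
    for c in codes:
        delta = max((delta ** beta) * multipliers[c], delta_min)
        history.append(delta)
    return history

# Illustrative multipliers (not from the paper); worst case: the top level,
# and hence the largest multiplier M_max, is selected at every step.
M = [0.9, 0.9, 1.25, 1.75]
worst_case = [len(M) - 1] * 200

jayant = adapt_step_sizes(worst_case, M, beta=1.0)   # grows without bound
robust = adapt_step_sizes(worst_case, M, beta=0.9)   # saturates

# With leakage beta < 1 the recursion has the finite fixed point
# M_max ** (1 / (1 - beta)), obtained by iterating the rule in the log domain.
print("Jayant (beta=1.0) final step size: %.3e" % jayant[-1])
print("Robust (beta=0.9) final step size: %.3e" % robust[-1])
print("Predicted saturation:              %.3e" % (M[-1] ** (1 / (1 - 0.9))))

Under these assumptions the leaky recursion saturates at M_max**(1/(1-beta)) while the beta = 1 case diverges, which conveys in simplified form how finite quantizer memory bounds the attainable step size; the overload-point bound derived in the paper accounts for the full ADPCM feedback loop and is not reproduced by this sketch.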