Analysis of K-Channel Multiple Description Quantization

This paper studies the tight rate-distortion bound for K-channel symmetric multiple-description coding of a memoryless Gaussian source. We find that the product of the central distortion (for K received descriptions) and a function of the individual side distortions (for single received descriptions) is asymptotically independent of the redundancy among the descriptions. Using this property, we analyze the asymptotic behavior of two practical multiple-description lattice vector quantizers (MDLVQ). Our analysis treats an MDLVQ system from a new geometric viewpoint, which yields an expression for the side distortions in terms of the normalized second moment of a sphere of higher dimensionality than the quantization space. The expression for the distortion product derived from the lower bound is then applied as a criterion to assess the performance losses of the two MDLVQ systems considered.