Encoding of correlated observations

An important class of engineering problems involves sensing an environment and making estimates based on the phenomena sensed. In the traditional model of this problem, the sensors' observations are available to the estimator without alteration. There is growing interest in {\em distributed} sensing systems in which several observations are communicated to the estimator over channels of limited capacity. The observations must be separately encoded so that the target can be estimated with minimum distortion. Two questions are addressed for a special case of this problem in which two sensors observe noisy data and communicate with a single estimator: 1) if the encoders are unlimited in complexity, what combinations of communication rates and distortion can be achieved? 2) if each encoder must be a quantizer (a mapping of a single observation sample into a digital output), how can it be designed for good performance? The first question is treated with the techniques of information theory. It is proved that a given combination of rates and distortion is achievable if there exist degraded versions of the observations satisfying certain information-theoretic conditions. The second question is treated by two approaches. In the first, the quantizer outputs undergo a second stage of encoding that exploits their correlation to reduce the output rate. Algorithms for designing this second stage are presented and tested. The second approach is based on the {\em distributional distance}, a measure of dissimilarity between two probability distributions. An algorithm that modifies a quantizer to increase the distributional distance is derived and tested.
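To make the second question concrete, here is a minimal sketch (Python; all names are illustrative, not from the thesis) of the two ingredients behind the first design approach: a scalar quantizer designed by the classical Lloyd-Max iteration, followed by a comparison of the joint entropy of the two quantizer outputs against the sum of their marginal entropies. The gap between the two is the rate that a correlation-exploiting second encoding stage could recover.

    # Illustrative sketch, not the thesis algorithms: Lloyd-Max design of a
    # scalar quantizer, then an estimate of the rate a second encoding stage
    # could save by exploiting the correlation between the two outputs.
    import numpy as np

    def lloyd_max(samples, levels, iters=50):
        """Alternate the two Lloyd conditions: nearest-neighbor partition
        and centroid (conditional-mean) codebook."""
        codebook = np.quantile(samples, (np.arange(levels) + 0.5) / levels)
        for _ in range(iters):
            cells = np.argmin(np.abs(samples[:, None] - codebook[None, :]), axis=1)
            for k in range(levels):
                if np.any(cells == k):
                    codebook[k] = samples[cells == k].mean()
        return codebook

    def quantize(samples, codebook):
        return np.argmin(np.abs(samples[:, None] - codebook[None, :]), axis=1)

    def entropy_bits(counts):
        p = counts / counts.sum()
        p = p[p > 0]
        return -(p * np.log2(p)).sum()

    rng = np.random.default_rng(0)
    target = rng.normal(size=100_000)                  # common phenomenon
    x1 = target + 0.3 * rng.normal(size=target.size)   # sensor 1 observation
    x2 = target + 0.3 * rng.normal(size=target.size)   # sensor 2 observation

    codebook = lloyd_max(x1, levels=8)
    i1, i2 = quantize(x1, codebook), quantize(x2, codebook)

    joint = np.zeros((8, 8))
    np.add.at(joint, (i1, i2), 1)                      # joint output histogram
    separate = entropy_bits(joint.sum(axis=1)) + entropy_bits(joint.sum(axis=0))
    together = entropy_bits(joint.ravel())
    print(f"first-stage rate, encoded separately: {separate:.2f} bits/pair")
    print(f"lower bound with a second stage:      {together:.2f} bits/pair")

Because both observations share the same target, the joint entropy is strictly smaller than the sum of the marginals; that difference is the rate reduction the second-stage algorithms aim to realize.

The distributional-distance approach can be sketched the same way. The fragment below (again with hypothetical names) uses the Bhattacharyya distance, one member of the Ali-Silvey family, between the quantizer output distributions under two hypotheses, and tunes the quantizer thresholds by coordinate ascent to increase it.

    # Illustrative sketch: nudge quantizer thresholds to increase the
    # Bhattacharyya distance between the output distributions under two
    # hypotheses, making them easier to distinguish after quantization.
    import numpy as np

    def output_pmf(samples, thresholds, bins):
        cells = np.searchsorted(thresholds, samples)   # quantizer output index
        return np.bincount(cells, minlength=bins) / samples.size

    def bhattacharyya(p, q):
        return -np.log(np.sum(np.sqrt(p * q)) + 1e-12)

    rng = np.random.default_rng(1)
    h0 = rng.normal(0.0, 1.0, 50_000)        # observations under hypothesis 0
    h1 = rng.normal(0.8, 1.0, 50_000)        # observations under hypothesis 1
    thresholds = np.linspace(-1.5, 1.5, 7)   # 8-level quantizer
    bins = thresholds.size + 1

    for _ in range(30):                      # coordinate-ascent threshold tuning
        for j in range(thresholds.size):
            best_t, best_d = thresholds[j], -np.inf
            for t in thresholds[j] + np.array([-0.05, 0.0, 0.05]):
                trial = thresholds.copy()
                trial[j] = t
                if not np.all(np.diff(trial) > 0):
                    continue                 # keep thresholds strictly increasing
                d = bhattacharyya(output_pmf(h0, trial, bins),
                                  output_pmf(h1, trial, bins))
                if d > best_d:
                    best_t, best_d = t, d
            thresholds[j] = best_t
    print("tuned thresholds:", np.round(thresholds, 2))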
