On Rate-Constrained Estimation in Unreliable Sensor Networks

We study a network of non-collaborating sensors that make noisy measurements of a physical process X and communicate their readings to a central processing unit. The sensors' limited power resources severely restrict their communication rates. Both the sensors and their communication links are subject to failure; however, the central unit is guaranteed to receive data from a minimum fraction of the sensors, say k out of n. The goal of the central unit is to optimally estimate X from the received transmissions under a specified distortion metric. In this work, we derive an information-theoretically achievable rate-distortion region for this network under symmetric sensor measurement statistics. When all processes are jointly Gaussian and independent and the distortion metric is squared error, the proposed distributed encoding and estimation framework has the following optimality property: when any k of the n sensor transmissions, each at rate R bits/sec, are received, the central unit's estimation quality matches the best that can be achieved from a completely reliable network of k sensors, each transmitting at rate R. Furthermore, when more than k of the n sensor transmissions are received, the estimation quality strictly improves. When the network has clusters of collaborating sensors, a natural question arises: should the clusters compress their raw measurements, or should they first estimate the source from their measurements and compress those estimates instead? For several interesting cases, we show that the distributed compression of local estimates incurs no rate-distortion loss relative to the distributed compression of raw data, i.e., encoding the local sufficient statistics is good enough.
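
As a rough numerical illustration of the strict-improvement claim (a sketch, not taken from the paper): in the Gaussian case with X ~ N(0, var_x) observed as Y_i = X + N_i under i.i.d. noise N_i ~ N(0, var_n), the minimum mean-squared error from m received observations is (1/var_x + m/var_n)^(-1), which strictly decreases in m. The sketch below ignores the rate constraint and simply tabulates this unquantized MMSE baseline as transmissions beyond the guaranteed k arrive; the variances and the network parameters (n, k) are hypothetical values chosen for illustration.

```python
# Hypothetical source/noise variances and network parameters (illustration only).
var_x, var_n = 1.0, 0.5
n, k = 10, 4

def mmse(m: int) -> float:
    """Closed-form MMSE of the Bayes estimate of X ~ N(0, var_x)
    from m i.i.d. noisy observations Y_i = X + N_i, N_i ~ N(0, var_n)."""
    return 1.0 / (1.0 / var_x + m / var_n)

# Estimation error strictly decreases as more of the n transmissions arrive.
for m in range(k, n + 1):
    print(f"received {m:2d} of {n}: MMSE = {mmse(m):.4f}")
```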

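The "sufficient statistics are good enough" claim can also be sanity-checked numerically in the Gaussian case: a cluster's local estimate is a function of the sample mean of its readings, which is a sufficient statistic for X, so estimating X from that statistic loses nothing relative to the raw data. Below is a minimal Monte Carlo sketch under the same assumed model Y_i = X + N_i with i.i.d. Gaussian noise; all parameter values are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
var_x, var_n, m, trials = 1.0, 0.5, 5, 200_000  # hypothetical values

x = rng.normal(0.0, np.sqrt(var_x), trials)                    # source samples
y = x[:, None] + rng.normal(0.0, np.sqrt(var_n), (trials, m))  # raw readings

# Bayes (LMMSE) estimate of X from all m raw measurements.
w = (var_x / var_n) / (1.0 + m * var_x / var_n)
est_raw = w * y.sum(axis=1)

# Bayes estimate from the local statistic T = mean(Y_i) alone:
# T = X + N_bar, with noise variance var_n / m. The two estimates coincide.
g = var_x / (var_x + var_n / m)
est_stat = g * y.mean(axis=1)

print("MSE from raw data  :", np.mean((x - est_raw) ** 2))
print("MSE from statistic :", np.mean((x - est_stat) ** 2))
print("theory (both)      :", 1.0 / (1.0 / var_x + m / var_n))
```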