Probing Capacity

We consider the problem of optimally probing the states of a channel, by both the transmitter and the receiver, so as to maximize the rate of reliable communication. The channel is a discrete memoryless channel (DMC) with i.i.d. states. The encoder takes probing actions that depend on the message and then uses the state information obtained from probing, either causally or noncausally, to generate the channel input symbols. The decoder may also take channel probing actions as a function of the observed channel output and use the state information thus acquired, together with the channel output, to estimate the message. We refer to the maximum rate of reliable communication achievable in such systems as the "Probing Capacity", and we characterize it when the encoder and decoder actions are cost constrained. To motivate the problem, we begin by characterizing the trade-off between capacity and the fraction of channel states the encoder is allowed to observe, when the decoder is fully aware of the channel states. In this setting of 'to observe or not to observe' the state at the encoder, we compute numerical examples that exhibit a pleasing phenomenon: the encoder can observe a relatively small fraction of the states and yet communicate at the maximum rate, i.e., the rate achievable when state observation at the encoder is not cost constrained.
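
For orientation, the two endpoints of this trade-off are classical. When the decoder knows the states but the encoder observes none of them, the capacity is max_{p(x)} I(X; Y | S); when the encoder observes every state (causally or noncausally) and the decoder also knows the states, it is max_{p(x|s)} I(X; Y | S). The Python sketch below evaluates both endpoints numerically for a hypothetical two-state binary channel; the state distribution and transition matrices are illustrative assumptions of ours, not taken from this work, and merely show the range over which the observation-fraction trade-off operates.

```python
# Minimal sketch (illustrative, not from the paper): the two endpoints of the
# state-observation trade-off for a DMC with i.i.d. state S known at the decoder.
#   C(0) = max_{p(x)}   I(X; Y | S)   -- encoder observes no states
#   C(1) = max_{p(x|s)} I(X; Y | S)   -- encoder observes every state
# The per-state channels below are assumed toy matrices chosen so that the
# optimal input distribution differs across states.

import numpy as np

def entropy(p):
    """Entropy (in bits) of a probability vector, ignoring zero entries."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def mutual_information(px, W):
    """I(X;Y) for input distribution px and channel matrix W[x, y] = p(y|x)."""
    py = px @ W                                   # output distribution
    h_y_given_x = np.sum(px * np.array([entropy(W[x]) for x in range(len(px))]))
    return entropy(py) - h_y_given_x              # I(X;Y) = H(Y) - H(Y|X)

# Assumed i.i.d. binary state S ~ Bernoulli(1/2); a Z-channel and its mirror.
p_s = np.array([0.5, 0.5])
W = [np.array([[1.0, 0.0], [0.4, 0.6]]),          # S = 0
     np.array([[0.6, 0.4], [0.0, 1.0]])]          # S = 1

grid = np.linspace(0.0, 1.0, 1001)                # candidate values of P(X = 0)

# C(0): a single input distribution for all states (state at decoder only).
c0 = max(sum(p_s[s] * mutual_information(np.array([q, 1 - q]), W[s]) for s in range(2))
         for q in grid)

# C(1): the input distribution may depend on the observed state.
c1 = sum(p_s[s] * max(mutual_information(np.array([q, 1 - q]), W[s]) for q in grid)
         for s in range(2))

print(f"C(0) = {c0:.4f} bits/use   (no state observation at the encoder)")
print(f"C(1) = {c1:.4f} bits/use   (full state observation at the encoder)")
```

In this toy example C(1) exceeds C(0) because the optimal input distribution differs across the two states; cost-constrained probing at the encoder interpolates between these two endpoints.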
