Fountain Capacity

Fountain codes are currently employed for reliable and efficient transmission of information over erasure channels with unknown erasure rates. This correspondence introduces the notion of fountain capacity for arbitrary channels. In contrast to the conventional definition of rate, in the fountain setup rate is measured against the number of symbols the receiver actually receives rather than the number the transmitter sends. Fountain capacity is the maximum rate compatible with reliable reception regardless of the erasure pattern. We show that fountain capacity and Shannon capacity coincide for stationary memoryless channels. In contrast, Shannon capacity may exceed fountain capacity if the channel has memory or is not stationary.
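The fountain principle described above — that decoding depends only on how many coded symbols arrive, not on which transmissions the channel erases — can be illustrated with a minimal Python sketch. This is a random linear fountain code, a simplification of practical LT/Raptor codes (which use sparse degree distributions for fast decoding); the function and variable names are illustrative, not from the paper.

```python
import random

def fountain_demo(k=16, seed=0):
    """Random linear fountain code over an erasure channel (illustrative
    sketch only).  Returns (decoding succeeded, symbols received)."""
    rng = random.Random(seed)
    message = rng.getrandbits(k)  # k message bits packed into an int

    def next_symbol():
        # Each fountain symbol is the XOR (GF(2) inner product) of a
        # uniformly random subset of the message bits.
        mask = rng.getrandbits(k)
        value = bin(message & mask).count("1") & 1
        return mask, value

    # Receiver side: the erasure pattern only decides WHICH symbols arrive;
    # decoding succeeds once enough linearly independent symbols have been
    # received, regardless of which transmissions were erased.
    # Decode by Gauss-Jordan elimination over GF(2).
    basis = {}  # pivot bit index -> (mask, value)
    received = 0
    while len(basis) < k:
        mask, value = next_symbol()
        received += 1
        # Reduce the new symbol by the existing pivot rows.
        for pivot in list(basis):
            if mask >> pivot & 1:
                pmask, pvalue = basis[pivot]
                mask ^= pmask
                value ^= pvalue
        if mask == 0:
            continue  # linearly dependent: carries no new information
        pivot = mask.bit_length() - 1
        # Eliminate the new pivot from all stored rows (keeps the
        # basis fully reduced).
        for p, (pmask, pvalue) in basis.items():
            if pmask >> pivot & 1:
                basis[p] = (pmask ^ mask, pvalue ^ value)
        basis[pivot] = (mask, value)

    # Fully reduced: each basis row now pins down exactly one message bit.
    decoded = 0
    for pivot, (pmask, pvalue) in basis.items():
        decoded |= pvalue << pivot
    return decoded == message, received
```

Running the sketch shows the reception-counted cost the fountain definition of rate charges: the receiver needs roughly k (here, a few more than 16) received symbols to decode, however many transmitted symbols the channel erased along the way.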