Asymptotic rates of the information transfer ratio

A system performs information processing when it preserves the aspects of its input that relate to what the input represents while removing other aspects. To describe a system's information processing capability, input and output must be compared in a way that is invariant to how the signals represent information. The Kullback-Leibler distance, an information-theoretic measure that obeys the data processing theorem, is therefore calculated separately on the input and on the output, and the two distances are compared to obtain the information transfer ratio. We consider the special case in which a common input drives several parallel systems and show that this configuration can represent the input information without loss. We also derive bounds on the asymptotic rate at which the loss decreases as more parallel systems are added, and show that this rate depends on the input distribution.

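As a minimal numerical sketch of the quantity the abstract describes: the information transfer ratio is the Kullback-Leibler distance between the two output distributions divided by the distance between the corresponding input distributions, and the data processing theorem guarantees it never exceeds one. The example below assumes N identical binary symmetric channels with independent noise driven by a common binary input; the input distributions p0, p1 and the crossover probability eps are illustrative choices, not taken from the paper, and the trend of the loss 1 - gamma toward zero as N grows is what the paper's asymptotic rate bounds characterize.

```python
import itertools
import math

def kl(p, q):
    """Kullback-Leibler distance D(p || q) in nats; assumes q > 0 wherever p > 0."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def output_dist(p_x, eps, n):
    """Joint distribution of n conditionally independent BSC(eps) outputs
    when the common binary input X is distributed as p_x."""
    dist = []
    for y in itertools.product((0, 1), repeat=n):
        prob = 0.0
        for x, px in enumerate(p_x):
            lik = 1.0
            for yi in y:
                lik *= (1 - eps) if yi == x else eps  # BSC crossover
            prob += px * lik
        dist.append(prob)
    return dist

# Two input probability assignments: the two conditions being discriminated.
# These values (and eps below) are illustrative assumptions.
p0, p1 = [0.7, 0.3], [0.4, 0.6]
d_in = kl(p1, p0)

for n in (1, 2, 4, 8):
    d_out = kl(output_dist(p1, 0.2, n), output_dist(p0, 0.2, n))
    gamma = d_out / d_in  # information transfer ratio; <= 1 by data processing
    print(f"N = {n}: gamma = {gamma:.4f}, loss 1 - gamma = {1 - gamma:.4e}")
```

Running this prints a gamma that increases toward one as parallel systems are added, so the loss 1 - gamma shrinks with N; the rate of that shrinkage is what the paper bounds, and it depends on the input distribution pair.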