On the capacity of noisy computations

This paper presents an analysis of the concept of capacity for noisy computations, that is, algorithms implemented by unreliable computing devices (e.g., noisy Turing machines). The capacity of a noisy computation is defined and justified by companion coding theorems: under some constraints on the encoding process, capacity is the upper bound on the input rates that allow reliable computation, i.e., decodability of the noisy outputs into the expected outputs. A model of the noisy computation of a perfect function f by an unreliable device F is given, together with a model of reliable computation based on input encoding and output decoding. A coding lemma (extending Feinstein's theorem to noisy computations), a joint source-computation coding theorem, and its converse are proved. They apply when the input source, the function f, the noisy device F, and the cascade f⁻¹F induce asymptotically mean stationary (AMS) and ergodic one-sided random processes.
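The encode/compute/decode scheme described above can be illustrated with a minimal toy sketch (not the paper's construction): the perfect function f is bit negation, the unreliable device F flips its output with some probability p, and reliability is recovered by a repetition encoding of the input and a majority-vote decoding of the noisy outputs. All names and parameters here (f, F, p, the repetition length n) are illustrative assumptions.

```python
import random

def f(b: int) -> int:
    # The perfect function: bit negation.
    return 1 - b

def F(b: int, p: float = 0.1) -> int:
    # The unreliable device: computes f, but the output
    # is flipped with probability p (illustrative noise model).
    return f(b) ^ int(random.random() < p)

def encode(b: int, n: int = 15) -> list[int]:
    # Input encoding: n-fold repetition (a deliberately simple code).
    return [b] * n

def decode(ys: list[int]) -> int:
    # Output decoding: majority vote over the noisy outputs.
    return int(sum(ys) > len(ys) / 2)

def reliable_compute(b: int, n: int = 15) -> int:
    # Encode the input, run the noisy device on each symbol,
    # then decode the noisy outputs into the expected output f(b).
    return decode([F(x) for x in encode(b, n)])

if __name__ == "__main__":
    random.seed(0)
    trials = 1000
    raw_errors = sum(F(0) != f(0) for _ in range(trials))
    rel_errors = sum(reliable_compute(0) != f(0) for _ in range(trials))
    print(f"raw error rate:      {raw_errors / trials:.3f}")
    print(f"reliable error rate: {rel_errors / trials:.3f}")
```

Trading input rate for reliability in this way is exactly the tension the capacity results formalize: the repetition code lowers the input rate by a factor n while driving the decoding error toward zero.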
