Evidence, information, and surprise

A numerical measure for "evidence" is defined within a probabilistic framework. The established mathematical concept of information, or entropy (as defined in ergodic theory), is obtained from this definition as a special case, although in general information is greater than evidence. In another, somewhat complementary, special case a numerical measure for "surprise" is derived from the definition of evidence. Some applications of the new concept of evidence are discussed, concerning statistics in general and, in particular, the kind of statistics performed by neurophysiologists when they analyze the "response" of neurons, and perhaps by the neurons themselves.
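The abstract does not give Palm's definition of "evidence", but the standard information-theoretic notions it builds on can be sketched: the surprisal of a single outcome with probability p is -log2(p), and Shannon entropy is the expected surprise over a distribution. The following minimal Python sketch illustrates only these standard quantities, not the paper's evidence measure:

```python
import math

def surprise(p: float) -> float:
    # Surprisal (self-information) of one outcome with probability p,
    # in bits: rare outcomes are more surprising.
    return -math.log2(p)

def entropy(dist: list[float]) -> float:
    # Shannon entropy: the expected surprise over a distribution.
    # Terms with p == 0 contribute nothing (lim p*log p = 0).
    return sum(p * surprise(p) for p in dist if p > 0)

fair_coin = [0.5, 0.5]
biased_coin = [0.9, 0.1]

print(surprise(0.5))        # a fair-coin outcome carries 1 bit
print(entropy(fair_coin))   # maximal for two outcomes
print(entropy(biased_coin)) # lower: the biased coin is more predictable
```

In this standard picture entropy is the average of the surprise over all outcomes, which is one way to read the abstract's remark that information and surprise arise as complementary special cases of a common underlying quantity.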
