Information and Estimation Theory

Shannon’s seminal 1948 paper [32] is widely regarded as the cornerstone of information theory. It addressed a significant problem of the age, namely the scientific quantification of the rather abstract concept of information. Numerous earlier works, predominantly by Nyquist, Küpfmüller, Gabor, and Hartley [15, 17, 22, 28], had approached the problem, but each fell short in a number of ways. Foremost among these deficiencies was the lack of any rigorous treatment of the role of noise, although its importance was noted by both Nyquist and Hartley. Noise is an essential consideration in information transfer because random, and hence unpredictable, perturbations to an information-carrying signal ultimately prohibit exact determination of the original signal or its implicit meaning. The probabilistic treatment of Shannon (and of later contributors [30, 34, 39]) allowed the stochastic nature of noise to be handled in a concise and rigorous fashion, and thus a definition of information was born.
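To make Shannon's probabilistic definition concrete, the sketch below (illustrative only; the function names are ours, not drawn from any cited work) computes the entropy of a discrete source and the capacity of a binary symmetric channel, the simplest quantitative statement of how noise limits information transfer.

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H = -sum_i p_i * log2(p_i)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def bsc_capacity(eps):
    """Capacity (bits per channel use) of a binary symmetric channel
    with crossover probability eps: C = 1 - H2(eps)."""
    return 1.0 - entropy([eps, 1.0 - eps])

# A fair binary source carries one bit per symbol.
print(entropy([0.5, 0.5]))   # 1.0

# A noiseless channel preserves that full bit; any noise reduces it,
# and at eps = 0.5 the output is independent of the input (C = 0).
print(bsc_capacity(0.0))     # 1.0
print(bsc_capacity(0.11))    # strictly between 0 and 1
print(bsc_capacity(0.5))     # 0.0
```

The monotone loss of capacity with increasing crossover probability is precisely the sense in which random perturbations "prohibit exact determination of the original signal" in Shannon's framework.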

[1] S. Kay, Fundamentals of Statistical Signal Processing: Estimation Theory, 1993.

[2] H. Nyquist, "Certain factors affecting telegraph speed," Journal of the A.I.E.E., 1924.

[3] J.-F. Bercher et al., "Analysis of signals in the Fisher–Shannon information plane," 2003.

[4] C. R. Rao, "Information and the Accuracy Attainable in the Estimation of Statistical Parameters," 1992.

[5] M. L. Boas, Mathematical Methods in the Physical Sciences, 2nd ed., 1983.

[6] H. Sompolinsky et al., "Mutual information of population codes and distance measures in probability space," Physical Review Letters, 2001.

[7] M. Crowder, "On constrained maximum likelihood estimation with non-i.i.d. observations," 1984.

[8] B. J. Meers, "Recycling in laser-interferometric gravitational-wave detectors," Physical Review D, 1988.

[9] N. Wiener, Extrapolation, Interpolation, and Smoothing of Stationary Time Series, 1964.

[10] E. Walter et al., "Qualitative and quantitative experiment design for phenomenological models: a survey," Automatica, 1990.

[11] B. M. Sadler et al., "Maximum-likelihood estimation, the Cramér–Rao bound, and the method of scoring with parameter constraints," IEEE Transactions on Signal Processing, 2008.

[12] R. Fisher, "On the Mathematical Foundations of Theoretical Statistics," 1922.

[13] N. Brunel et al., "Mutual Information, Fisher Information, and Population Coding," Neural Computation, 1998.

[14] C. K. Abbey et al., "Measures of performance in nonlinear estimation tasks: prediction of estimation performance at low signal-to-noise ratio," Physics in Medicine and Biology, 2005.

[15] C. Srinivasan et al., "On Fisher information inequalities in the presence of nuisance parameters," Annals of the Institute of Statistical Mathematics, 1994.

[16] T. L. Marzetta, "A simple derivation of the constrained multiple parameter Cramer-Rao bound," IEEE Transactions on Signal Processing, 1993.

[17] R. F. Wagner et al., "Objective assessment of image quality. II. Fisher information, Fourier crosstalk, and figures of merit for task performance," Journal of the Optical Society of America A, 1995.

[18] H. L. Van Trees, Detection, Estimation, and Modulation Theory, Part I, 1968.

[19] P. Réfrégier, "Quantum limits in image processing," 2008.

[20] S. P. Mueller et al., "Barankin bound: a model of detection with location uncertainty," Optics & Photonics, 1992.

[21] C. E. Shannon, "A mathematical theory of communication," Bell System Technical Journal, 1948.

[22] A. van den Bos, "A Cramer-Rao lower bound for complex parameters," IEEE Transactions on Signal Processing, 1994.

[23] B. C. Ng et al., "On the Cramer-Rao bound under parametric constraints," IEEE Signal Processing Letters, 1998.

[24] R. Hartley, "Transmission of information," Bell System Technical Journal, 1928.

[25] D. Gabor, "Theory of communication," 1946.

[26] S. Rice, "Mathematical analysis of random noise," Bell System Technical Journal, 1944.

[27] E. Barankin, "Locally Best Unbiased Estimates," 1949.

[28] R. A. Fisher, "Theory of Statistical Estimation," Mathematical Proceedings of the Cambridge Philosophical Society, 1925.

[29] R. K. Mehra, "Optimal input signals for parameter estimation in dynamic systems: survey and new results," 1974.

[30] W. G. Tuller, "Theoretical Limitations on the Rate of Transmission of Information," Proceedings of the IRE, 1949.

[31] C. Boccara et al., "Ultrahigh-resolution full-field optical coherence tomography," Applied Optics, 2004.

[32] B. R. Frieden, Physics from Fisher Information, 1998.

[33] B. R. Frieden, Physics from Fisher Information: A Unification, 1998.

[34] L. Scharf, Statistical Signal Processing: Detection, Estimation, and Time Series Analysis, 1991.

[35] M. A. Kupinski, Objective Assessment of Image Quality, 2005.

[36] A. O. Hero et al., "Lower bounds for parametric estimation with constraints," IEEE Transactions on Information Theory, 1990.

[37] P. Török et al., "Photon statistics in single molecule orientational imaging," Optics Express, 2007.