The relationship between causal and non-causal mismatched estimation in continuous-time AWGN channels