How biased are maximum entropy models?
[1] Maya R. Gupta, et al. Bayesian estimation of the entropy of the multivariate Gaussian, 2008, IEEE International Symposium on Information Theory.
[2] Stefano Panzeri, et al. The Upward Bias in Measures of Information Derived from Limited Data Samples, 1995, Neural Computation.
[3] Helmut Bölcskei, et al. Characterizing the statistical properties of mutual information in MIMO channels: insights into diversity-multiplexing tradeoff, 2002, Asilomar Conference on Signals, Systems and Computers.
[4] Michael J. Berry, et al. Weak pairwise correlations imply strongly correlated network states in a neural population, 2005, Nature.
[5] Alexander Borst, et al. Information theory and neural coding, 1999, Nature Neuroscience.
[6] N. R. Goodman. The Distribution of the Determinant of a Complex Wishart Distributed Matrix, 1963.
[7] D. V. Gokhale, et al. Entropy expressions and their estimators for multivariate distributions, 1989, IEEE Transactions on Information Theory.
[8] Erik Aurell, et al. Frontiers in Computational Neuroscience, 2022.
[9] Stefano Panzeri, et al. Correcting for the sampling bias problem in spike train information measures, 2007, Journal of Neurophysiology.
[10] Michael J. Berry, et al. Spin glass models for a network of real neurons, 2009, arXiv:0912.5409.
[11] Stefano Panzeri, et al. Open Source Tools for the Information Theoretic Analysis of Neural Data, 2009, Frontiers in Neuroscience.
[12] William Bialek, et al. Entropy and information in neural spike trains: progress on the sampling problem, 2003, Physical Review E.
[13] W. Bialek, et al. Maximum entropy models for antibody diversity, 2009, Proceedings of the National Academy of Sciences.
[14] G. A. Miller. Note on the bias of information estimates, 1955.
[15] M. Bethge, et al. Common input explains higher-order correlations and entropy in a simple model of neural population activity, 2011, Physical Review Letters.
[16] J. Hertz, et al. Ising model for neural data: model quality and approximate methods for extracting functional connectivity, 2009, Physical Review E.
[17] Harshinder Singh, et al. Estimation of the entropy of a multivariate normal distribution, 2005.
[18] William Bialek, et al. Spikes: Exploring the Neural Code, 1996.
[19] Maya R. Gupta, et al. Parametric Bayesian Estimation of Differential Entropy and Relative Entropy, 2010, Entropy.
[20] Liam Paninski. Estimation of Entropy and Mutual Information, 2003, Neural Computation.
[21] A. Pouget, et al. Neural correlations, population coding and computation, 2006, Nature Reviews Neuroscience.
[22] Habib Benali, et al. Large-Sample Asymptotic Approximations for the Sampling and Posterior Distributions of Differential Entropy for Multivariate Normal Distributions, 2011, Entropy.
[23] R. Quiroga, et al. Extracting information from neuronal populations: information theory and decoding approaches, 2022.
[24] Alexander S. Ecker, et al. Generating Spike Trains with Specified Correlation Coefficients, 2009, Neural Computation.
[25] Ifije E. Ohiorhenuan, et al. Sparse coding and high-order correlations in fine-scale cortical networks, 2010, Nature.
[27] Thomas M. Cover, et al. Elements of Information Theory, 2005.
[28] E. Ising. Beitrag zur Theorie des Ferromagnetismus [Contribution to the Theory of Ferromagnetism], 1925.
[29] C. E. Shannon. A Mathematical Theory of Communication, 1948, Bell System Technical Journal.
[30] Jonathon Shlens, et al. The Structure of Multi-Neuron Firing Patterns in Primate Retina, 2006, The Journal of Neuroscience.