Calculation of Differential Entropy for a Mixed Gaussian Distribution

In this work, an analytical expression is derived for the differential entropy of a mixed Gaussian distribution. One term in the expression has no closed form and is given instead by a tabulated function of the ratio of the distribution parameters.
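
The analytical derivation and the tabulation are the paper's contribution; as a companion, the sketch below simply checks the entropy numerically as a function of the parameter ratio. It assumes the mixed Gaussian takes the usual two-component symmetric form (equal weights, means of ±mu, common variance sigma^2); the function names mixed_gaussian_pdf and differential_entropy are illustrative, not from the paper.

```python
# A minimal numerical sketch, not the paper's tabulation. It assumes the
# "mixed Gaussian" is the symmetric two-component mixture
#   p(x) = 0.5*N(x; -mu, sigma^2) + 0.5*N(x; +mu, sigma^2)
# and evaluates the differential entropy h = -integral p(x) ln p(x) dx
# by quadrature, as a function of the ratio mu/sigma.
import numpy as np
from scipy.integrate import quad

def mixed_gaussian_pdf(x, mu, sigma):
    """Equal-weight mixture of N(-mu, sigma^2) and N(+mu, sigma^2)."""
    norm = 1.0 / (2.0 * sigma * np.sqrt(2.0 * np.pi))
    return norm * (np.exp(-(x - mu) ** 2 / (2.0 * sigma ** 2))
                   + np.exp(-(x + mu) ** 2 / (2.0 * sigma ** 2)))

def differential_entropy(mu, sigma):
    """h(X) = -E[ln p(X)] in nats, computed by adaptive quadrature."""
    def integrand(x):
        p = mixed_gaussian_pdf(x, mu, sigma)
        # Guard against underflow far in the tails, where p -> 0
        # and p*ln(p) -> 0.
        return -p * np.log(p) if p > 0.0 else 0.0
    # Integrate over a range wide enough to capture both modes.
    lim = abs(mu) + 10.0 * sigma
    h, _ = quad(integrand, -lim, lim)
    return h

if __name__ == "__main__":
    sigma = 1.0
    gaussian_h = 0.5 * np.log(2.0 * np.pi * np.e * sigma ** 2)
    for ratio in (0.0, 0.5, 1.0, 2.0, 4.0):
        h = differential_entropy(ratio * sigma, sigma)
        print(f"mu/sigma = {ratio:4.1f}: h = {h:.4f} nats "
              f"(single Gaussian: {gaussian_h:.4f})")
```

The two limits provide a sanity check on any such computation: as mu/sigma -> 0 the mixture collapses to a single Gaussian with entropy (1/2) ln(2*pi*e*sigma^2), and as mu/sigma -> infinity the two modes separate completely and the entropy approaches that value plus ln 2, the one extra nat-scaled bit identifying which mode a sample came from.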
