On the entropy minimization of a linear mixture of variables for source separation

The marginal entropy h(Z) of a weighted sum of two variables, Z = αX + βY, expressed as a function of its weights, is a common cost function for blind source separation (BSS), and more precisely for independent component analysis (ICA). Although some theoretical work has addressed the relevance, from the BSS point of view, of the global minimum of h(Z), very little is known about possible spurious local minima. To analyze the global shape of this entropy as a function of the weights, its analytical expression is derived in the ideal case of independent variables. Because ICA assumes the source distributions are unknown, simulation results are used to show how and when spurious local minima may appear. First, the entropy of a whitened mixture, as a function of the weights and under the constraint of independence between the source variables, is shown to have only minima that are relevant for ICA when at most one of the source distributions is multimodal. Second, it is shown that if independent multimodal sources are involved in the mixture, spurious local minima may appear. Arguments are given to explain the existence of spurious minima of h(Z) in the case of multimodal sources; the same justification also predicts the location of these minima when the source distributions are known. Finally, numerical examples show that the maximum-entropy mixture is not necessarily the 'most mixed' one (i.e. the one with equal mixture weights), but depends on the entropies of the mixed variables.
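The whitened-mixture experiment described above can be sketched numerically. The following is a minimal illustration, not the paper's procedure: the histogram entropy estimator, the θ grid, and the source models (one bimodal source, one uniform source) are all assumptions made here. It scans the entropy of the unit-variance mixture Z(θ) = cos(θ)·X + sin(θ)·Y and checks that the entropy is minimized at the axis recovering the low-entropy source, with no dip at intermediate mixing angles.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20000

# Bimodal source: two narrow Gaussian modes at +/-1, standardized to unit variance.
x = rng.choice([-1.0, 1.0], size=n) + 0.1 * rng.standard_normal(n)
x = (x - x.mean()) / x.std()

# Unimodal source: uniform on [-sqrt(3), sqrt(3)], i.e. zero-mean, unit-variance.
y = rng.uniform(-np.sqrt(3), np.sqrt(3), size=n)

def entropy_hist(z, bins=100):
    """Crude histogram estimate of the differential entropy h(Z)."""
    p, edges = np.histogram(z, bins=bins, density=True)
    dz = edges[1] - edges[0]
    p = p[p > 0]
    return -np.sum(p * np.log(p)) * dz  # approximates -integral p log p

# With unit-variance independent sources, Z(theta) = cos(t) X + sin(t) Y
# keeps Var(Z) = 1 for every theta, which plays the role of whitening.
thetas = np.linspace(0.0, np.pi / 2, 91)
entropies = np.array(
    [entropy_hist(np.cos(t) * x + np.sin(t) * y) for t in thetas]
)

# The global minimum sits at (or very near) theta = 0, where Z recovers the
# low-entropy bimodal source; intermediate mixtures have higher entropy.
```

Replacing the uniform source by a second well-separated bimodal source is the setting in which, according to the abstract, spurious local minima of h(Z) can appear at intermediate angles.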
