Nonparametric Bayesian Nonnegative Matrix Factorization

Nonnegative Matrix Factorization (NMF) is an important machine learning tool for blind source separation and latent factor extraction. Most existing NMF algorithms assume a specific noise kernel, which is insufficient for the complex noise encountered in real scenarios. In this study, we present a hierarchical nonparametric nonnegative matrix factorization (NPNMF) model in which a Gaussian mixture model approximates the complex noise distribution. The model is cast in the nonparametric Bayesian framework, using a Dirichlet process mixture to infer the necessary number of Gaussian components. We derive a mean-field variational inference algorithm for the proposed nonparametric Bayesian model. Experimental results on both synthetic data and electroencephalogram (EEG) recordings demonstrate that NPNMF extracts the latent nonnegative factors more accurately than state-of-the-art methods.
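The idea described above can be illustrated with a minimal sketch. This is not the paper's NPNMF algorithm: it uses off-the-shelf scikit-learn components (a plain multiplicative-update NMF, followed by a truncated Dirichlet process Gaussian mixture fit to the residuals via variational inference) merely to show how a DP mixture can adaptively choose the number of Gaussian noise components. All variable names and the two-stage setup are illustrative assumptions.

```python
import numpy as np
from sklearn.decomposition import NMF
from sklearn.mixture import BayesianGaussianMixture

# Synthetic nonnegative data: low-rank signal plus nonnegative noise.
rng = np.random.default_rng(0)
W_true = rng.gamma(2.0, 1.0, size=(100, 3))
H_true = rng.gamma(2.0, 1.0, size=(3, 40))
X = W_true @ H_true + 0.1 * np.abs(rng.standard_normal((100, 40)))

# Stage 1 (illustrative, not the paper's joint model): standard NMF.
model = NMF(n_components=3, init="nndsvda", max_iter=500, random_state=0)
W = model.fit_transform(X)
H = model.components_

# Stage 2: approximate the residual noise distribution with a truncated
# Dirichlet process Gaussian mixture, fit by variational inference.
residuals = (X - W @ H).reshape(-1, 1)
dpgmm = BayesianGaussianMixture(
    n_components=5,  # truncation level for the DP
    weight_concentration_prior_type="dirichlet_process",
    random_state=0,
).fit(residuals)

# The DP prior shrinks the weights of unneeded components toward zero,
# so the effective number of Gaussians is inferred from the data.
active = int((dpgmm.weights_ > 0.05).sum())
print("active noise components:", active)
```

In the paper's actual model the factorization and the noise mixture are inferred jointly under one hierarchical Bayesian model, rather than in two separate stages as in this sketch.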
