Gaussian Compression Stream: Principle and Preliminary Results

Random projections have become popular tools for processing big data. In particular, when applied to Nonnegative Matrix Factorization (NMF), structured random projections were shown to be far more efficient than classical strategies based on Gaussian compression. However, they remain costly and might not fully benefit from recent fast random projection techniques. In this paper, we therefore investigate an alternative to structured random projections, named Gaussian compression stream, which (i) is based on Gaussian compressions only, (ii) can benefit from the above fast techniques, and (iii) is shown to be well-suited to NMF.
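
As a rough illustration of the kind of scheme the abstract refers to, the sketch below applies Gaussian compression to the two sides of an NMF update, redrawing fresh Gaussian matrices at each iteration (a "stream" of compressions). The function name `gaussian_compressed_nmf`, the `compression_level` parameter, and the multiplicative-style updates are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np

def gaussian_compressed_nmf(X, rank, n_iter=100, compression_level=10, seed=0):
    """Hypothetical sketch: NMF (X ~ W @ H, W, H >= 0) with Gaussian random projections.

    At each iteration, fresh Gaussian compression matrices L and R reduce the
    row/column dimensions of X before the factor updates.
    """
    rng = np.random.default_rng(seed)
    m, n = X.shape
    nu = rank + compression_level  # compressed dimension (assumed oversampling rule)
    W = np.abs(rng.standard_normal((m, rank)))
    H = np.abs(rng.standard_normal((rank, n)))
    eps = 1e-12

    for _ in range(n_iter):
        # Fresh Gaussian compression of the rows: L @ X has nu rows instead of m.
        L = rng.standard_normal((nu, m)) / np.sqrt(nu)
        LX, LW = L @ X, L @ W
        # Multiplicative-style update of H on the compressed problem
        # (negative entries of the compressed cross-products are clipped).
        H *= np.maximum(LW.T @ LX, 0) / np.maximum(LW.T @ LW @ H, eps)

        # Fresh Gaussian compression of the columns: X @ R has nu columns instead of n.
        R = rng.standard_normal((n, nu)) / np.sqrt(nu)
        XR, HR = X @ R, H @ R
        # Multiplicative-style update of W on the compressed problem.
        W *= np.maximum(XR @ HR.T, 0) / np.maximum(W @ (HR @ HR.T), eps)

    return W, H
```

Usage under these assumptions: `W, H = gaussian_compressed_nmf(X, rank=5)` for a nonnegative data matrix `X`; each update then costs roughly `nu` rather than `m` (or `n`) inner dimensions, which is where the speed-up of compression-based NMF comes from.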
