Local Learning Rules for Nonnegative Tucker Decomposition

The analysis of high-dimensional data in modern applications, such as spectral analysis, neuroscience, and chemometrics, naturally requires tensorial approaches that differ from standard matrix factorizations (PCA, ICA, NMF). The Tucker decomposition and its constrained variants with sparsity and/or nonnegativity constraints allow the extraction of a different number of hidden factors in each mode and permit interactions within and across modalities, with many potential applications in computational neuroscience, text mining, and data analysis. In this paper, we propose a new algorithm for Nonnegative Tucker Decomposition (NTD) based on the constrained minimization of a set of local cost functions, which makes it suitable for large-scale problems. Extensive experiments confirm the validity and high performance of the developed algorithms in comparison with other well-known algorithms.
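To make the NTD setting concrete, the sketch below fits the model X ≈ G ×₁ A ×₂ B ×₃ C with all entries nonnegative. This is a minimal NumPy illustration using standard multiplicative updates (in the spirit of NMF), not the local learning rules proposed in the paper; the function names `unfold`, `fold`, `mode_dot`, and `ntd` are our own.

```python
import numpy as np

def unfold(T, mode):
    # Mode-n matricization: move axis `mode` to the front, flatten the rest (C order).
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def fold(M, mode, shape):
    # Inverse of `unfold` for a target tensor of the given shape.
    full = [shape[mode]] + [s for i, s in enumerate(shape) if i != mode]
    return np.moveaxis(M.reshape(full), 0, mode)

def mode_dot(T, M, mode):
    # Mode-n product T x_n M: multiply matrix M along axis `mode` of tensor T.
    shape = T.shape[:mode] + (M.shape[0],) + T.shape[mode + 1:]
    return fold(M @ unfold(T, mode), mode, shape)

def ntd(X, ranks, n_iter=300, eps=1e-9, seed=0):
    # Nonnegative Tucker decomposition via multiplicative updates (illustrative sketch).
    rng = np.random.default_rng(seed)
    factors = [rng.random((d, r)) for d, r in zip(X.shape, ranks)]
    G = rng.random(ranks)
    for _ in range(n_iter):
        # Update each factor matrix with an NMF-style multiplicative rule.
        for n in range(X.ndim):
            others = [factors[m] for m in range(X.ndim) if m != n]
            W = others[0]
            for O in others[1:]:
                W = np.kron(W, O)          # Kronecker product of the other factors
            Gn, Xn = unfold(G, n), unfold(X, n)
            num = Xn @ W @ Gn.T
            den = factors[n] @ Gn @ W.T @ W @ Gn.T + eps
            factors[n] *= num / den         # entries stay nonnegative
        # Update the core tensor G.
        num, den = X, G
        for n in range(X.ndim):
            num = mode_dot(num, factors[n].T, n)
            den = mode_dot(den, factors[n].T @ factors[n], n)
        G *= num / (den + eps)
    return G, factors
```

On a small tensor generated from an exact nonnegative Tucker model, a few hundred iterations typically drive the relative reconstruction error well below that of the random initialization, while all factors and the core remain elementwise nonnegative by construction.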