Performance of nonnegative latent factor models with β-distance functions in recommender systems

Nonnegative latent factor (NLF) models can effectively represent high-dimensional and sparse (HiDS) matrices filled with nonnegative data, which are frequently encountered in industrial applications such as recommender systems. Current NLF models mostly adopt the Euclidean distance or the Kullback-Leibler divergence as the objective function, which correspond to the special cases of β=2 and β=1 in the family of β-distance functions. When β is not restricted to these special cases, an NLF model's performance varies, making it highly attractive to investigate the resulting performance variations. We first divide β-distance-based objective functions into three categories, i.e., β=0, β=1, and β ∉ {0, 1}. Subsequently, we derive the nonnegative training rules corresponding to each kind of objective, thereby obtaining different NLF models. Experimental results on industrial matrices indicate that the frequently adopted cases of β=2 or β=1 do not necessarily yield the most accurate or efficient models. It is therefore promising to further improve the performance of NLF models by carefully tuning the β-distance function adopted as the training objective.
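To make the three categories concrete, the sketch below gives a minimal NumPy implementation of the element-wise β-distance (Itakura-Saito for β=0, Kullback-Leibler for β=1, and the general expression otherwise, which reduces to half the squared Euclidean distance at β=2), together with one round of standard mask-aware multiplicative updates for a factorization V ≈ WH restricted to observed entries. This is only an illustrative sketch under assumptions: the function names (beta_divergence, beta_nmf_step), the dense 0/1 mask, and the plain NumPy arrays are chosen for clarity and are not the paper's actual formulation, which targets large sparse HiDS matrices with its own derived training rules.

import numpy as np

def beta_divergence(x, y, beta, eps=1e-12):
    """Element-wise beta-distance d_beta(x, y) between nonnegative arrays x and y."""
    x = np.maximum(x, eps)
    y = np.maximum(y, eps)
    if beta == 0:                      # Itakura-Saito distance
        return x / y - np.log(x / y) - 1.0
    if beta == 1:                      # Kullback-Leibler divergence
        return x * np.log(x / y) - x + y
    # General case (beta not in {0, 1}); beta = 2 gives half the squared Euclidean distance.
    return (x**beta + (beta - 1.0) * y**beta - beta * x * y**(beta - 1.0)) / (beta * (beta - 1.0))

def beta_nmf_step(V, M, W, H, beta, eps=1e-12):
    """One pair of multiplicative updates for W and H on the observed entries of V.

    V : nonnegative rating matrix (unobserved entries may hold arbitrary values).
    M : 0/1 mask of observed entries, so the loss is sum over M of d_beta(V, W @ H).
    """
    WH = np.maximum(W @ H, eps)
    # Update W with the standard beta-NMF multiplicative rule; the mask zeroes out
    # unobserved entries so they contribute nothing to numerator or denominator.
    num = (M * V * WH**(beta - 2.0)) @ H.T
    den = np.maximum((M * WH**(beta - 1.0)) @ H.T, eps)
    W = W * num / den
    WH = np.maximum(W @ H, eps)
    # Update H symmetrically.
    num = W.T @ (M * V * WH**(beta - 2.0))
    den = np.maximum(W.T @ (M * WH**(beta - 1.0)), eps)
    H = H * num / den
    return W, H

A usage sketch: initialize W and H with small positive random values, call beta_nmf_step repeatedly, and monitor (M_test * beta_divergence(V, W @ H, beta)).sum() on held-out entries to compare the accuracy and convergence speed obtained with different values of β.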
