Computational Complexity Reduction for Factorization-Based Collaborative Filtering Algorithms

Alternating least squares (ALS) is a powerful matrix factorization (MF) algorithm for both implicit- and explicit-feedback recommender systems. We show that the Sherman-Morrison formula (SMF) reduces the computational complexity of several ALS-based algorithms, and that it also reduces the complexity of greedy forward and backward feature selection algorithms by an order of magnitude. For users with few ratings we propose linear kernel ridge regression (KRR). We show that both SMF and KRR can efficiently incorporate new ratings.
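To illustrate the kind of speed-up the SMF enables, the sketch below (not the paper's implementation; the function name, factor dimension K, and regularization weight lam are illustrative assumptions) folds a single new rating into the cached inverse of the ALS normal-equation matrix via a rank-one Sherman-Morrison update in O(K^2), rather than re-inverting the K-by-K matrix in O(K^3).

```python
import numpy as np

def sherman_morrison_update(A_inv, u, v):
    """Rank-one update of a cached inverse:
    returns (A + u v^T)^{-1} given A^{-1}, in O(K^2) time."""
    Au = A_inv @ u                       # A^{-1} u
    vA = v @ A_inv                       # v^T A^{-1}
    denom = 1.0 + v @ Au                 # 1 + v^T A^{-1} u
    return A_inv - np.outer(Au, vA) / denom

# Hypothetical usage: user u rates a new item i, so the ALS user step's
# normal-equation matrix Q_u^T Q_u + lam*I gains the rank-one term q_i q_i^T.
K = 10                                    # latent factor dimension (assumed)
lam = 0.1                                 # L2 regularization weight (assumed)
rng = np.random.default_rng(0)

Q_u = rng.standard_normal((5, K))         # factors of items user u already rated
r_u = rng.standard_normal(5)              # the corresponding ratings
A_inv = np.linalg.inv(Q_u.T @ Q_u + lam * np.eye(K))  # cached inverse
b = Q_u.T @ r_u                           # right-hand side Q_u^T r_u

q_i, r_new = rng.standard_normal(K), 4.0  # new item's factor vector and rating
A_inv = sherman_morrison_update(A_inv, q_i, q_i)      # O(K^2) instead of O(K^3)
b += r_new * q_i
p_u = A_inv @ b                           # refreshed user factor vector
```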
