A Boosting Framework of Factorization Machine

Recently, Factorization Machines (FMs) have become increasingly popular in recommendation systems, owing to their effectiveness in capturing informative interactions between features. Typically, the interaction weights are learned as a low-rank weight matrix, formulated as the inner product of two low-rank matrices. This low-rank structure helps improve the generalization ability of Factorization Machines. However, choosing the rank properly usually requires running the algorithm many times with different ranks, which is clearly inefficient on large-scale datasets. To alleviate this issue, we propose an Adaptive Boosting framework of Factorization Machines (AdaFM), which can adaptively search for a proper rank on different datasets without re-training. Instead of using a fixed rank, the proposed algorithm gradually increases the rank according to its performance, following a boosting strategy, and stops once performance no longer improves. To verify the performance of the proposed framework, we conduct an extensive set of experiments on many real-world datasets. Encouraging empirical results show that the proposed algorithms are generally more effective than state-of-the-art Factorization Machines.
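To make the low-rank formulation above concrete, the following is a minimal sketch (not the authors' implementation) of the standard second-order FM prediction, where the rank-k factor matrix `V` plays the role of the low-rank factors whose inner products give the pairwise interaction weights; all names and shapes here are illustrative assumptions:

```python
import numpy as np

def fm_predict(x, w0, w, V):
    """Second-order Factorization Machine prediction for one sample x.

    V has shape (n_features, k): the rank-k factor matrix, so the
    interaction weight between features i and j is V[i] @ V[j].
    AdaFM's idea, as described in the abstract, is to grow k adaptively
    rather than fixing it in advance.
    """
    linear = w0 + w @ x
    # O(n*k) identity for the pairwise term:
    #   sum_{i<j} <v_i, v_j> x_i x_j
    #     = 0.5 * ( ||V^T x||^2 - sum_i ||v_i||^2 x_i^2 )
    Vx = V.T @ x
    interactions = 0.5 * (Vx @ Vx - ((V * V).T @ (x * x)).sum())
    return linear + interactions
```

Because each added rank only appends a column to `V`, a boosting-style procedure can fit rank increments one at a time and stop when a validation metric stalls, without retraining the earlier factors.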
