Learning Label Trees for Probabilistic Modelling of Implicit Feedback
[1] Qiang Yang, et al. One-Class Collaborative Filtering, 2008, Eighth IEEE International Conference on Data Mining (ICDM).
[2] Geoffrey E. Hinton, et al. A Scalable Hierarchical Distributed Language Model, 2008, NIPS.
[3] Ruslan Salakhutdinov, et al. Probabilistic Matrix Factorization, 2007, NIPS.
[4] Rong Pan, et al. Mind the Gaps: Weighting the Unknown in Large-Scale One-Class Collaborative Filtering, 2009, KDD.
[5] Yehuda Koren, et al. Factorization Meets the Neighborhood: A Multifaceted Collaborative Filtering Model, 2008, KDD.
[6] Yoshua Bengio, et al. Hierarchical Probabilistic Neural Network Language Model, 2005, AISTATS.
[7] Jason Weston, et al. Label Embedding Trees for Large Multi-Class Tasks, 2010, NIPS.
[8] Yifan Hu, et al. Collaborative Filtering for Implicit Feedback Datasets, 2008, Eighth IEEE International Conference on Data Mining (ICDM).
[9] Joshua Goodman, et al. Classes for Fast Maximum Entropy Training, 2001, IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP).
[10] Benjamin M. Marlin, et al. Collaborative Filtering: A Machine Learning Perspective, 2004.
[11] John Langford, et al. Conditional Probability Tree Estimation Analysis and Algorithms, 2009, UAI.
[12] Tommi S. Jaakkola, et al. Maximum-Margin Matrix Factorization, 2004, NIPS.
[13] Lars Schmidt-Thieme, et al. BPR: Bayesian Personalized Ranking from Implicit Feedback, 2009, UAI.
[14] Léon Bottou, et al. Large-Scale Machine Learning with Stochastic Gradient Descent, 2010, COMPSTAT.