JUMP: A Joint Predictor for User Click and Dwell Time

With the recent proliferation of recommendation systems, there has been a lot of interest in session-based prediction methods, particularly those based on Recurrent Neural Networks (RNNs) and their variants. However, existing methods either ignore dwell time prediction, which plays an important role in measuring a user's engagement with content, or fail to handle very short or noisy sessions. In this paper, we propose a joint predictor, JUMP, for both user click and dwell time in session-based settings. To map its input into a feature vector, JUMP adopts a novel three-layered RNN structure that includes a fast-slow layer for very short sessions and an attention layer for noisy sessions. Experiments demonstrate that JUMP outperforms state-of-the-art methods in both user click and dwell time prediction.
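The sketch below is a minimal, hypothetical illustration (not the authors' code) of the kind of architecture the abstract describes: a session encoder with a fast-slow recurrent layer and an attention layer, feeding two heads that jointly predict the next click and its dwell time. The use of GRU cells, the layer sizes, the `slow_stride` parameter, and the loss weighting are all assumptions made for illustration only.

```python
# Hypothetical sketch of a fast-slow + attention session encoder with joint
# click / dwell-time heads. All design details are assumptions, not the paper's.
import torch
import torch.nn as nn
import torch.nn.functional as F


class JumpSketch(nn.Module):
    def __init__(self, num_items, emb_dim=64, hidden_dim=64, slow_stride=4):
        super().__init__()
        self.item_emb = nn.Embedding(num_items, emb_dim)
        # "Fast" cell updates at every click; "slow" cell updates only every
        # `slow_stride` steps, giving a coarser summary even for short sessions.
        self.fast_cell = nn.GRUCell(emb_dim, hidden_dim)
        self.slow_cell = nn.GRUCell(hidden_dim, hidden_dim)
        self.slow_stride = slow_stride
        # Additive attention over the fast states to down-weight noisy clicks.
        self.attn = nn.Linear(hidden_dim, 1)
        # Joint heads: next-item scores and a (log) dwell-time estimate.
        self.click_head = nn.Linear(hidden_dim, num_items)
        self.dwell_head = nn.Linear(hidden_dim, 1)

    def forward(self, item_ids):
        # item_ids: (batch, seq_len) indices of clicked items in one session.
        batch, seq_len = item_ids.shape
        x = self.item_emb(item_ids)
        h_fast = x.new_zeros(batch, self.fast_cell.hidden_size)
        h_slow = x.new_zeros(batch, self.slow_cell.hidden_size)
        fast_states = []
        for t in range(seq_len):
            h_fast = self.fast_cell(x[:, t], h_fast)
            if (t + 1) % self.slow_stride == 0:
                h_slow = self.slow_cell(h_fast, h_slow)
            fast_states.append(h_fast)
        states = torch.stack(fast_states, dim=1)        # (batch, seq, hidden)
        weights = F.softmax(self.attn(states), dim=1)   # attention over steps
        session_vec = (weights * states).sum(dim=1) + h_slow
        return self.click_head(session_vec), self.dwell_head(session_vec).squeeze(-1)


# Usage: joint loss = cross-entropy on the next click + MSE on log dwell time.
model = JumpSketch(num_items=1000)
clicks = torch.randint(0, 1000, (8, 12))     # toy batch of 8 sessions, length 12
next_click = torch.randint(0, 1000, (8,))
log_dwell = torch.rand(8)
click_logits, dwell_pred = model(clicks)
loss = F.cross_entropy(click_logits, next_click) + F.mse_loss(dwell_pred, log_dwell)
loss.backward()
```

The single shared session vector feeding both heads reflects the joint-prediction idea; how the paper actually combines the fast, slow, and attention layers may differ from this toy arrangement.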
