Mood-Aware Music Recommendation via Adaptive Song Embedding

In this paper, we propose an autonomous, adaptive recommendation system that relies on the user's mood and implicit feedback to recommend songs without any prior knowledge of the user's preferences. Our method autonomously builds a latent factor model (a generic song map per mood) from the available online data of many users, based on the associations extracted among user, song, user mood, and song emotion. It combines the Reinforcement Learning (RL) framework with the Page-Hinkley (PH) test to personalize the generic song map for each mood according to the user's implicit reward. We conduct a series of experiments on the LiveJournal Two-Million (LJ2M) dataset to show the effect of mood on music recommendation and how the proposed solution improves recommendation performance over time compared to conventional solutions in terms of hit rate and F1 score.
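As a rough illustration of the drift-detection component mentioned above, the Page-Hinkley test monitors a stream of values (here, implicit rewards) and raises an alarm when the running mean shifts upward beyond a threshold. The sketch below is a generic PH implementation, not the paper's code; the `delta` (tolerance) and `lam` (alarm threshold) defaults are illustrative assumptions.

```python
class PageHinkley:
    """Page-Hinkley test for detecting an upward shift in a stream's mean.

    Note: delta and lam are illustrative defaults, not values from the paper.
    """

    def __init__(self, delta=0.005, lam=5.0):
        self.delta = delta    # tolerance for small fluctuations
        self.lam = lam        # alarm threshold
        self.n = 0            # number of observations seen
        self.mean = 0.0       # running mean of the stream
        self.cum = 0.0        # cumulative deviation m_t
        self.min_cum = 0.0    # running minimum of m_t

    def update(self, x):
        """Feed one observation; return True when a drift alarm fires."""
        self.n += 1
        self.mean += (x - self.mean) / self.n        # incremental mean
        self.cum += x - self.mean - self.delta       # accumulate deviation
        self.min_cum = min(self.min_cum, self.cum)
        return self.cum - self.min_cum > self.lam    # PH statistic vs. threshold


# Example: a reward stream that jumps from 0 to 2 at step 100.
ph = PageHinkley(delta=0.005, lam=5.0)
fired_at = None
for i in range(120):
    x = 0.0 if i < 100 else 2.0
    if ph.update(x) and fired_at is None:
        fired_at = i  # alarm fires shortly after the shift
```

In the recommendation loop, such an alarm would signal that the user's implicit rewards have drifted, triggering a re-personalization of the current mood's song map.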