Mood Classification from Musical Audio Using User Group-Dependent Models

In this paper, we propose a music mood classification system that reflects a user's profile, based on the belief that music mood perception is subjective and can vary with profile attributes such as age and gender. To this end, we first define a set of generic mood descriptors. Second, we construct several user groups according to age and gender. We then collect musical items for each group and train a separate statistical model per group. Using two such user-group models, we verify our hypothesis that user profiles play an important role in mood perception by showing that each model achieves higher classification accuracy when the test data come from the same user group as the model. Applying our system to automatic playlist generation, we also demonstrate that accounting for differences in mood perception between user groups has a significant effect on computing music similarity.
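The evaluation protocol described above, training one model per user group and comparing matched versus mismatched group accuracy, can be sketched as follows. This is a minimal toy illustration, not the paper's actual method: the nearest-centroid classifier, the two-dimensional feature values, and the group data are all hypothetical stand-ins for the statistical models and audio features used in the system.

```python
# Hypothetical sketch: one mood model per user group, evaluated on
# matched vs. mismatched groups. All data and features are illustrative.

def train_centroids(items):
    """items: list of (feature_vector, mood). Returns mood -> mean vector."""
    sums, counts = {}, {}
    for vec, mood in items:
        acc = sums.setdefault(mood, [0.0] * len(vec))
        for i, v in enumerate(vec):
            acc[i] += v
        counts[mood] = counts.get(mood, 0) + 1
    return {m: [s / counts[m] for s in acc] for m, acc in sums.items()}

def classify(model, vec):
    """Assign the mood whose centroid is closest in squared distance."""
    def dist(centroid):
        return sum((a - b) ** 2 for a, b in zip(vec, centroid))
    return min(model, key=lambda mood: dist(model[mood]))

def accuracy(model, items):
    return sum(classify(model, v) == mood for v, mood in items) / len(items)

# Toy data: the two groups label the same acoustic region with opposite
# moods, mimicking subjective, group-dependent mood perception.
group_a = [([0.2, 0.1], "calm"), ([0.8, 0.9], "energetic"),
           ([0.3, 0.2], "calm"), ([0.7, 0.8], "energetic")]
group_b = [([0.2, 0.1], "energetic"), ([0.8, 0.9], "calm"),
           ([0.3, 0.2], "energetic"), ([0.7, 0.8], "calm")]

model_a = train_centroids(group_a)
model_b = train_centroids(group_b)

print(accuracy(model_a, group_a))  # matched group:    1.0
print(accuracy(model_b, group_a))  # mismatched group: 0.0
```

In this deliberately extreme setup, the matched model is perfectly accurate on its own group while the mismatched model fails completely, which is the pattern (in milder form) that the paper's cross-group experiments are designed to reveal.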