Predicting Viewer’s Preference for Music Videos Using EEG Dataset

We study the prediction of a viewer's preference for a music video from the EEG signals recorded in the DEAP dataset. The experimental results show that, with full-length 12-channel EEG recordings, the classification accuracy reaches 70%. However, if the EEG is reduced to 5 channels with independent component analysis (ICA), only a few EEG segments are needed to achieve even higher accuracy, up to around 75%, when an autoencoder network is used.
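The sketch below illustrates the general kind of pipeline described above, not the authors' exact method or hyperparameters: reduce each trial to 5 ICA components, cut the result into short segments (a 2-second window is assumed here), compress the segments with a small autoencoder, and classify preference from the learned codes. The data are random placeholders with DEAP-like dimensions (40 trials, 32 EEG channels, 63 s at 128 Hz), and the like/dislike labels, window length, and network sizes are illustrative assumptions.

```python
"""Illustrative sketch only: ICA channel reduction + autoencoder features
for segment-level preference classification. Dimensions mimic DEAP, but the
data, labels, and hyperparameters are placeholders, not the paper's setup."""

import numpy as np
import torch
import torch.nn as nn
from sklearn.decomposition import FastICA
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Placeholder data with DEAP-like shape: 40 trials, 32 channels, 63 s at 128 Hz.
n_trials, n_channels, fs, seconds = 40, 32, 128, 63
eeg = rng.standard_normal((n_trials, n_channels, fs * seconds)).astype(np.float32)
labels = rng.integers(0, 2, size=n_trials)  # hypothetical like/dislike labels

# Step 1: reduce each trial from 32 channels to 5 ICA components.
def reduce_to_components(trial, n_components=5):
    ica = FastICA(n_components=n_components, random_state=0, max_iter=500)
    return ica.fit_transform(trial.T).T  # (5, n_samples)

reduced = np.stack([reduce_to_components(t) for t in eeg])

# Step 2: cut each trial into short segments (assumed 2-second windows).
win = 2 * fs
segments, seg_labels = [], []
for trial, y in zip(reduced, labels):
    for start in range(0, trial.shape[1] - win + 1, win):
        segments.append(trial[:, start:start + win].reshape(-1))
        seg_labels.append(y)
X = np.stack(segments).astype(np.float32)
y = np.array(seg_labels)

# Step 3: a small fully connected autoencoder compresses each segment.
class AutoEncoder(nn.Module):
    def __init__(self, dim_in, dim_code=64):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(dim_in, 256), nn.ReLU(),
                                     nn.Linear(256, dim_code))
        self.decoder = nn.Sequential(nn.Linear(dim_code, 256), nn.ReLU(),
                                     nn.Linear(256, dim_in))

    def forward(self, x):
        return self.decoder(self.encoder(x))

ae = AutoEncoder(X.shape[1])
opt = torch.optim.Adam(ae.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()
X_t = torch.from_numpy(X)
for epoch in range(20):  # short full-batch demo training loop
    opt.zero_grad()
    loss = loss_fn(ae(X_t), X_t)
    loss.backward()
    opt.step()

# Step 4: classify preference from the encoded segment features.
codes = ae.encoder(X_t).detach().numpy()
Xtr, Xte, ytr, yte = train_test_split(codes, y, test_size=0.3, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(Xtr, ytr)
print("segment-level accuracy:", clf.score(Xte, yte))
```

With random placeholder data the reported accuracy is meaningless; the point is the structure of the pipeline, in which segment-level features replace full-length recordings once the channel count has been reduced.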

[1] Y.-P. Lin et al., "EEG-Based Emotion Recognition in Music Listening," IEEE Transactions on Biomedical Engineering, 2010.

[2] T. Pun et al., "DEAP: A Database for Emotion Analysis Using Physiological Signals," IEEE Transactions on Affective Computing, 2012.

[3] S. D. You et al., "Classification of User Preference for Music Videos Based on EEG Recordings," 2020 IEEE 2nd Global Conference on Life Sciences and Technologies (LifeTech), 2020.