Over the past decade, considerable research in audio content analysis has focused on extracting information from audio signals, particularly the moods they convey, since music expresses emotion in a concise yet effective way. People select music in congruence with their moods and emotions, which makes classifying music by mood an increasingly practical need. Because different individuals perceive the mood of a piece differently, the task is all the more difficult. This paper proposes an automated and efficient method to perceive the mood of any given music piece, or the "emotions" associated with it, by establishing a link between spectral and harmonic features and human perception of music and mood. Features such as rhythm, harmony, and spectral characteristics are studied in order to classify songs by mood, based on Thayer's model. The quantified feature values are compared against threshold values using neural networks before the songs are assigned to different mood labels. The method analyzes several features of a music piece, including the beat spectrum and roughness, before classifying it under a mood; a total of 8 different moods are considered. In particular, the paper classifies both Western and Indian Hindi film music, using a database of over 100 songs in total. The efficiency of this method was found to reach 94.44% at best.
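The following is a minimal sketch of the kind of pipeline the abstract describes, not the authors' implementation: spectral and rhythmic features are extracted from each track and fed to a small neural network that predicts one of 8 mood labels. The library choices (librosa, scikit-learn), the exact feature set, and the mood label names are assumptions made only for illustration.

```python
# Sketch of a feature-extraction + neural-network mood classifier,
# assuming librosa for audio features and scikit-learn for the network.
import numpy as np
import librosa
from sklearn.neural_network import MLPClassifier

# Hypothetical 8-label mood set spanning Thayer's arousal/valence plane.
MOODS = ["happy", "exuberant", "energetic", "frantic",
         "sad", "depressed", "calm", "contented"]

def extract_features(path):
    """Return a fixed-length feature vector for one audio file."""
    y, sr = librosa.load(path, mono=True, duration=60.0)
    tempo, _ = librosa.beat.beat_track(y=y, sr=sr)            # rhythm / tempo
    centroid = librosa.feature.spectral_centroid(y=y, sr=sr)  # spectral brightness
    rolloff = librosa.feature.spectral_rolloff(y=y, sr=sr)
    rms = librosa.feature.rms(y=y)                            # energy / intensity
    chroma = librosa.feature.chroma_stft(y=y, sr=sr)          # harmonic content
    return np.hstack([np.atleast_1d(tempo),
                      centroid.mean(), rolloff.mean(),
                      rms.mean(), chroma.mean(axis=1)])

def train_mood_classifier(paths, labels):
    """Fit a small MLP that maps feature vectors to mood labels."""
    X = np.vstack([extract_features(p) for p in paths])
    clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000, random_state=0)
    clf.fit(X, labels)
    return clf

# Usage (hypothetical file lists):
# clf = train_mood_classifier(train_files, train_mood_labels)
# print(clf.predict([extract_features("song.mp3")]))
```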