Emotion recognition from Moroccan dialect speech and energy band distribution

This study proposes an approach to the automatic recognition of emotions from the speech signal [18], [13]. It relies on the extraction of spectral and prosodic features, in particular the energy of the signal and the distribution of that energy across six frequency bands. These features are computed from syllabic units (CV), where C denotes a consonant and V a vowel. To this end, we built a Moroccan emotional database (MEDB) from YouTube broadcasts representing different real emotional states. The feature set is obtained with MATLAB code and the Praat software [23]. The emotions studied are anger, joy, the neutral state and sadness. The emotion recognition systems are built with neural networks, support vector machines and decision tree algorithms. The aims of this work are, first, to explore syllabic units as a basis for identifying the emotions embedded in the speech signal and, second, to demonstrate the relevance of the energy distribution in determining emotional states.
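To make the band-energy feature concrete, the Python sketch below computes the fraction of a CV segment's spectral energy that falls in each of six frequency bands. It is a minimal illustration, not the authors' pipeline: the band edges, the Welch spectral estimate and the file name cv_segment.wav are assumptions, since the abstract does not specify them.

    import numpy as np
    from scipy.io import wavfile
    from scipy.signal import welch

    # Hypothetical band edges in Hz; the paper's six bands may differ.
    BAND_EDGES = [0, 500, 1000, 2000, 3000, 4000, 8000]

    def band_energy_distribution(x, fs, edges=BAND_EDGES):
        """Return the fraction of spectral energy in each band (sums to ~1)."""
        freqs, psd = welch(x, fs=fs, nperseg=min(1024, len(x)))
        total = psd.sum()
        return np.array([psd[(freqs >= lo) & (freqs < hi)].sum() / total
                         for lo, hi in zip(edges[:-1], edges[1:])])

    if __name__ == "__main__":
        fs, x = wavfile.read("cv_segment.wav")  # placeholder: one CV unit
        if x.ndim > 1:                          # down-mix stereo to mono
            x = x.mean(axis=1)
        feats = band_energy_distribution(x.astype(float), fs)
        print(feats)  # six-dimensional feature vector for a classifier
        # With labelled segments, the classifiers named above apply directly,
        # e.g. sklearn.svm.SVC().fit(X, y) on an N x 6 feature matrix.

Normalising each band's energy by the total makes the feature independent of overall signal level, which is the usual motivation for using an energy distribution rather than raw band energies.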

[1] K. Scherer et al., "Acoustic profiles in vocal emotion expression," Journal of Personality and Social Psychology, 1996.

[2] B. Schuller et al., "The Automatic Recognition of Emotions in Speech," 2011.

[3] P.-Y. Oudeyer et al., "The production and recognition of emotions in speech: features and algorithms," Int. J. Hum. Comput. Stud., 2003.

[4] A. Farchi et al., "Energy bands and spectral cues for Arabic vowels recognition," Int. J. Speech Technol., 2016.

[5] J.-C. Martin et al., "Collection and Annotation of a Corpus of Human-Human Multimodal Interactions: Emotion and Others Anthropomorphic Characteristics," ACII, 2007.

[6] A. Farchi et al., "Effect of Negative Emotions on the Fundamental Frequency and Formants," 2016.

[7] M. O. Hay, "Emotion recognition in human-computer interaction," 2012.

[8] J.-C. Martin et al., "Coding Emotional Events in Audiovisual Corpora," LREC, 2008.

[9] R. Cowie et al., "Emotional speech: Towards a new generation of databases," Speech Commun., 2003.

[10] A. Paeschke et al., "Prosodic Characteristics of Emotional Speech: Measurements of Fundamental Frequency Movements," 2000.

[11] K. R. Scherer, "Vocal communication of emotion," 2000.

[12] Neerincx et al., "Affective collaborative robots for safety & crisis management in the field," 2007.

[13] H. Kaya et al., "Efficient and effective strategies for cross-corpus acoustic emotion recognition," Neurocomputing, 2018.

[14] F. Ringeval et al., "Affective and behavioural computing: Lessons learnt from the First Computational Paralinguistics Challenge," Comput. Speech Lang., 2019.

[15] D. H. Milone et al., "Spoken emotion recognition using hierarchical classifiers," Comput. Speech Lang., 2011.

[16] I. Pitas et al., "The eNTERFACE’05 Audio-Visual Emotion Database," 22nd International Conference on Data Engineering Workshops (ICDEW'06), 2006.

[17] A. P. James et al., "Detection and Analysis of Emotion From Speech Signals," arXiv, 2015.

[18] P. Laukka et al., "Communication of emotions in vocal expression and music performance: different channels, same code?," Psychological Bulletin, 2003.

[19] C. Busso et al., "Interrelation Between Speech and Facial Gestures in Emotional Utterances: A Single Subject Study," IEEE Transactions on Audio, Speech, and Language Processing, 2007.

[20] P. Boersma et al., "Praat, a system for doing phonetics by computer," 2002.

[21] A. Farchi et al., "Arabic stop consonants characterisation and classification using the normalized energy in frequency bands," Int. J. Speech Technol., 2017.

[22] M. Ghulam et al., "Study on pharyngeal and uvular consonants in foreign accented Arabic for ASR," Comput. Speech Lang., 2010.

[23] L. Devillers et al., "Représentation et détection des émotions dans des dialogues enregistrés dans un centre d'appel. Des émotions complexes dans des données réelles" [Representation and detection of emotions in dialogues recorded in a call centre: complex emotions in real-world data], Rev. d'Intelligence Artif., 2006.