A Database of Japanese Emotional Signals Elicited by Real Experiences

This paper presents a Japanese emotional database containing speech and physiological signals that can be used to develop emotion recognition algorithms based on audio, physiological signals, or combinations of both. The database was constructed primarily to support research on emotions for health-care-oriented applications. All six basic human emotions were elicited using real emotional experiences, which have differing impacts on health conditions. We also describe the experimental setup and protocols. The database includes signals from more than 50 participants.
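As a rough illustration of how such a database could be used for multimodal emotion recognition, the sketch below fuses simple segment-level acoustic descriptors with basic physiological statistics and trains a single classifier over the six basic emotions. All data, feature choices, and labels here are placeholders assumed for illustration; the actual file formats and annotation scheme of the database are not described in this abstract.

```python
# Minimal sketch of feature-level fusion of audio and physiological signals
# for six-class emotion recognition. The data, features, and label layout are
# illustrative assumptions, not the database's actual format.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

def acoustic_features(audio):
    """Simple segment-level acoustic descriptors: energy and zero-crossing rate."""
    energy = float(np.mean(audio ** 2))
    zcr = float(np.mean(np.abs(np.diff(np.sign(audio))) > 0))
    return np.array([energy, zcr])

def physiological_features(signal):
    """Basic statistics over one physiological channel (e.g., ECG or skin temperature)."""
    return np.array([signal.mean(), signal.std(), signal.max() - signal.min()])

# Placeholder recordings standing in for database trials: 60 trials, 6 emotions.
n_trials, n_classes = 60, 6
X = np.vstack([
    np.concatenate([
        acoustic_features(rng.standard_normal(16000)),       # ~1 s of "speech" at 16 kHz
        physiological_features(rng.standard_normal(1000)),   # one physiological channel
    ])
    for _ in range(n_trials)
])
y = np.repeat(np.arange(n_classes), n_trials // n_classes)    # balanced emotion labels 0..5

# Feature-level fusion: concatenated audio + physiological features, one classifier.
clf = RandomForestClassifier(n_estimators=100, random_state=0)
print("CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```

With real recordings, the placeholder arrays would be replaced by loaded speech segments and physiological channels from the database, and the hand-crafted descriptors could be swapped for richer features (e.g., spectral or wavelet-based) without changing the fusion structure.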
