AMHUSE: a multimodal dataset for HUmour SEnsing

We present AMHUSE (A Multimodal dataset for HUmour SEnsing) along with a novel web-based annotation tool named DANTE (Dimensional ANnotation Tool for Emotions). The dataset is the result of an amusement elicitation experiment involving 36 subjects, whose reactions were recorded while they watched 3 amusing and 1 neutral video stimuli. The gathered data include RGB video and depth sequences, together with physiological responses (electrodermal activity, blood volume pulse, and temperature). The videos were subsequently annotated by 4 experts along the continuous dimensions of valence and arousal. Both the dataset and the annotation tool are made publicly available for research purposes.
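For continuous valence/arousal annotations produced by multiple experts, inter-rater agreement is commonly quantified with Lin's concordance correlation coefficient (CCC). The abstract does not specify the agreement measure used, so the following is only an illustrative sketch of how two annotators' continuous traces might be compared; the function name and toy data are assumptions, not part of the dataset release.

```python
import numpy as np

def concordance_ccc(x, y):
    """Lin's concordance correlation coefficient between two rating traces.

    Returns 1 for perfect agreement, 0 for no concordance, and negative
    values for systematic disagreement. Uses population (ddof=0) moments.
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()              # population variances
    cov = ((x - mx) * (y - my)).mean()     # population covariance
    return 2.0 * cov / (vx + vy + (mx - my) ** 2)

# Hypothetical example: two annotators' valence traces over 5 frames.
a1 = [0.1, 0.3, 0.5, 0.6, 0.8]
a2 = [0.2, 0.3, 0.4, 0.7, 0.7]
agreement = concordance_ccc(a1, a2)
```

Unlike Pearson correlation, the CCC penalizes differences in annotator mean and scale, which matters when raters use the valence/arousal axes with different offsets.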
