A dataset of continuous affect annotations and physiological signals for emotion analysis
Karan Sharma | Claudio Castellini | Egon L. van den Broek | Alin Albu-Schäffer | Friedhelm Schwenker
[1] Ellen Elizabeth Bartolini, et al. Eliciting Emotion with Film: Development of a Stimulus Set, 2011.
[2] E. Broek. Affective Signal Processing (ASP): Unraveling the mystery of emotions, 2011.
[3] Claudio Castellini, et al. Continuous, Real-Time Emotion Annotation: A Novel Joystick-Based Analysis Framework, 2020, IEEE Transactions on Affective Computing.
[4] Georgios N. Yannakakis, et al. Grounding truth via ordinal annotation, 2015, 2015 International Conference on Affective Computing and Intelligent Interaction (ACII).
[5] Emmanuel Dellandréa, et al. Deep learning vs. kernel methods: Performance for emotion prediction in videos, 2015, ACII.
[6] Fabien Ringeval, et al. Introducing the RECOLA multimodal corpus of remote collaborative and affective interactions, 2013, 2013 10th IEEE International Conference and Workshops on Automatic Face and Gesture Recognition (FG).
[7] Hiromi Nishiguchi, et al. Continuous Affect Rating in Cartesian Space of Pleasure and Arousal Scale by Joystick Without Visual Feedback, 2017, HCI.
[8] Andrew McStay, et al. Emotional AI: The Rise of Empathic Media, 2018.
[9] M. Bradley, et al. Measuring emotion: the Self-Assessment Manikin and the Semantic Differential, 1994, Journal of Behavior Therapy and Experimental Psychiatry.
[10] Subramanian Ramanathan, et al. DECAF: MEG-Based Multimodal Database for Decoding Affective Physiological Responses, 2015, IEEE Transactions on Affective Computing.
[11] J. Gross, et al. Emotion elicitation using films, 1995.
[12] Markus Kächele, et al. The Influence of Annotation, Corpus Design, and Evaluation on the Outcome of Automatic Classification of Human Emotions, 2016, Front. ICT.
[13] Friedhelm Schwenker, et al. Multimodal Pattern Recognition of Social Signals in Human-Computer-Interaction, 2012, Lecture Notes in Computer Science.
[14] Mohammad Soleymani, et al. Analysis of EEG Signals and Facial Expressions for Continuous Emotion Detection, 2016, IEEE Transactions on Affective Computing.
[15] Rosalind W. Picard. Affective Computing, 1997.
[16] Jeff Sauro, et al. When designing usability questionnaires, does it hurt to be positive?, 2011, CHI.
[17] Friedhelm Schwenker, et al. Multimodal Pattern Recognition of Social Signals in Human-Computer-Interaction, 2014, Lecture Notes in Computer Science.
[18] Friedhelm Schwenker, et al. A functional data analysis approach for continuous 2-D emotion annotations, 2019, Web Intell.
[19] Thierry Pun, et al. DEAP: A Database for Emotion Analysis Using Physiological Signals, 2012, IEEE Transactions on Affective Computing.
[20] Egon L. van den Broek, et al. Affective Man-Machine Interface: Unveiling Human Emotions through Biosignals, 2009, BIOSTEC.
[21] Jeffrey M. Girard, et al. DARMA: Software for dual axis rating and media annotation, 2017, Behavior Research Methods.
[22] Roddy Cowie, et al. FEELTRACE: an instrument for recording perceived emotion in real time, 2000.
[23] Claudio Castellini, et al. Continuous affect state annotation using a joystick-based user interface: Exploratory data analysis, 2016.
[24] Markus Ringnér, et al. What is principal component analysis?, 2008, Nature Biotechnology.
[25] Heiga Zen, et al. WaveNet: A Generative Model for Raw Audio, 2016, SSW.
[26] Maja Pantic, et al. IEEE Transactions on Affective Computing.
[27] Mohammad Soleymani, et al. Continuous emotion detection in response to music videos, 2011, Face and Gesture 2011.
[28] J. Russell. Core affect and the psychological construction of emotion, 2003, Psychological Review.
[29] Johannes Hewig, et al. A revised film set for the induction of basic emotions, 2005.
[30] Michael Hanke, et al. A studyforrest extension, simultaneous fMRI and eye gaze recordings during prolonged natural stimulation, 2016, Scientific Data.
[31] Eckart Altenmüller, et al. EMuJoy: Software for continuous measurement of perceived emotions in music, 2007, Behavior Research Methods.
[32] Fabien Ringeval, et al. Proceedings of the 7th Annual Workshop on Audio/Visual Emotion Challenge, 2017, AVEC@ACM Multimedia.
[33] Thierry Pun, et al. Toolbox for Emotional feAture extraction from Physiological signals (TEAP), 2017, Front. ICT.
[34] Egon L. van den Broek, et al. Continuous affect state annotation using a joystick-based user interface, 2014.
[35] Fabien Ringeval, et al. AVEC 2017: Real-life Depression, and Affect Recognition Workshop and Challenge, 2017, AVEC@ACM Multimedia.
[36] Ioannis Pavlidis, et al. A multimodal dataset for various forms of distracted driving, 2016, Scientific Data.
[37] Daniel Leidner, et al. EDAN: EMG-controlled Daily Assistant, 2017, HRI.
[38] Angeliki Metallinou, et al. Annotation and processing of continuous emotional attributes: Challenges and opportunities, 2013, 2013 10th IEEE International Conference and Workshops on Automatic Face and Gesture Recognition (FG).
[39] Marianna Obrist, et al. Emotional ratings and skin conductance response to visual, auditory and haptic stimuli, 2018, Scientific Data.
[40] E. Vesterinen, et al. Affective Computing, 2009, Encyclopedia of Biometrics.