MEEC: Second Workshop on Momentary Emotion Elicitation and Capture

Recognizing human emotions and responding appropriately has the potential to radically change the way we interact with technology. However, to train machines to sensibly detect and recognize human emotions, we need valid emotion ground truths. A fundamental challenge here is momentary emotion elicitation and capture (MEEC) from individuals continuously and in real time, without adversely affecting the user experience or breaching ethical standards. In this virtual half-day CHI 2021 workshop, we will (1) hold participant talks and an inspirational keynote presentation, (2) ideate elicitation, sensing, and annotation techniques, and (3) create mappings of when to apply an elicitation method.
