Towards Real-Time Multimodal Emotion Recognition among Couples

Researchers are interested in understanding the emotions of couples as they relate to relationship quality and the dyadic management of chronic diseases. Currently, the process of assessing emotions is manual, time-intensive, and costly. Despite existing work on emotion recognition among couples, there is no ubiquitous system that recognizes the emotions of couples in everyday life while addressing the complexity of dyadic interactions, such as turn-taking in couples' conversations. In this work, we seek to develop a smartwatch-based system that leverages multimodal sensor data to recognize each partner's emotions in daily life. We are collecting data from couples in the lab and in the field, and we plan to use these data to develop multimodal machine learning models for emotion recognition. We then plan to implement the best models in a smartwatch app and evaluate its performance in real time and in everyday life through another field study. Such a system could enable research both in the lab (e.g., couple therapy) and in daily life (e.g., assessment of chronic disease management or relationship quality) and enable interventions to improve couples' emotional well-being, relationship quality, and chronic disease management.
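One common way to combine multimodal sensor data for emotion recognition is feature-level (early) fusion: per-window features from each modality are concatenated and fed to a single classifier. The sketch below illustrates this under stated assumptions; the feature matrices are random placeholders standing in for, e.g., acoustic features from speech and movement features from the smartwatch accelerometer, and the binary valence labels are hypothetical. It is a minimal illustration, not the system's actual pipeline.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Placeholder features: in a deployed system these would be computed per
# time window from smartwatch sensors (speech audio, accelerometer, etc.).
n_samples = 200
audio_feats = rng.normal(size=(n_samples, 16))   # e.g., prosodic/spectral statistics
motion_feats = rng.normal(size=(n_samples, 8))   # e.g., accelerometer statistics
labels = rng.integers(0, 2, size=n_samples)      # hypothetical binary valence labels

# Early fusion: concatenate each modality's features into one vector per window.
X = np.concatenate([audio_feats, motion_feats], axis=1)

X_tr, X_te, y_tr, y_te = train_test_split(X, labels, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
preds = clf.predict(X_te)  # one predicted emotion label per held-out window
```

Late (decision-level) fusion, where a separate model per modality is trained and the predictions are combined, is the usual alternative when modalities have very different sampling rates or missingness patterns.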
