Recognizing Frustration of Drivers From Face Video Recordings and Brain Activation Measurements With Functional Near-Infrared Spectroscopy

Experiencing frustration while driving can impair cognitive processing, result in aggressive behavior, and hence negatively influence driving performance and traffic safety. Being able to automatically detect frustration would allow adaptive driver assistance and automation systems to react adequately to a driver’s frustration and mitigate potential negative consequences. To identify reliable and valid indicators of driver frustration, we conducted two driving simulator experiments. In the first experiment, we aimed to reveal facial expressions that indicate frustration in continuous video recordings of the driver’s face taken while driving highly realistic simulator scenarios in which frustrated or non-frustrated emotional states were experienced. An automated analysis of facial expressions combined with multivariate logistic regression classification revealed that frustrated time intervals can be discriminated from non-frustrated ones with an accuracy of 62.0% (mean over 30 participants). A further analysis of the facial expressions revealed that frustrated drivers tend to activate muscles in the mouth region (chin raiser, lip pucker, lip pressor). In the second experiment, we measured cortical activation with almost whole-head functional near-infrared spectroscopy (fNIRS) while participants experienced frustrating and non-frustrating driving simulator scenarios. Multivariate logistic regression applied to the fNIRS measurements allowed us to discriminate between frustrated and non-frustrated driving intervals with a higher accuracy of 78.1% (mean over 12 participants). Frustrated driving intervals were indicated by increased activation in the inferior frontal, putative premotor, and occipito-temporal cortices. Our results show that facial and cortical markers of frustration can be informative for time-resolved driver-state identification in complex, realistic driving situations. The markers derived here can potentially be used as an input for future adaptive driver assistance and automation systems that detect driver frustration and adaptively react to mitigate it.
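To illustrate the classification approach described above, the following minimal sketch trains a multivariate logistic regression to discriminate two classes of time intervals from facial action-unit intensities. The data here are synthetic stand-ins for illustration only (the feature distributions and class means are assumptions, not the study's actual recordings or results):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for per-interval facial action-unit intensities
# (e.g. chin raiser, lip pucker, lip pressor); NOT the study's real data.
n = 200
X_frustrated = rng.normal(loc=0.6, scale=0.3, size=(n, 3))
X_neutral = rng.normal(loc=0.3, scale=0.3, size=(n, 3))
X = np.vstack([X_frustrated, X_neutral])
y = np.concatenate([np.ones(n), np.zeros(n)])


def train_logreg(X, y, lr=0.5, epochs=500):
    """Fit multivariate logistic regression by batch gradient descent."""
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])  # append bias column
    w = np.zeros(Xb.shape[1])
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-Xb @ w))          # predicted probabilities
        w -= lr * Xb.T @ (p - y) / len(y)          # gradient of log-loss
    return w


def predict(X, w):
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])
    return (1.0 / (1.0 + np.exp(-Xb @ w)) > 0.5).astype(int)


w = train_logreg(X, y)
accuracy = np.mean(predict(X, w) == y)
print(f"training accuracy: {accuracy:.2f}")
```

In the experiments themselves, such a classifier would be evaluated per participant with held-out intervals rather than on training data; this sketch only shows the model family, not the study's validation scheme.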
