Protocol for Eliciting Driver Frustration in an In-vehicle Environment

A state of frustration can impair a driver's ability to make decisions that protect their own safety and the safety of those around them. Equipping a car with the capability to detect signs of driver frustration and respond with appropriate interventions can be an effective way to improve driver safety. In this paper, we first describe the design and implementation of a novel protocol for eliciting genuine frustration of varying intensities in participants interacting with an in-car Human Machine Interface (HMI) while driving in a simulator. We detail the instrumentation used to capture the participants' interactions with the HMI. We then provide a computational analysis of the signs of frustration that participants displayed in the facial and vocal modalities and discuss the trends observed. Finally, we present baseline machine learning methods, trained on features computed from the facial and vocal modalities, that predict the difficulty of the completed task and whether the participant is multitasking, validating the assumptions that informed our protocol design.
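As an illustrative sketch only (not the authors' implementation), a baseline of the kind described above could pair per-segment facial and vocal feature vectors with a standard classifier to predict task difficulty. The array shapes, feature dimensions, and label encoding below are hypothetical placeholders; real features would come from facial action unit estimates and vocal prosody statistics.

```python
# Hypothetical sketch of a baseline classifier over facial + vocal features.
# Feature extraction (e.g., AU activations, prosodic statistics) is assumed
# to have been done already; all arrays, dimensions, and labels are illustrative.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Placeholder data: one row per completed task segment.
X_face = rng.normal(size=(200, 17))   # e.g., mean facial action unit intensities (dimension hypothetical)
X_voice = rng.normal(size=(200, 34))  # e.g., pitch/energy summary statistics (dimension hypothetical)
X = np.hstack([X_face, X_voice])      # early fusion of the two modalities
y_difficulty = rng.integers(0, 2, size=200)  # easy vs. hard task, binary encoding (illustrative)

# Simple baseline: standardized features fed to an RBF-kernel SVM,
# evaluated with 5-fold cross-validation.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
scores = cross_val_score(clf, X, y_difficulty, cv=5)
print("Task-difficulty baseline accuracy: %.2f +/- %.2f" % (scores.mean(), scores.std()))
```

The same pipeline, with a different label vector, would serve for the multitasking prediction; with random placeholder data the reported accuracy is of course meaningless and only demonstrates the plumbing.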
