Integration of Driver Behavior into Emotion Recognition Systems: A Preliminary Study on Steering Wheel and Vehicle Acceleration

Current development of in-car emotion recognition systems focuses mostly on camera-based solutions that treat the face as the main input source. Modeling driver behavior in the automotive domain is likewise a challenging topic with a strong impact on the development of intelligent and autonomous vehicles. To study the correlation between driving behavior and the driver's emotional state, we propose a multimodal system based on facial expressions and driver-specific behavior, namely steering wheel usage and changes in vehicle acceleration. The aim of this work is to investigate the impact of integrating driver behavior into emotion recognition systems and to build a structure that classifies emotions continuously, efficiently, and non-intrusively. We consider driver behavior to be the typical range of interactions with the vehicle that represents the driver's responses to certain stimuli. To recognize facial emotions, we extract histogram values from key facial regions and combine them into a single vector that is used to train an SVM classifier. We then build two further modules using machine learning and statistical methods: an abrupt-maneuver counter based on steering wheel rotation and an aggressive-driver predictor based on the variation of acceleration. Finally, all three modules are combined into a single emotion classifier that predicts the driver's emotional group with 94% accuracy on sub-samples. For evaluation we used a real car simulator with eight participants as drivers.
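
As a rough illustration of the facial-expression branch only (a minimal sketch, not the authors' implementation), the Python code below extracts histogram-of-gradients descriptors from a few face regions, concatenates them into a single feature vector, and fits an SVM. The region coordinates, image size, and training data are hypothetical placeholders introduced solely for this example.

# Sketch of the described pipeline: per-region histogram features -> one vector -> SVM.
# Region layout, image size, and labels are assumptions, not the paper's values.
import numpy as np
from skimage.feature import hog
from sklearn.svm import SVC

# Hypothetical key facial regions as (row, col, height, width) crops
# of a 128x128 aligned grayscale face image.
REGIONS = {"eyes": (20, 10, 30, 108), "mouth": (80, 30, 40, 68)}

def face_feature_vector(face_img):
    """Concatenate per-region histogram-of-gradients descriptors into one vector."""
    parts = []
    for (r, c, h, w) in REGIONS.values():
        crop = face_img[r:r + h, c:c + w]
        parts.append(hog(crop, orientations=8, pixels_per_cell=(8, 8),
                         cells_per_block=(2, 2)))
    return np.concatenate(parts)

# Training on synthetic data just to show the interface.
rng = np.random.default_rng(0)
faces = rng.random((40, 128, 128))      # stand-in for aligned face crops
labels = rng.integers(0, 3, size=40)    # stand-in emotion-group labels
X = np.stack([face_feature_vector(f) for f in faces])
clf = SVC(kernel="rbf").fit(X, labels)
print(clf.predict(X[:5]))

In the full system described above, the output of such a facial classifier would be fused with the steering-wheel and acceleration modules to produce the final emotion-group prediction.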
