Using Eye-Tracking Data to Predict Situation Awareness in Real Time During Takeover Transitions in Conditionally Automated Driving

Situation awareness (SA) is critical to improving takeover performance during the transition from automated driving to manual driving. Although many studies have measured SA during or after the driving task, few have attempted to predict SA in real time in automated driving. In this work, we propose to predict SA during the takeover transition period in conditionally automated driving using eye-tracking and self-reported data. First, a tree-ensemble machine learning model, LightGBM (Light Gradient Boosting Machine), was used to predict SA. Second, in order to understand which factors influenced SA and how, SHAP (SHapley Additive exPlanations) values of the individual predictor variables in the LightGBM model were calculated. These SHAP values explained the prediction model by identifying the most important factors and their effects on SA, and they were further used for feature selection, which improved the performance of LightGBM. SA was standardized between 0 and 1 by aggregating three performance measures of SA (i.e., placement, distance, and speed estimation of surrounding vehicles with regard to the ego-vehicle) obtained when 33 participants recreated simulated driving scenarios after viewing 32 videos with six durations between 1 and 20 s. Using only eye-tracking data, our proposed model outperformed the other machine learning models examined, achieving a root-mean-squared error (RMSE) of 0.121, a mean absolute error (MAE) of 0.096, and a correlation coefficient of 0.719 between the predicted SA and the ground truth. The code is available at https://github.com/refengchou/Situation-awareness-prediction. Our proposed model has important implications for monitoring and predicting SA in real time in automated driving using eye-tracking data.
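The pipeline described above can be illustrated with a minimal sketch: a LightGBM regressor predicts SA in [0, 1] from eye-tracking features, SHAP values rank feature importance, and the top-ranked features are used to retrain the model before evaluating RMSE, MAE, and the correlation coefficient. The feature names, hyperparameters, and synthetic data below are illustrative assumptions, not the authors' actual setup (see the linked repository for that).

```python
# Minimal sketch: LightGBM regression + SHAP-based feature selection for SA prediction.
# All feature names, hyperparameters, and data here are hypothetical placeholders.
import numpy as np
import lightgbm as lgb
import shap
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error, mean_absolute_error

rng = np.random.default_rng(0)

# Hypothetical eye-tracking features (the paper's actual feature set may differ).
feature_names = ["fixation_duration", "fixation_count", "saccade_amplitude",
                 "saccade_velocity", "pupil_diameter", "blink_rate",
                 "gaze_dispersion_x", "gaze_dispersion_y"]
X = rng.normal(size=(1000, len(feature_names)))
# Synthetic SA labels bounded to [0, 1], loosely driven by two of the features.
y = np.clip(0.5 + 0.2 * X[:, 0] - 0.1 * X[:, 4] + 0.05 * rng.normal(size=1000), 0, 1)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

# Step 1: fit a LightGBM regression model on all features.
model = lgb.LGBMRegressor(n_estimators=200, learning_rate=0.05, random_state=0)
model.fit(X_train, y_train, feature_name=feature_names)

# Step 2: explain the model with SHAP values from a tree explainer.
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X_test)
mean_abs_shap = np.abs(shap_values).mean(axis=0)  # global importance per feature

# Step 3: keep the most influential features and retrain (simple top-k selection).
top_k = 4
top_idx = np.argsort(mean_abs_shap)[::-1][:top_k]
model_sel = lgb.LGBMRegressor(n_estimators=200, learning_rate=0.05, random_state=0)
model_sel.fit(X_train[:, top_idx], y_train)

# Step 4: evaluate with the metrics reported in the abstract (RMSE, MAE, correlation).
pred = model_sel.predict(X_test[:, top_idx])
rmse = np.sqrt(mean_squared_error(y_test, pred))
mae = mean_absolute_error(y_test, pred)
corr = np.corrcoef(y_test, pred)[0, 1]
print(f"RMSE={rmse:.3f}  MAE={mae:.3f}  r={corr:.3f}")
```

In this sketch the top-k cutoff is a simple stand-in for feature selection; the reported results (RMSE 0.121, MAE 0.096, r 0.719) come from the authors' own data and model, not from this synthetic example.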
