Ready for Take-Over? A New Driver Assistance System for an Automated Classification of Driver Take-Over Readiness

Recent studies analyzing driver behavior report that various factors may influence a driver's take-over readiness when resuming control after an automated driving section. However, little effort has been made to transfer and integrate these findings into an automated system that classifies the driver's take-over readiness and derives the expected take-over quality. This study introduces a new advanced driver assistance system that classifies the driver's take-over readiness in conditionally automated driving scenarios. The proposed system works preemptively, i.e., the driver is warned in advance if a low take-over readiness is to be expected. The classification of take-over readiness is based on three information sources: (i) the complexity of the traffic situation, (ii) the driver's current secondary task, and (iii) the driver's gazes at the road. An evaluation based on a driving simulator study with 81 subjects showed that the proposed system can detect take-over readiness with an accuracy of 79%. Moreover, the impact of the character of the take-over intervention on the classification result is investigated. Finally, a proof of concept of the novel driver assistance system is provided, showing that more than half of the drivers with a low take-over readiness would be warned preemptively, with only a 13% false alarm rate.
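To illustrate how the three information sources might be fused into a readiness decision, the following is a minimal sketch of a weighted scoring scheme. The feature encodings, weights, and threshold are illustrative assumptions for this sketch only; the abstract does not specify the paper's actual classifier.

```python
from dataclasses import dataclass


@dataclass
class TakeOverFeatures:
    """Hypothetical encodings of the three information sources."""
    traffic_complexity: float  # 0 (simple situation) .. 1 (highly complex)
    task_engagement: float     # 0 (no secondary task) .. 1 (fully absorbed)
    road_gaze_ratio: float     # fraction of recent time with gaze on the road


def classify_readiness(f: TakeOverFeatures, threshold: float = 0.5) -> bool:
    """Return True if the driver is judged ready to take over.

    Illustrative linear combination only; weights are assumptions,
    not values reported in the paper.
    """
    score = (0.5 * f.road_gaze_ratio
             + 0.3 * (1.0 - f.task_engagement)
             + 0.2 * (1.0 - f.traffic_complexity))
    return score >= threshold


# Attentive driver in light traffic -> ready
print(classify_readiness(TakeOverFeatures(0.2, 0.1, 0.9)))  # True
# Driver absorbed in a secondary task in dense traffic -> preemptive warning
print(classify_readiness(TakeOverFeatures(0.9, 0.9, 0.1)))  # False
```

In a preemptive system as described in the abstract, a `False` result would trigger the advance warning before the actual take-over request is issued.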
