A Human-Machine Interface for Cooperative Highly Automated Driving

Cooperative perception of the traffic environment will enable Highly Automated Driving (HAD) functions to provide timelier and more complex Take-Over Requests (TORs) than is possible with vehicle-localized perception alone. Furthermore, cooperative perception will extend automated vehicles' capability to perform tactical and strategic maneuvers without driver intervention (e.g., avoiding obstacles). In this paper, the resulting challenges for the design of the Human-Machine Interface (HMI) are discussed and a prototypical HMI is presented. The prototype is evaluated by experts from the field of cognitive ergonomics in a small-scale simulator study.
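
The following minimal Python sketch illustrates the core argument rather than the paper's actual system: the detection ranges, ego speed, and the 7 s lead-time threshold are illustrative assumptions, chosen only to show why a hazard announced via V2X can be turned into an earlier, less urgent TOR than one first seen by onboard sensors.

from dataclasses import dataclass

# Illustrative values only; not taken from the paper or any standard.
@dataclass
class Hazard:
    distance_m: float   # distance at which the hazard becomes known to the ego vehicle
    source: str         # "onboard" or "v2x"

def tor_lead_time_s(hazard: Hazard, ego_speed_mps: float) -> float:
    """Lead time between the hazard becoming known and reaching it."""
    return hazard.distance_m / ego_speed_mps

def tor_urgency(hazard: Hazard, ego_speed_mps: float,
                comfortable_lead_s: float = 7.0) -> str:
    """With enough lead time the TOR can be staged and advisory;
    otherwise it has to be an urgent, immediate request."""
    lead = tor_lead_time_s(hazard, ego_speed_mps)
    return "advisory" if lead >= comfortable_lead_s else "urgent"

ego_speed = 33.3  # ~120 km/h
onboard = Hazard(distance_m=150.0, source="onboard")  # first detected by own sensors
v2x = Hazard(distance_m=600.0, source="v2x")          # announced by another vehicle

print(tor_urgency(onboard, ego_speed))  # "urgent"   (~4.5 s lead time)
print(tor_urgency(v2x, ego_speed))      # "advisory" (~18 s lead time)

The extra lead time gained from the cooperatively perceived hazard is what allows the HMI to move from abrupt, high-urgency warnings toward graded, advisory take-over requests.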
