Secondary task and situation awareness: a mobile application for semi-autonomous vehicles

Autonomous vehicles are developing rapidly and will lead to a significant change in the driver's role: he or she will move from actor to supervisor. The driver will soon be able to perform a secondary task, but must remain able to take over control in the event of a critical situation that the autonomous system cannot handle. New in-vehicle interfaces and interactions therefore become an important design consideration. This article describes the design of an application that presents the driver with the environment perceived by the vehicle in the form of information modules. The application is displayed in split-screen mode on a tablet, on which the driver can also perform a secondary task. Initial tests in a driving simulator assessed the acceptance of the application and the clarity of the information it conveys. Overall, the results show that participants correctly identified some of the factors limiting the proper functioning of the autonomous pilot while performing a secondary task on the tablet.
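As a rough illustration of how such a module-based display could be modeled, the sketch below shows one possible data structure for the information pushed to the tablet's split-screen panel. This is a minimal sketch under assumptions, not the authors' implementation; all names (SituationModule, PilotStatus, summarize, and the threshold used to derive the overall status) are hypothetical.

```kotlin
// Hypothetical data model for the situation-awareness panel (not the authors' code).

// Overall state of the autonomous pilot as conveyed to the driver.
enum class PilotStatus { NOMINAL, DEGRADED, TAKEOVER_REQUIRED }

// One information module, e.g. lane-marking visibility, sensor range, weather.
data class SituationModule(
    val id: String,                 // stable identifier, e.g. "lane_markings"
    val label: String,              // short label shown in the split-screen panel
    val limitingFactor: Boolean,    // true if this factor currently limits the autopilot
    val detail: String? = null      // optional explanation for the driver
)

// Snapshot of what the vehicle currently perceives, pushed to the tablet UI.
data class SituationSnapshot(
    val status: PilotStatus,
    val modules: List<SituationModule>
)

// Derive an overall pilot status from the individual modules (arbitrary thresholds).
fun summarize(modules: List<SituationModule>): SituationSnapshot {
    val limiting = modules.count { it.limitingFactor }
    val status = when {
        limiting == 0 -> PilotStatus.NOMINAL
        limiting < 3  -> PilotStatus.DEGRADED
        else          -> PilotStatus.TAKEOVER_REQUIRED
    }
    return SituationSnapshot(status, modules)
}

fun main() {
    val snapshot = summarize(
        listOf(
            SituationModule("lane_markings", "Lane markings", limitingFactor = false),
            SituationModule("weather", "Weather", limitingFactor = true,
                            detail = "Heavy rain reduces camera range")
        )
    )
    println(snapshot.status)  // DEGRADED
}
```

In such a design, each module maps to one factor that can limit the autonomous pilot, so the driver can glance at the panel while the secondary task occupies the other half of the screen.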
