Don't you see them?: towards gaze-based interaction adaptation for driver-vehicle cooperation

Highly automated driving is evolving steadily and is gradually entering public roads. Nevertheless, some driving-related tasks can still be handled more efficiently by humans. Cooperation with the human user at a higher abstraction level of the dynamic driving task has been suggested as a way to overcome such operational boundaries. This cooperation includes, for example, deciding whether pedestrians ahead intend to cross the road. We suggest that systems should monitor their users while they make such decisions. Moreover, these systems can adapt the interaction to support their users: in particular, they can match the user's gaze direction against objects in their environmental model, such as vulnerable road users, to guide the user's focus towards overlooked objects. We conducted a pilot study to investigate the need for and feasibility of this concept. Our preliminary analysis showed that some participants overlooked pedestrians who intended to cross the road; systems of the kind we propose could prevent this.
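
The paper does not detail how gaze direction is matched against environment-model objects; one plausible sketch, assuming gaze rays and object positions are available in a common vehicle frame, is to flag objects whose direction never falls within an angular attention cone around any gaze sample. All names, the 10-degree threshold, and the frame convention below are illustrative assumptions, not the authors' method:

    import numpy as np

    ATTENTION_CONE_DEG = 10.0  # hypothetical angular threshold for "looked at"

    def angle_between_deg(gaze_dir, obj_dir):
        # Angle (degrees) between a gaze ray and the direction towards an object.
        g = gaze_dir / np.linalg.norm(gaze_dir)
        o = obj_dir / np.linalg.norm(obj_dir)
        return np.degrees(np.arccos(np.clip(np.dot(g, o), -1.0, 1.0)))

    def overlooked_objects(gaze_samples, objects, max_angle=ATTENTION_CONE_DEG):
        # Flag objects whose direction never fell inside the gaze cone
        # across all recorded gaze samples.
        # gaze_samples: iterable of 3D gaze direction vectors (vehicle frame)
        # objects: dict of object id -> 3D position in the same frame
        return [obj_id for obj_id, pos in objects.items()
                if not any(angle_between_deg(g, pos) <= max_angle
                           for g in gaze_samples)]

    # Illustrative usage: gaze stays roughly straight ahead while a pedestrian
    # (from the vehicle's environment model) waits ahead-left at the kerb.
    gaze = [np.array([1.0, 0.02, 0.0]), np.array([1.0, -0.05, 0.0])]
    scene = {"pedestrian_42": np.array([12.0, 6.0, 0.0])}
    print(overlooked_objects(gaze, scene))  # -> ['pedestrian_42']

A deployed system would likely also need temporal smoothing and a dwell-time criterion before alerting the driver, since a single gaze sample near an object does not guarantee it was actually perceived.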
