Shared Gaze While Driving: How Drivers Can Be Supported by an LED-Visualization of the Front-Seat Passenger's Gaze

The front-seat passenger in a vehicle can assist the driver by providing hints about points of interest in the driving situation. To communicate such spatial information efficiently, previous research has introduced the so-called shared gaze approach, in which the gaze of the front-seat passenger is visualized for the driver. So far, this approach has only been investigated in driving simulator environments. In this paper, we present a study on how well shared gaze works in a real driving situation (n = 8). We examine identification rates for different object types in the driving environment based on the visualization of the front-seat passenger’s gaze via glowing LEDs on an LED strip. Our results show that this rate depends on the object’s relevance for the driving task and on whether the object is moving. We found that perceived visual distraction was low and that the usefulness of shared gaze for navigational tasks was rated high.
