Modeling visual sampling on in-car displays: The challenge of predicting safety-critical lapses of control

In this article, we study how drivers interact with in-car interfaces, focusing in particular on driver in-car glance behavior when multitasking while driving. The task studied requires using an in-car touch screen to find a target item among a large number of unordered visual items spread across multiple screens. We first describe a cognitive model that aims to represent a driver's visual sampling strategy when interacting with an in-car display. The proposed strategy assumes that drivers are aware of the passage of time during the search task, that they try to limit their glances at the display to a time limit after which they switch back to the driving task, and that they adjust this time limit based on their performance in the current driving environment. For visual search, the model assumes a random starting point, inhibition of return, and a search strategy that always seeks the nearest uninspected item. We validate the model's predictions against empirical data collected in two driving simulator studies with eye tracking. The results suggest that the visual design of in-car displays can have a significant impact on the probability of distraction. In particular, they suggest that designers should try to minimize total task durations and the durations of all visual encoding steps required for an in-car task, as well as the distance between visual display elements that are encoded one after the other. The cognitive model helps to explain gaze allocation strategies for performing in-car tasks while driving, and thus helps to quantify the effects of task duration and visual item spacing on safety-critical in-car glance durations.

Highlights: We study drivers' in-car glance behavior when multitasking while driving. We describe a cognitive model that represents the driver's visual sampling strategy. The model's predictions were studied against data from two driving simulator studies. Task length and visual item spacing are critical factors for in-car glance lengths. The model helps explain drivers' gaze allocation strategies while multitasking.
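
To make the visual sampling strategy concrete, the following is a minimal Python sketch of a single in-car glance under the assumptions stated above: a random starting point, inhibition of return, nearest-uninspected-item search, and a self-imposed glance time limit. The item layout, the encoding-time and saccade-time parameters, and the fixed time limit are illustrative placeholders, not the model's fitted values.

```python
import math
import random

def nearest_uninspected(current, items, inspected):
    """Return the index of the closest item not yet inspected (inhibition of return)."""
    candidates = [i for i in range(len(items)) if i not in inspected]
    if not candidates:
        return None
    return min(candidates, key=lambda i: math.dist(items[i], items[current]))

def simulate_glance(items, target, time_limit, encode_time=0.3, saccade_time_per_px=0.001):
    """Simulate one glance at the display: search item by item until the target
    is found or the driver's self-imposed time limit for looking away expires.

    items  -- list of (x, y) positions of the visual items on the display
    target -- index of the sought item
    """
    inspected = set()
    current = random.randrange(len(items))   # random starting point
    elapsed = 0.0
    while elapsed < time_limit:
        inspected.add(current)
        elapsed += encode_time                # time to visually encode the fixated item
        if current == target:
            return elapsed, True              # target found, in-car task ends
        nxt = nearest_uninspected(current, items, inspected)
        if nxt is None:
            break                             # every item inspected, nothing left to search
        # saccade cost grows with the distance between successively encoded items
        elapsed += saccade_time_per_px * math.dist(items[current], items[nxt])
        current = nxt
    return elapsed, False                     # time limit reached: eyes return to the road
```

In the full strategy described above, the time limit would not be fixed: it would be adjusted between glances based on driving performance (for example, shortened after poor lane keeping), and the search would resume on the next glance until the target is found. The function and parameter names here are assumptions made for illustration only.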
