Supporting Attention Allocation in Multitask Environments

Objective: The aim of the current study was to investigate potential benefits of likelihood alarm systems (LASs) over binary alarm systems (BASs) in a multitask environment. Background: Several problems are associated with the use of BASs, because most of them generate high numbers of false alarms. Operators lose trust in the systems and either ignore alarms or cross-check all of them when other information is available. The first behavior harms safety, whereas the latter reduces productivity. LASs represent an alternative that is intended to improve operators’ attention allocation. Method: We investigated LASs and BASs in a dual-task paradigm with and without the option to cross-check alerts against raw data. Participants’ trust in the system, their behavior, and their performance in the alert task and the concurrent task were assessed. Results: Reported trust, compliance with alarms, and performance in both the alert task and the concurrent task were higher for the LAS than for the BAS. The cross-check option increased alert task performance for both systems but decreased concurrent task performance for the BAS only; no such decrease occurred in the LAS condition. Conclusion: LASs improve participants’ attention allocation between two different tasks and therefore increase both alert task and concurrent task performance. Performance was maximal when the LAS was combined with a cross-check option for validating alerts with additional information. Application: Using LASs instead of BASs in safety-related multitask environments has the potential to increase safety and productivity alike.
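The distinction between the two alarm designs can be made concrete with a minimal sketch. A BAS maps the evidence for a hazard to a single alarm/no-alarm decision, whereas an LAS grades the same evidence into several urgency levels, cueing operators when a cross-check is worthwhile. The thresholds and labels below are illustrative assumptions, not values from the study:

```python
def bas_alert(evidence: float, threshold: float = 0.5) -> str:
    """Binary alarm system: one cutoff, one undifferentiated alarm.
    `evidence` is assumed to be a hazard likelihood in [0, 1]."""
    return "ALARM" if evidence >= threshold else "no alarm"


def las_alert(evidence: float, low: float = 0.3, high: float = 0.7) -> str:
    """Likelihood alarm system: two cutoffs yield graded urgency levels,
    so ambiguous evidence produces a warning rather than a full alarm."""
    if evidence >= high:
        return "ALARM (high likelihood)"
    if evidence >= low:
        return "WARNING (possible hazard)"
    return "no alarm"


# An ambiguous reading triggers a full alarm under the BAS but only a
# warning under the LAS, inviting a cross-check instead of a reflex response.
print(bas_alert(0.55))  # ALARM
print(las_alert(0.55))  # WARNING (possible hazard)
```

Under this framing, the graded output is what supports attention allocation: low-likelihood alerts can be cross-checked or deferred while high-likelihood alerts demand an immediate response, which is consistent with the performance pattern the study reports.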
