Dissonance between multiple alerting systems. Part I: Modeling and analysis

As automated alerting systems become more pervasive in process operations, the potential for different systems to transmit conflicting information grows. Newly introduced alerting systems must be carefully designed to minimize the potential for, and the impact of, alerting conflicts, but little guidance is currently available to aid this process. A model of alert dissonance is developed that provides a theoretical foundation for understanding conflicts and a practical basis from which specific problems can be addressed. Part I establishes a generalized methodology for analyzing dissonance between alerting systems; Part II applies these principles and presents methodologies to avoid and mitigate dissonance. In Part I, we develop a generalized state-space representation of alerting system operation that can be tailored to a variety of applications. From this representation, two major causes of dissonance are identified, logic differences and sensor error, along with several possible types of dissonance. A mathematical analysis method is developed to identify the conditions under which logic differences cause dissonance, and a probabilistic analysis methodology is developed to estimate the probability of dissonance arising from sensor error.
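The two dissonance mechanisms named in the abstract can be made concrete with a small sketch. The following is a minimal illustration under assumed details, not the paper's actual formulation: the logics `alert_a` and `alert_b`, their thresholds, the 2-D state, and the Gaussian noise model are all hypothetical stand-ins. It shows dissonance from logic differences as the region of a shared state space where two alert logics disagree on the same true state, and a Monte Carlo estimate of the probability of dissonance when each system observes that state through independent sensor error.

```python
import numpy as np

# Hypothetical shared state: x = (range, closure_rate), arbitrary units.

def alert_a(x):
    """Hypothetical logic A: alert when range falls below a fixed threshold."""
    rng, rate = x
    return rng < 1.0

def alert_b(x):
    """Hypothetical logic B: alert when predicted time-to-contact is short."""
    rng, rate = x
    return rate > 0 and rng / rate < 2.0

def dissonant(x):
    """Dissonance from logic differences: the logics disagree on the same true state."""
    return alert_a(x) != alert_b(x)

def p_dissonance(x_true, sigma=0.1, n=100_000, rng=np.random.default_rng(0)):
    """Monte Carlo estimate of dissonance probability from sensor error.

    Each system sees the true state through its own independent Gaussian
    sensor noise, so the systems can disagree even where their noise-free
    decisions would coincide.
    """
    xa = x_true + rng.normal(0.0, sigma, size=(n, 2))  # samples seen by logic A
    xb = x_true + rng.normal(0.0, sigma, size=(n, 2))  # samples seen by logic B
    da = xa[:, 0] < 1.0
    db = (xb[:, 1] > 0) & (xb[:, 0] / np.maximum(xb[:, 1], 1e-9) < 2.0)
    return np.mean(da != db)  # fraction of trials in which the alerts disagree

if __name__ == "__main__":
    x = np.array([1.5, 1.0])  # true state where the two logics happen to disagree
    print("logic dissonance:", dissonant(x))
    print("P(dissonance | sensor noise):", p_dissonance(x))
```

In the paper's framework the state space, alert stages, and sensor models are generalized rather than binary and two-dimensional; the sketch only mirrors the simplest special case to show how the two causes of dissonance are distinguished.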
