Automated Control Systems: Do They Reduce Human Error and Incidents?

WITH EACH PASSING DAY, the world becomes more technologically advanced. As automation, artificial intelligence and robotics improve, it becomes increasingly tempting to employ automated means to accomplish production goals. Just think of it: If no humans were on the production floor, no human errors would occur, no one would get injured, and companies could produce higher-quality product and move it to market faster. Supervisors would not have to deal with employees who arrive late, get tired, want time off, complain about conditions, get hurt, spoil batches and generally disrupt operations. The ability to move higher-quality product out the door faster would seem reason enough for any company to want to automate every process. In fact, trends suggest this may be happening. Management consultant Warren Bennis quips that the factory of the future will have only two employees, a human and a dog: the human there only to feed the dog, and the dog there to bite the human if s/he touches anything (Paradies & Unger, 2000).

Few would argue that automated control systems are unnecessary in today’s complex industrial processes. However, is complete automation the best or most appropriate approach? While control system automation provides predictable, consistent performance, it lacks human judgment, adaptability and logic. Conversely, humans provide judgment, adaptability, experience and sound logic, yet are unpredictable, unreliable, inconsistent, subject to emotions and competing motivations, and not biomechanically efficient. This raises two important questions: 1) To maximize system performance, should we automate humans out of the system? Or 2) should we maximize human input and forgo efficient, consistent, error-free system performance? The answer is that the proper level of automation likely lies somewhere between these two extremes, and it is likely different for each system and situation (Haight & Kecojevic, 2005).
This article reviews the existing literature on automated control systems and the human interface, and attempts to extend the work of Haight and Kecojevic (2005). The goal is to develop a method that helps design engineers minimize human error while maximizing system performance, and better understand the right human/machine mix.

[1] Raja Parasuraman, et al. Humans and Automation: Use, Misuse, Disuse, Abuse, 1997, Hum. Factors.

[2] Nadine B. Sarter, et al. Good Vibrations: Tactile Feedback in Support of Attention Allocation and Human-Automation Coordination in Event-Driven Domains, 1999, Hum. Factors.

[3] David B. Kaber, et al. Adaptive Automation of Human-Machine System Information-Processing Functions, 2005, Hum. Factors.

[4] Catherine M. Burns, et al. There Is More to Monitoring a Nuclear Power Plant than Meets the Eye, 2000, Hum. Factors.

[5] Asilian Hasan, et al. Human Error Analysis Among Petrochemical Plant Control Room Operators With Human Error Assessment and Reduction Technique, 2009.

[6] T. Salthouse. A Theory of Cognitive Aging, 1985.

[7] Asaf Degani, et al. Formal Verification of Human-Automation Interaction, 2002, Hum. Factors.

[8] Dan Petersen. Human-Error Reduction and Safety Management, 1996.

[9] S. S. Stevens, et al. Human Engineering for an Effective Air-Navigation and Traffic-Control System, and Appendixes 1 thru 3, 1951.

[10] Mark W. Scerbo, et al. Effects of a Psychophysiological System for Adaptive Automation on Performance, Workload, and the Event-Related Potential P300 Component, 2003, Hum. Factors.

[11] Penelope M. Sanderson, et al. Minimal Instrumentation May Compromise Failure Diagnosis With an Ecological Interface, 2004, Hum. Factors.

[12] A. Kirlik. Modeling Strategic Behavior in Human-Automation Interaction: Why an "Aid" Can (and Should) Go Unused, 1993, Hum. Factors.

[13] John D. Lee, et al. Trust in Automation: Designing for Appropriate Reliance, 2004.

[14] D. Oxley. Design Paradigms: Case Histories of Error and Judgment in Engineering, 1997.

[15] R. I. Sutton, et al. Switching Cognitive Gears: From Habits of Mind to Active Thinking, 1991.

[16] Chris W. Clegg, et al. A Sociotechnical Method for Designing Work Systems, 2002, Hum. Factors.

[17] Kim J. Vicente, et al. Designing Effective Human-Automation-Plant Interfaces: A Control-Theoretic Perspective, 2005, Hum. Factors.

[18] Raja Parasuraman, et al. Monitoring an Automated System for a Single Failure: Vigilance and Task Complexity Effects, 1996.

[19] Thomas B. Sheridan, et al. Man-Machine Systems: Information, Control, and Decision Models of Human Performance, 1974.

[20] Douglas A. Wiegmann, et al. Automation Failures on Tasks Easily Performed by Operators Undermines Trust in Automated Aids, 2003.

[21] Vladislav Kecojevic, et al. Automation vs. Human Intervention: What Is the Best Fit for the Best Performance?, 2005.

[22] John W. Senders, et al. Human Error: Cause, Prediction, and Reduction, 1991.

[23] Mustapha Mouloua, et al. Effects of Adaptive Task Allocation on Monitoring of Automated Systems, 1996, Hum. Factors.