Human-Automation Interaction

Automation does not mean humans are replaced; quite the opposite. Increasingly, humans are asked to interact with automation in complex and typically large-scale systems, including aircraft and air traffic control, nuclear power, manufacturing plants, military systems, homes, and hospitals. This is not an easy or error-free task for either the system designer or the human operator/automation supervisor, especially as computer technology becomes ever more sophisticated. This review outlines recent research and challenges in the area, including taxonomies and qualitative models of human-automation interaction; descriptions of automation-related accidents and studies of adaptive automation; and social, political, and ethical issues.