Human Interaction with Levels of Automation and Decision-Aid Fidelity in the Supervisory Control of Multiple Simulated Unmanned Air Vehicles

Remotely operated vehicles (ROVs) are vehicular robotic systems that are teleoperated by a geographically separated user. Advances in computing technology have enabled ROV operators to manage multiple ROVs by means of supervisory control techniques. The challenge of incorporating telepresence in any one vehicle is replaced by the need to keep the human in the loop of the activities of all vehicles. An evaluation was conducted to compare the effects of automation level and decision-aid fidelity on the number of simulated remotely operated vehicles that could be successfully controlled by a single operator during a target acquisition task. The specific ROVs instantiated for the study were unmanned air vehicles (UAVs). Levels of automation (LOAs) included manual control, management-by-consent, and management-by-exception. Two levels of decision-aid fidelity (100% correct and 95% correct) were achieved by intentionally injecting error into the decision-aiding capabilities of the simulation. Additionally, the number of UAVs to be controlled varied (one, two, and four vehicles). Twelve participants acted as UAV operators. A mixed-subjects design was used, with decision-aid fidelity as the between-subjects factor, and participants were not informed of decision-aid fidelity prior to data collection. Dependent variables included mission efficiency, percentage of incorrect decision aids correctly detected, workload and situation awareness ratings, and trust-in-automation ratings. Results indicate that the management-by-consent level of automation had some clear performance advantages over both the more autonomous (management-by-exception) and less autonomous (manual control) levels. However, automation level interacted with the other factors for subjective measures of workload, situation awareness, and trust.
Additionally, although a 3D perspective view of the mission scene was always available, it was used only during low-workload periods and did not appear to improve the operators' sense of presence. The implications for ROV interface design are discussed, and future research directions are proposed.