Synthetic vision system for improving unmanned aerial vehicle operator situation awareness

The Air Force Research Laboratory's Human Effectiveness Directorate (AFRL/HE) supports research addressing human factors associated with Unmanned Aerial Vehicle (UAV) operator control stations. Recent research, in collaboration with Rapid Imaging Software, Inc., has focused on determining the value of combining synthetic vision data with live camera video presented on a UAV control station display. The synthetic information is constructed from databases (e.g., terrain, cultural features, and pre-mission plans) and from continual updates received via networked communication with other sources (e.g., weather, intelligence). This information is overlaid conformally, in real time, onto the dynamic camera video display presented to operators. Synthetic vision overlay technology is expected to improve operator situation awareness by highlighting key spatial elements of interest, such as threat locations, expected target locations, landmarks, and emergency airfields, directly on the video image. It may also help maintain an operator's situation awareness during periods of video datalink degradation or dropout and when operating in conditions of poor visibility. Additionally, this technology may serve as an intuitive means of distributed communication among geographically separated users. This paper discusses the tailoring of synthetic overlay technology for several UAV applications. Pertinent human factors issues are detailed, as well as the usability, simulation, and flight test evaluations required to determine how best to combine synthetic vision data with live camera video on a ground control station display and to validate that a synthetic vision system is beneficial for UAV applications.
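The geometric core of a conformal overlay is georegistration: each world-referenced symbol (a threat location, a waypoint, an airfield) is projected through the current camera pose into pixel coordinates every frame, so the symbol stays locked to the scene as the UAV and its sensor move. The sketch below illustrates that one step under simplifying assumptions: an ideal pinhole camera, a local level Cartesian frame in metres, and a look-at camera orientation. All function names, coordinates, and camera parameters are illustrative, not taken from the paper; a fielded system would also handle lens distortion, terrain elevation lookup, and telemetry latency and alignment error.

```python
import numpy as np

def look_at_rotation(cam_pos, target, world_up=(0.0, 0.0, 1.0)):
    """World-to-camera rotation for a camera at cam_pos looking at target.
    Camera axes: +z boresight, +x right, +y down (image convention)."""
    z = np.asarray(target, float) - np.asarray(cam_pos, float)
    z /= np.linalg.norm(z)
    x = np.cross(z, world_up)           # assumes boresight is not vertical
    x /= np.linalg.norm(x)
    y = np.cross(z, x)
    return np.stack([x, y, z])          # rows are the camera axes in world frame

def project(world_pt, cam_pos, R, f_px, cx, cy):
    """Pinhole projection of a world point to pixel (u, v); None if behind camera."""
    p = R @ (np.asarray(world_pt, float) - np.asarray(cam_pos, float))
    if p[2] <= 0.0:                     # point is behind the image plane
        return None
    return f_px * p[0] / p[2] + cx, f_px * p[1] / p[2] + cy

# Hypothetical example: UAV camera 300 m up, aimed ahead of a surveyed threat.
cam_pos = (0.0, 0.0, 300.0)             # local level frame, metres
threat  = (500.0, 100.0, 0.0)           # threat location from the mission database
aim     = (600.0, 0.0, 0.0)             # current sensor aim point on the ground
R = look_at_rotation(cam_pos, aim)
uv = project(threat, cam_pos, R, f_px=1200.0, cx=640.0, cy=360.0)
# Draw the threat symbol at uv, recomputing every frame as telemetry updates,
# so the symbology remains registered (conformal) to the live video.
```

Because the projection depends only on telemetry and the database, this loop can keep rendering symbology from the last known pose during video dropout, which is one way such a system could support situation awareness when the live image degrades.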
