Data Fusion for the Apache Longbow: Implementation and Experiences

Maintaining Situational Awareness and Tactical Decision Making are workload-intensive, time-critical challenges for the crew of the Army’s AH-64D Apache Longbow. The Apache crew faces these challenges under extreme mission demands coupled with the stress of high-speed, low-level flight. Two technology trends show great promise in addressing these problems: Decision Aiding and Manned/Unmanned teaming with Unmanned Aerial Vehicles (UAVs). The US Army Aviation Applied Technology Directorate (AATD) is leading the Airborne Manned/Unmanned System Technology Demonstration (AMUST-D) and Hunter Standoff Killer Team (HSKT) programs to develop, deploy, and demonstrate these technologies in operational evaluations. Data Fusion, the capability to integrate information from multiple sensors and other sources into a consistent Common Relevant Operational Picture (CROP), lies at the heart of both technologies. A reliable CROP is required to support the automated reasoning processes of Decision Aiding and to automatically combine sensor data from teamed UAVs, reducing the significant human workload that would otherwise be needed to manually combine UAV data with other fused data representations. For the past 11 years, Lockheed Martin Advanced Technology Laboratories (LM ATL) has been developing a Data Fusion capability specifically designed to support Army Aviation decision aiding systems. In this paper, we describe LM ATL’s effort to design, implement, and test a Data Fusion system for the Apache Longbow under the AMUST-D and HSKT programs.