Data Fusion Processing for the Multi-Spectral Sensor Surveillance System (M4S)

Abstract: The Multi-Spectral Sensor Surveillance System (M4S) is a multi-year ONR-sponsored program to transition mature sensor and data fusion technology into existing and near-future airborne surveillance platforms. A study phase and an on-board sensor data fusion proof-of-concept demonstration were completed in 1997. This paper describes the data fusion concepts, architecture, and algorithms designed and demonstrated in these efforts. The data fusion architecture selected for M4S is a distributed design in which each on-board sensor subsystem is equipped with a single-sensor tracking unit that satisfies all sensor-specific tracking needs in addition to the required sensor data processing capability. Scan-to-scan (or frame-to-frame) correlation is thus resolved on a single-sensor basis, and the outputs of each sensor subsystem are typically single-sensor tracks, or tracklets, i.e., stochastically independent fractions of tracks. These outputs are then fed into a centralized multi-sensor data fusion process that performs track-to-track association and fuses the appropriate single-sensor tracks into multiple-sensor tracks. In this way, the sensor subsystems provide target information that is both complementary and mutually reinforcing, in terms of target identification as well as target localization, so that the central data fusion process can produce the best possible picture of each target of interest. This architecture also allows a sensor-specific tracker to temporarily lose hold of some targets and re-acquire them later, while the system maintains continuous target recognition. The data fusion process is also connected, through an external communication network, to off-board intelligence and surveillance sources such as Rivet Joint, AWACS, U-2, and JSTARS, to provide the system with a complete tactical picture.
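
The centralized track-to-track association and fusion step described above can be illustrated with a minimal sketch, written here in Python with NumPy/SciPy and not drawn from the M4S implementation itself: a chi-square gate decides whether two single-sensor tracks plausibly refer to the same target, and, because the tracklets are stated to be stochastically independent, their estimates can be combined in information-filter form. All function names, state dimensions, and numerical values below are hypothetical and chosen only for illustration.

    import numpy as np
    from scipy.stats import chi2

    def gate(x1, P1, x2, P2, prob=0.99):
        """Chi-square gating test for track-to-track association.
        Assumes the two track estimates are stochastically independent,
        as the tracklet outputs are described to be."""
        d = x1 - x2
        S = P1 + P2                      # covariance of the track difference
        m2 = d @ np.linalg.solve(S, d)   # squared Mahalanobis distance
        return m2 <= chi2.ppf(prob, df=len(d))

    def fuse(x1, P1, x2, P2):
        """Fuse two independent track estimates (information-filter form)."""
        I1, I2 = np.linalg.inv(P1), np.linalg.inv(P2)
        P = np.linalg.inv(I1 + I2)
        x = P @ (I1 @ x1 + I2 @ x2)
        return x, P

    # Hypothetical 2-D position tracks from two sensor subsystems.
    x_radar, P_radar = np.array([10.0, 5.0]), np.diag([4.0, 4.0])
    x_eo,    P_eo    = np.array([10.5, 4.8]), np.diag([1.0, 1.0])

    if gate(x_radar, P_radar, x_eo, P_eo):
        x_fused, P_fused = fuse(x_radar, P_radar, x_eo, P_eo)
        print("fused state:", x_fused)
        print("fused covariance:\n", P_fused)

In this sketch the fused covariance is smaller than either input covariance, reflecting the reinforcing effect of combining independent single-sensor tracks; complementary information (e.g., identification attributes from one sensor and localization from another) would be merged in the same association step but is omitted here for brevity.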