Object Classification Using Multispectral Sensor Data Fusion

In this paper, the potential benefits of applying sensor fusion to object classification are discussed. A specific example is presented that involves the fusion of multiband IR and visible-light data collected from co-located sensors. Pattern vectors describing the objects were based on features extracted from the simulated target signatures observed within each sensor waveband individually and also by 'fusing' the multispectral data. The pattern vectors were then subjected to feature analysis using a variety of statistical pattern recognition techniques to determine the relative contribution of each feature to classification performance. The features selected through this process were then used in subsequent classification algorithms that established class boundaries, classified the objects, determined confidence levels, and calculated error probabilities. A neural network paradigm, in particular a competitive learning algorithm, was also applied to the same data set to determine the relative merit of the features and to classify the objects. Analysis methods and performance comparisons are presented.
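The processing chain summarized above (feature-level fusion of single-band signatures into pattern vectors, statistical feature selection, then classification with confidence levels and error estimates) can be illustrated with a minimal sketch. The band count, feature definitions, synthetic data, and the choice of an F-test ranking with a linear discriminant classifier are illustrative assumptions, not the specific features or algorithms used in the paper.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical stand-in for the simulated target signatures: a few scalar
# features per waveband (e.g., mean intensity, contrast) for two IR bands
# and one visible band, with class labels for three object types.
n_objects, n_bands, n_feat, n_classes = 300, 3, 4, 3
labels = rng.integers(0, n_classes, size=n_objects)
band_class_means = rng.normal(size=(n_bands, n_classes, n_feat))
per_band = [band_class_means[b, labels] + rng.normal(scale=0.7, size=(n_objects, n_feat))
            for b in range(n_bands)]

# Feature-level fusion: concatenate the single-band features into one
# multispectral pattern vector per object.
fused = np.hstack(per_band)

X_tr, X_te, y_tr, y_te = train_test_split(fused, labels, random_state=0)

# Feature analysis: rank each fused feature's contribution with an F-test
# and keep the top k (a stand-in for the statistical feature selection step).
selector = SelectKBest(f_classif, k=6).fit(X_tr, y_tr)
X_tr_sel, X_te_sel = selector.transform(X_tr), selector.transform(X_te)

# A classifier that establishes linear class boundaries and yields posterior
# probabilities usable as per-object confidence levels.
clf = LinearDiscriminantAnalysis().fit(X_tr_sel, y_tr)
confidence = clf.predict_proba(X_te_sel)       # class posteriors per test object
error_rate = 1.0 - clf.score(X_te_sel, y_te)   # empirical error probability
```

Classifying with the single-band feature sets alone and then with the fused vector gives the kind of individual-band versus fused comparison the abstract refers to.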
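The competitive learning paradigm mentioned above can likewise be sketched as a plain winner-take-all network: each training pattern moves only its nearest prototype unit, and units are afterwards labeled by the classes they most often win. The unit count, learning rate, and unit-labeling scheme below are assumptions for illustration, not the paper's network.

```python
import numpy as np

def competitive_learning(X, n_units=6, lr=0.1, epochs=30, seed=0):
    """Winner-take-all competitive learning: each presentation moves only the
    closest (winning) unit's weight vector toward the input pattern."""
    rng = np.random.default_rng(seed)
    W = X[rng.choice(len(X), size=n_units, replace=False)].copy()  # init on samples
    for _ in range(epochs):
        for x in X[rng.permutation(len(X))]:
            winner = np.argmin(np.linalg.norm(W - x, axis=1))
            W[winner] += lr * (x - W[winner])   # move the winner toward the input
    return W

def label_units(W, X, y):
    """Assign each unit the majority class among the training patterns it wins
    (units that win nothing are labeled -1)."""
    winners = np.argmin(np.linalg.norm(X[:, None, :] - W[None], axis=2), axis=1)
    return np.array([np.bincount(y[winners == u], minlength=y.max() + 1).argmax()
                     if np.any(winners == u) else -1
                     for u in range(len(W))])

def classify(W, unit_labels, X):
    """Classify patterns by the label of their winning unit."""
    winners = np.argmin(np.linalg.norm(X[:, None, :] - W[None], axis=2), axis=1)
    return unit_labels[winners]
```

Applied to the fused pattern vectors from the previous sketch (e.g., `W = competitive_learning(X_tr_sel)`), the resulting unit labels give a classification whose error rate can be compared directly with that of the statistical classifier.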