A Pixel-Level Method for Multiple Imaging Sensor Data Fusion through Artificial Neural Networks

Multiple-image-sensor data fusion combines two or more images from different imaging sensors to achieve better performance than any individual sensor. This paper presents a new pixel-level method of fusing data from multiple image sensors for non-destructive inspection. In this method, the images from the different sensors are processed and classified using artificial neural networks, and the classified images are then fused to produce a resultant image that is classified more accurately than any of the individually classified images. The method was applied to identifying corrosion spots on aircraft panel specimens: ultrasonic and eddy current image data were run through artificial neural network classifiers to identify corroded spots on the same aircraft panel specimen, with an X-ray image of the specimen serving as the benchmark. The results indicate that image data fusion consistently improved artificial neural network corrosion detection over the eddy current and ultrasonic image data used individually, both overall and for low-corrosion pixels, which constitute 90 percent of all corrosion pixels. On average, fusion improved the artificial neural network classification rates of the eddy current image by 12.6% for low-corrosion and 12.21% for overall corrosion classification, and those of the ultrasonic image by 28.88% for low-corrosion and 32.18% for overall corrosion classification. This pixel-level method for multiple imaging sensor data fusion is expected to solve non-destructive inspection problems in a variety of areas.

Key words: Multisensor Data Fusion; Imaging Sensor; Pixel Level; Artificial Neural Networks; Non-Destructive Inspection
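The pipeline described above can be illustrated with a minimal sketch: each co-registered sensor image is classified per pixel by its own small neural network, and the resulting probability maps are fused into a single corrosion map. The network sizes, the untrained random weights, and the averaging fusion rule are all placeholder assumptions for illustration; the paper's actual networks are trained against the X-ray benchmark, and its fusion rule is not specified in the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins for two co-registered sensor images (H x W),
# e.g. an eddy current map and an ultrasonic amplitude map.
H, W = 8, 8
eddy = rng.random((H, W))
ultra = rng.random((H, W))

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def pixel_classifier(img, w1, b1, w2, b2):
    """Tiny per-pixel MLP: 1 input -> 4 hidden units -> 1 output probability."""
    x = img.reshape(-1, 1)       # one intensity feature per pixel
    h = np.tanh(x @ w1 + b1)     # hidden layer
    p = sigmoid(h @ w2 + b2)     # per-pixel corrosion probability
    return p.reshape(img.shape)

# Hypothetical untrained weights, one classifier per sensor.
w1e, b1e = rng.normal(size=(1, 4)), np.zeros(4)
w2e, b2e = rng.normal(size=(4, 1)), np.zeros(1)
w1u, b1u = rng.normal(size=(1, 4)), np.zeros(4)
w2u, b2u = rng.normal(size=(4, 1)), np.zeros(1)

# Step 1: classify each sensor image independently.
p_eddy = pixel_classifier(eddy, w1e, b1e, w2e, b2e)
p_ultra = pixel_classifier(ultra, w1u, b1u, w2u, b2u)

# Step 2: pixel-level fusion of the classified images; a simple
# average of per-pixel probabilities is used here as a placeholder.
p_fused = 0.5 * (p_eddy + p_ultra)
corrosion_mask = p_fused > 0.5   # final fused corrosion map
```

Because fusion operates on the classified probability maps rather than the raw intensities, each sensor's classifier can be trained and tuned independently before the maps are combined.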
