Mutual Information Metric Evaluation for PET/MRI Image Fusion

Developments in image fusion have paved the way for new approaches such as image overlay, image sharpening, and image cueing through pixel-, feature-, or region/shape-level combinations. The applicability of these techniques depends on the image content, the contextual information, and generalized metrics of image fusion gain. Fusion gain can be assessed in terms of information gain or entropy reduction. In this paper, we explore performance metric evaluation for fused images. The fused image is evaluated by studying the mutual information content of the images of interest; registered MR/PET images are used for demonstration. Mutual information is proposed as an information measure for evaluating image fusion performance: the measure quantifies how much information the fused image retains from its source images and can therefore be used to compare different image fusion algorithms. The results show that the measure is meaningful and explicit.
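As a concrete illustration of the evaluation approach described above, the following sketch estimates mutual information from joint histograms and scores a fused image F against its sources A and B as I(F;A) + I(F;B). This is a minimal sketch of the general technique, not the paper's implementation: the function names, bin count, and the averaging-based fusion in the usage example are illustrative assumptions.

```python
import numpy as np

def mutual_information(a, b, bins=64):
    """Mutual information I(A;B) in bits, estimated from a joint histogram."""
    joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    pxy = joint / joint.sum()               # joint probability p(a, b)
    px = pxy.sum(axis=1, keepdims=True)     # marginal p(a)
    py = pxy.sum(axis=0, keepdims=True)     # marginal p(b)
    nz = pxy > 0                            # skip empty cells to avoid log(0)
    return float(np.sum(pxy[nz] * np.log2(pxy[nz] / (px * py)[nz])))

def fusion_mi(fused, src_a, src_b, bins=64):
    """Fusion performance score: how much information the fused image
    carries about each source image, I(F;A) + I(F;B)."""
    return (mutual_information(fused, src_a, bins)
            + mutual_information(fused, src_b, bins))
```

For example, with registered PET and MR slices `pet` and `mr` as arrays, a simple average fusion `fused = 0.5 * (pet + mr)` can be scored with `fusion_mi(fused, pet, mr)`; under this metric, a higher score means the fused image preserves more information from both modalities.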
