Assessing Multispectral Image Fusion with Systems Factorial Technology

Despite the rich literature on techniques for creating a single image from multispectral sensors, there is relatively little research on methods for assessing these techniques based on human performance. We propose the use of Systems Factorial Technology (SFT), a nonparametric, mathematical modeling framework for analyzing human cognition. Previous work has demonstrated the use of SFT for evaluating human perception of multispectral imagery, although with relatively contrived tasks. In this work, we extend the approach to a task in which observers must determine whether a person in the image is holding a gun or a tool. We found that all observers processed the information from each spectrum less efficiently when images based on two different spectra were presented together, regardless of whether the information was fused into a single image or kept separate. Furthermore, when images from the two spectra were presented side-by-side, some observers were able to use both sources in parallel.
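
As a brief sketch of how SFT typically quantifies these claims (the abstract does not spell out the estimators, so the following are the standard measures rather than necessarily the exact ones used in the study): processing efficiency is usually assessed with the workload capacity coefficient. For a first-terminating ("OR") task, writing $S_X(t)$ for the survivor function of response times in condition $X$ and $H_X(t) = -\log S_X(t)$ for the corresponding cumulative hazard,

$$C_{\mathrm{OR}}(t) = \frac{H_{AB}(t)}{H_A(t) + H_B(t)},$$

where $A$ and $B$ denote the single-spectrum conditions and $AB$ the condition in which both spectra are available (fused or side-by-side). $C_{\mathrm{OR}}(t) = 1$ corresponds to the unlimited-capacity, independent, parallel baseline; values below 1 indicate limited capacity, the kind of efficiency cost reported above; values above 1 indicate super capacity. Architecture and stopping rule (e.g., parallel versus serial use of the two sources) are typically diagnosed with the survivor interaction contrast,

$$\mathrm{SIC}(t) = \left[S_{LL}(t) - S_{LH}(t)\right] - \left[S_{HL}(t) - S_{HH}(t)\right],$$

where $H$ and $L$ index high and low salience (or discriminability) of each source.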
