This paper presents an algorithm for accurately aligning two images of the same scene captured simultaneously by sensors operating in different wavebands (e.g. TV and IR). Such a setup is common in image fusion systems, where the sensors are physically aligned as closely as possible and yet significant image misalignment remains due to differences in field of view, lens distortion and other camera characteristics. Our proposed registration method numerically minimises a global objective function defined in terms of local normalized correlation measures. The algorithm is demonstrated on real multimodal imagery, and applications to image fusion are considered. In particular, we show that fused image quality is closely related to the degree of registration accuracy achieved. To maintain this accuracy in real systems it is often necessary to continuously update the transform over time; we therefore extend our registration approach to execute in real time on live imagery, providing optimal fused imagery in the presence of relative sensor motion and parallax effects.
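The core of the method is the numerical minimisation of a global objective assembled from local normalized correlation scores. The Python sketch below illustrates one plausible form of such an objective under an affine warp; it is not the authors' implementation. The names `local_ncc_objective` and `local_ncc`, the patch size, the use of the absolute correlation value (to tolerate IR/TV contrast inversion), and the choice of a Nelder-Mead optimiser are all illustrative assumptions.

```python
# Hedged sketch: a global registration objective built from local normalized
# cross-correlation (NCC) scores, minimised over the parameters of a 2-D
# affine warp. Names and parameter choices are illustrative, not the paper's.
import numpy as np
from scipy.ndimage import affine_transform
from scipy.optimize import minimize

def local_ncc(a, b, eps=1e-8):
    """Normalized correlation of two equally sized patches."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum()) + eps
    return (a * b).sum() / denom

def local_ncc_objective(params, ref, mov, patch_size=32):
    """Negative mean of local |NCC| scores between `ref` and the warped `mov`.

    `params` = (a11, a12, a21, a22, ty, tx) of a 2-D affine transform.
    Minimising this value maximises local multimodal similarity.
    """
    matrix = np.array(params[:4]).reshape(2, 2)
    offset = np.array(params[4:])
    warped = affine_transform(mov, matrix, offset=offset, order=1)

    scores = []
    h, w = ref.shape
    for y in range(0, h - patch_size + 1, patch_size):
        for x in range(0, w - patch_size + 1, patch_size):
            r = ref[y:y + patch_size, x:x + patch_size]
            m = warped[y:y + patch_size, x:x + patch_size]
            # Absolute value: local contrast may invert between wavebands.
            scores.append(abs(local_ncc(r, m)))
    return -float(np.mean(scores))

# Usage example: refine an identity initialisation on a synthetic pair.
ref = np.random.rand(128, 128)
mov = np.roll(ref, shift=(2, -3), axis=(0, 1))  # small simulated misalignment
x0 = np.array([1.0, 0.0, 0.0, 1.0, 0.0, 0.0])
res = minimize(local_ncc_objective, x0, args=(ref, mov), method="Nelder-Mead")
print("estimated affine parameters:", res.x)
```

In a live system the same objective could be re-minimised periodically, seeding each run with the previous estimate, to track the slowly varying transform mentioned above.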