In this paper, we present a method for estimating the ambient illuminant from no-flash/flash image pairs. Accurate estimation of the ambient illuminant is useful for many imaging applications, but it is difficult in most of them because of the complicated interaction of illuminants, surfaces, and camera characteristics during image formation. To estimate the scene illumination, we use a version of the "illuminating illumination" method proposed by DiCarlo et al.: camera flash light is introduced into the scene, and the reflected light is used to estimate the ambient illuminant. The original method requires an extra step of estimating the object surface reflectance, using a 3-dimensional linear surface model and knowledge of the spectral responsivities of the camera sensors. Here we consider the problem of estimating the ambient illuminant directly, from flash/no-flash pairs alone, without information on surface reflectance or camera sensors. First, the flash image is registered with the no-flash image; the difference between the two yields a pure-flash image, as if the scene had been captured under the flash alone. The no-flash and pure-flash images are described by a physically based model of image formation that assumes Lambertian surfaces, Planckian lights, and narrowband camera sensors. We show that first transforming to a "spectrally sharpened" color space, and then projecting the log-domain difference of the no-flash and pure-flash images into a geometric-mean chromaticity space, yields the chromaticity of the ambient illuminant. We verify that the chromaticities of illuminants with different color temperatures fall along a line in the log geometric-mean chromaticity plane. By simply taking the nearest color temperature along this illuminant line, or by classifying into one of a set of candidate illuminants, our algorithm arrives at an estimate of the illuminant.
Remarkably, the algorithm is practical: it can estimate the color of the ambient light without any prior knowledge of surface reflectance, flash light, or camera sensors. Experiments on real images demonstrate the estimation accuracy of the proposed method.
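The core computation described above — forming the pure-flash image and projecting into a log geometric-mean chromaticity space — can be sketched as follows. This is a minimal NumPy illustration of the pixel-wise steps only; the function names and the clipping constant are our own assumptions, and the spectral-sharpening transform and the Planckian-line lookup (which depend on camera calibration) are omitted:

```python
import numpy as np

def pure_flash_image(no_flash, flash, eps=1e-6):
    """Difference of the registered flash and no-flash images:
    approximates the scene as lit by the flash alone."""
    return np.clip(flash.astype(float) - no_flash.astype(float), eps, None)

def log_geomean_chromaticity(img, eps=1e-6):
    """Divide each RGB channel by the per-pixel geometric mean and take logs.
    Under the Lambertian/Planckian/narrowband assumptions this factors out
    shading and intensity; the three log values sum to zero per pixel."""
    img = np.clip(img.astype(float), eps, None)
    gm = np.cbrt(np.prod(img, axis=-1, keepdims=True))
    return np.log(img / gm)

def ambient_log_chromaticity(no_flash, flash):
    """Log-domain difference of the no-flash and pure-flash chromaticities;
    per the model, this isolates the ambient illuminant's chromaticity."""
    return (log_geomean_chromaticity(no_flash)
            - log_geomean_chromaticity(pure_flash_image(no_flash, flash)))
```

In a full pipeline, the per-pixel vectors returned by `ambient_log_chromaticity` would then be pooled (e.g. by a robust mean) and snapped to the nearest point on the illuminant line, or classified into one of a set of candidate illuminants.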