Precise simulation of digital camera architectures requires an accurate description of how the radiance image is transformed by the optics and sampled by the image sensor array. For diffraction-limited imaging, and for all practical lenses, the width of the optical point-spread function differs at each wavelength. These differences are relatively small compared to coarse pixel sizes (6–8 μm), but as pixel size decreases to, say, 1.5–3 μm, wavelength-dependent point-spread functions have a significant impact on the sensor response. We provide a theoretical treatment of how the interaction of spatial and wavelength properties influences the response of high-resolution color imagers. We then describe a model of these factors and an experimental evaluation of the model's computational accuracy.
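To make the pixel-size effect concrete, the sketch below (not taken from the paper itself) approximates a diffraction-limited, wavelength-dependent point-spread function by a Gaussian whose width follows the Airy-disk radius 1.22λN, blurs a simple test scene at several wavelengths, and then integrates the blurred image over square pixels of different pitch. The f-number, wavelengths, grid spacing, and pixel pitches are illustrative assumptions, not values from the study.

```python
# Minimal sketch: wavelength-dependent blur matters more as pixel pitch shrinks.
# All numeric parameters below are illustrative assumptions.
import numpy as np
from scipy.ndimage import gaussian_filter

F_NUMBER = 4.0                           # assumed lens f-number
WAVELENGTHS_NM = [450.0, 550.0, 650.0]   # blue, green, red sample wavelengths
GRID_UM = 0.25                           # simulation grid spacing (microns)

def psf_sigma_um(wavelength_nm, f_number):
    """Approximate the diffraction-limited PSF by a Gaussian whose width
    scales with the Airy-disk radius r = 1.22 * lambda * N (a crude proxy)."""
    airy_radius_um = 1.22 * (wavelength_nm * 1e-3) * f_number
    return airy_radius_um / 2.0

def sensor_response(radiance, wavelength_nm, pixel_pitch_um):
    """Blur one spectral plane with its wavelength-dependent PSF,
    then integrate (average) over square pixels of the given pitch."""
    sigma_grid = psf_sigma_um(wavelength_nm, F_NUMBER) / GRID_UM
    blurred = gaussian_filter(radiance, sigma_grid)
    step = int(round(pixel_pitch_um / GRID_UM))
    h, w = blurred.shape
    h, w = h - h % step, w - w % step
    return blurred[:h, :w].reshape(h // step, step, w // step, step).mean(axis=(1, 3))

# Test scene: a single bright line on a dark background (an edge-like feature).
scene = np.zeros((400, 400))
scene[:, 200] = 1.0

for pitch_um in (6.0, 1.5):              # coarse vs. fine pixel pitch
    responses = [sensor_response(scene, wl, pitch_um) for wl in WAVELENGTHS_NM]
    spread = np.max(np.abs(responses[0] - responses[-1]))
    print(f"pixel pitch {pitch_um} um: max blue-red response difference = {spread:.4f}")
```

Under these assumptions the blue-red response difference at the line is roughly an order of magnitude larger for the 1.5 μm pitch than for the 6 μm pitch, which is the qualitative trend the abstract describes.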