When low-contrast photographic images are digitized with a very small aperture, extreme film-grain noise almost completely obliterates the image information. Using a large aperture to average out the noise destroys the fine details of the image. In these situations conventional statistical restoration techniques have little effect, and well-chosen heuristic algorithms have yielded better results. In this paper we analyze the noise-cheating algorithm of Zweig et al. [J. Opt. Soc. Am. 65, 1347 (1975)] and show that it can be justified by classical maximum-likelihood detection theory. A more general algorithm applicable to a broader class of images is then developed by considering the signal-dependent nature of film-grain noise. Finally, a Bayesian detection algorithm with improved performance is presented.
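As an illustration of the kind of maximum-likelihood detection the abstract invokes, the sketch below decides between two pixel density levels corrupted by signal-dependent Gaussian noise. It is not the paper's algorithm: the two-level model, the density values `d0`/`d1`, and the variance law `sigma = k * sqrt(d)` are all assumptions chosen for the example; the key point it demonstrates is that when noise variance depends on the signal, the ML rule compares full log-likelihoods rather than simple distances to the means.

```python
import math
import random

random.seed(0)

# Hypothetical two-level image: each pixel's true density is d0 or d1.
d0, d1 = 0.2, 0.8

# Assumed signal-dependent film-grain noise model: the noise standard
# deviation grows with density as sigma = k * sqrt(d).
k = 0.15

def sigma(d):
    return k * math.sqrt(d)

def ml_decide(x):
    """Maximum-likelihood binary detection under signal-dependent
    Gaussian noise. Because the variances differ under the two
    hypotheses, the log-likelihood keeps the -log(sigma) term."""
    def loglik(d):
        s = sigma(d)
        return -math.log(s) - (x - d) ** 2 / (2 * s * s)
    return d1 if loglik(d1) > loglik(d0) else d0

# Monte-Carlo check: draw noisy observations from each hypothesis
# and count how often the ML detector recovers the true level.
trials = 10000
correct = 0
for _ in range(trials):
    d = random.choice([d0, d1])
    x = random.gauss(d, sigma(d))
    correct += (ml_decide(x) == d)
rate = correct / trials
print(f"correct rate: {rate:.3f}")
```

With these assumed parameters the two levels are well separated relative to the noise, so the detector is nearly always correct; shrinking the gap between `d0` and `d1`, or raising `k`, degrades it in the expected way.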
[1] D. Middleton, An Introduction to Statistical Communication Theory, 1960.
[2] I. M. Jacobs et al., Principles of Communication Engineering, 1965.
[3] G. C. Higgins et al., "Experimental Study of rms Granularity as a Function of Scanning-Spot Size," 1959.
[4] C. E. Kenneth Mees, The Theory of the Photographic Process, 1907.
[5] John F. Walkup et al., "Image Processing in Signal-Dependent Noise," 1974.
[6] A. A. Sawchuk et al., "Estimation of images degraded by film-grain noise," Applied Optics, 1978.
[7] I. Miller, Probability, Random Variables, and Stochastic Processes, 1966.
[8] Eamon B. Barrett et al., "Noise-cheating image enhancement," 1975.