Digital Removal of Random Media Image Degradations by Solving the Diffusion Equation Backwards in Time

We consider the image restoration problem for Gaussian-like point spread functions and reformulate it as an initial value problem for the backwards diffusion equation. This reformulation yields rigorous bounds on the reliability of the restoration, as a function of the noise variance, without any assumptions on the spectral characteristics of either signal or noise. In the latter half of the paper, we describe a powerful algorithm based on this reformulation and successfully use it to restore a turbulence-degraded image. Typically, a complete restoration and display of a $128 \times 128$ image requires 10 seconds of CDC 7600 computing time. We also describe a restoration experiment in which Gaussian blur was simulated and multiplicative noise added according to Huang’s model. The algorithm performs competently even at low signal-to-noise ratios.
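The core idea can be illustrated in Fourier space: Gaussian blurring is equivalent to running the heat (diffusion) equation forward in time, since the heat semigroup acts as the Fourier multiplier $e^{-k^2 t}$, so deblurring amounts to solving the diffusion equation backwards. The backward problem is ill-posed, because the inverse multiplier $e^{+k^2 t}$ amplifies high-frequency noise. The following 1-D sketch stabilizes it with a crude spectral truncation; this is an illustrative assumption, not the restoration algorithm of the paper, and the function names and the `cutoff` parameter are hypothetical.

```python
import numpy as np

def heat_semigroup(u, t):
    # Evolve u forward under the heat equation for time t:
    # multiply each Fourier mode by exp(-k^2 t), i.e. Gaussian blur.
    k = 2.0 * np.pi * np.fft.fftfreq(u.size)
    return np.real(np.fft.ifft(np.fft.fft(u) * np.exp(-(k**2) * t)))

def backward_restore(g, t, cutoff):
    # Solve the diffusion equation backwards: amplify each mode by
    # exp(+k^2 t), zeroing modes whose gain exceeds `cutoff` to keep
    # the ill-posed inversion from blowing up measurement noise.
    k = 2.0 * np.pi * np.fft.fftfreq(g.size)
    gain = np.exp((k**2) * t)
    gain[gain > cutoff] = 0.0  # crude regularization by spectral truncation
    return np.real(np.fft.ifft(np.fft.fft(g) * gain))

# Demo: blur a smooth profile, then restore it by backwards diffusion.
x = np.linspace(0.0, 1.0, 128)
u = np.exp(-((x - 0.5) / 0.1) ** 2)   # sharp "scene"
g = heat_semigroup(u, 0.5)            # blurred observation
restored = backward_restore(g, 0.5, 1e3)
```

In the noiseless case the inversion is essentially exact; with noise present, the choice of `cutoff` trades resolution against noise amplification, which is the trade-off the paper's reliability bounds quantify.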