Selection of the Regularization Parameter

The success of all currently available regularization techniques depends heavily on a proper choice of the regularization parameter. Although many regularization parameter selection methods (RPSMs) have been proposed, very few are used in engineering practice, because theoretically justified methods often require unrealistic assumptions, while empirical methods do not guarantee a good regularization parameter for an arbitrary data set. Among the methods that have found their way into engineering applications, the most common are Morozov's discrepancy principle (MDP) [morozov84, phillips62], Mallows' CL [mallows73], generalized cross validation (GCV) [wahba90], and the L-curve method [hansen98]. The high sensitivity of CL and MDP to underestimation of the noise level limits their application to cases in which the noise level can be estimated with high fidelity [hansen98]. GCV, which requires no noise estimate, occasionally fails, presumably in the presence of correlated noise [wahba90]. The L-curve method is widely used; however, it is nonconvergent [leonov97, vogel96]. An example of image restoration using different values of the regularization parameter is shown in Figs. 2.1, 2.2, 2.3, 2.4, and 2.5; the Matlab code for this example was provided by Dr. Curt Vogel of Montana State University in a personal communication. The original image is presented in Fig. 2.1, and the observed blurred image in Fig. 2.2.
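To make the comparison of selection rules concrete, the sketch below (in Python, not the Matlab example referenced above) applies Tikhonov regularization to a small synthetic 1-D deblurring problem and chooses the parameter both by GCV and by Morozov's discrepancy principle. The Gaussian blurring kernel, grid size, noise level, and the grid of candidate parameters are illustrative assumptions, not quantities taken from the chapter.

```python
# Minimal sketch: Tikhonov regularization of a synthetic 1-D deblurring problem,
# with the regularization parameter chosen by (i) GCV and (ii) Morozov's
# discrepancy principle (MDP). Kernel width, grid size, and noise level are
# illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic ill-posed problem: Gaussian blur of a smooth signal.
n = 100
t = np.linspace(0.0, 1.0, n)
h = 1.0 / n
A = h * np.exp(-(t[:, None] - t[None, :]) ** 2 / (2 * 0.02**2))   # blurring matrix
x_true = np.exp(-(t - 0.4) ** 2 / 0.01) + 0.5 * np.exp(-(t - 0.7) ** 2 / 0.003)
sigma = 1e-3                         # noise standard deviation (assumed known for MDP)
b = A @ x_true + sigma * rng.standard_normal(n)

# The SVD makes the Tikhonov filter factors cheap to evaluate.
U, s, Vt = np.linalg.svd(A)
beta = U.T @ b                       # spectral coefficients of the data

def tikhonov(lam):
    """Tikhonov solution and filter factors for regularization parameter lam."""
    phi = s**2 / (s**2 + lam**2)                 # filter factors
    x_lam = Vt.T @ (s / (s**2 + lam**2) * beta)
    return x_lam, phi

lambdas = np.logspace(-7, 0, 200)
gcv = np.empty_like(lambdas)
resid = np.empty_like(lambdas)
for k, lam in enumerate(lambdas):
    _, phi = tikhonov(lam)
    r2 = np.sum(((1.0 - phi) * beta) ** 2)       # squared residual norm
    resid[k] = np.sqrt(r2)
    gcv[k] = r2 / (n - np.sum(phi)) ** 2         # GCV function (up to a constant factor)

# GCV: minimize G(lambda); no noise estimate is needed.
lam_gcv = lambdas[np.argmin(gcv)]

# MDP: smallest lambda whose residual reaches the (known) noise level.
delta = sigma * np.sqrt(n)
lam_mdp = lambdas[np.argmax(resid >= delta)]

for name, lam in [("GCV", lam_gcv), ("MDP", lam_mdp)]:
    x_lam, _ = tikhonov(lam)
    rel_err = np.linalg.norm(x_lam - x_true) / np.linalg.norm(x_true)
    print(f"{name}: lambda = {lam:.2e}, relative error = {rel_err:.3f}")
```

The sketch mirrors the trade-off discussed above: MDP needs the noise level sigma explicitly and degrades when that estimate is poor, whereas GCV works from the data alone.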

[1] S. Kullback and R. A. Leibler, On Information and Sufficiency, 1951.

[2] H. Akaike, Information Theory and an Extension of the Maximum Likelihood Principle, 1973.

[3] C. L. Mallows, Some Comments on C_p, Technometrics, 1973.

[4] P. C. Hansen, Rank-Deficient and Discrete Ill-Posed Problems, 1996.

[5] A. V. Gribok et al., Information Complexity-Based Regularization Parameter Selection for Solution of Ill Conditioned Inverse Problems, 2002.

[6] H. Bozdogan, Model Selection and Akaike's Information Criterion (AIC): The General Theory and Its Analytical Extensions, 1987.

[7] R. Shibata, Statistical Aspects of Model Selection, 1989.

[8] D. L. Phillips, A Technique for the Numerical Solution of Certain Integral Equations of the First Kind, JACM, 1962.

[9] V. A. Morozov, Methods for Solving Incorrectly Posed Problems, 1984.

[10] P. J. Huber, Robust Statistics, Wiley Series in Probability and Statistics, 2005.

[11] D. Haughton et al., Informational Complexity Criteria for Regression Models, 1998.

[12] C. Vogel, Non-convergence of the L-curve Regularization Parameter Selection Method, 1996.

[13] P. C. Hansen and D. P. O'Leary, The Use of the L-Curve in the Regularization of Discrete Ill-Posed Problems, SIAM J. Sci. Comput., 1993.

[14] S. Konishi and G. Kitagawa, Generalised Information Criteria in Model Selection, 1996.

[15] C. Vogel, Computational Methods for Inverse Problems, 2002.

[16] A. S. Leonov and A. G. Yagola, The L-curve Method Always Introduces a Nonremovable Systematic Error, 1997.

[17] P. C. Hansen, REGULARIZATION TOOLS: A Matlab Package for Analysis and Solution of Discrete Ill-Posed Problems, Numerical Algorithms, 1994.

[18] M. H. van Emden, An Analysis of Complexity, 1971.