A gradient projection conjugate gradient algorithm for Bayesian PET reconstruction

In the Bayesian PET reconstruction problem, conjugate gradient (CG) algorithms were previously shown to have more favorable convergence rates than expectation maximization (EM) type algorithms. CG algorithms, however, are not easily applicable because of the non-negativity constraint. Earlier, the authors addressed this problem by augmenting the log-posterior density function with a penalty function and using an appropriate preconditioner. Here, an active set approach is used that avoids some inherent problems of the penalty function method. The method simultaneously estimates the "zero" variables (the active set) and maximizes the cost function over the remaining (free) variables by applying the following stages in turn: (i) an unconstrained CG step in the free variables, followed by a bent line search; (ii) a gradient projection step to select a new active set. With this gradient projection conjugate gradient algorithm, the authors retain fast convergence while avoiding the parameter-selection problem inherent in their previous penalty function approach.
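To illustrate the two-stage structure described above, the following is a minimal sketch of a gradient projection conjugate gradient loop for maximizing a concave objective under a non-negativity constraint. The toy quadratic objective, the function names, and the stopping rules are assumptions for illustration only; they stand in for, and do not reproduce, the paper's PET log-posterior or its preconditioned implementation.

```python
import numpy as np

def gpcg_maximize(A, b, x0, outer_iters=20, cg_iters=5, tol=1e-8):
    """Sketch of a gradient-projection CG scheme for maximizing the concave
    quadratic f(x) = b.x - 0.5 x.A.x subject to x >= 0 (A symmetric PD).
    The quadratic is a hypothetical stand-in for the log-posterior."""
    x = np.maximum(x0, 0.0)

    def f(x):
        return b @ x - 0.5 * x @ (A @ x)

    def grad(x):
        return b - A @ x

    for _ in range(outer_iters):
        # Stage (ii): gradient projection step to pick a new active set.
        g, fx, t = grad(x), f(x), 1.0
        while True:                              # backtracking projected step
            x_gp = np.maximum(x + t * g, 0.0)
            if f(x_gp) >= fx or t < 1e-12:
                break
            t *= 0.5
        x = x_gp

        # Active set: variables pinned at zero whose gradient is non-positive.
        g = grad(x)
        free = ~((x == 0.0) & (g <= 0.0))

        # Stage (i): CG in the free variables, followed by a bent line search.
        r = np.where(free, grad(x), 0.0)         # gradient on free variables
        d = r.copy()
        for _ in range(cg_iters):
            dAd = d @ (A @ d)
            if dAd <= tol:
                break
            alpha = (r @ d) / dAd                # exact step for the quadratic
            # Bent line search: project the trial point onto x >= 0 and
            # backtrack if the projected point does not improve the objective.
            fx, step = f(x), alpha
            while True:
                x_new = np.maximum(x + step * d, 0.0)
                if f(x_new) >= fx or step < 1e-12:
                    break
                step *= 0.5
            x = x_new
            r_new = np.where(free, grad(x), 0.0)
            beta = (r_new @ r_new) / max(r @ r, tol)   # Fletcher-Reeves update
            d, r = r_new + beta * d, r_new

        if np.linalg.norm(np.where(free, grad(x), 0.0)) < tol:
            break
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    M = rng.standard_normal((6, 6))
    A = M @ M.T + 6 * np.eye(6)                  # symmetric positive definite
    b = rng.standard_normal(6)
    x = gpcg_maximize(A, b, np.zeros(6))
    print("solution:", np.round(x, 4))
    print("all nonnegative:", bool(np.all(x >= 0)))
```

In this sketch the active set is re-estimated after each projected-gradient step, and the CG iterations never move the active variables, which mirrors the division of labor between the two stages described in the abstract.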