Preconditioned conjugate gradient method for generalized least squares problems

Abstract A variant of the preconditioned conjugate gradient method to solve generalized least squares problems is presented. If the problem is $\min_x \, (Ax - b)^T W^{-1} (Ax - b)$ with $A \in \mathbb{R}^{m \times n}$ and $W \in \mathbb{R}^{m \times m}$ symmetric and positive definite, the method needs only a preconditioner $A_1 \in \mathbb{R}^{n \times n}$, but not the inverse of the matrix $W$ or of any of its submatrices. Freund's comparison result for regular least squares problems is extended to generalized least squares problems. An error bound is also given.
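To make the problem concrete: when $A$ has full column rank, the minimizer of $(Ax - b)^T W^{-1} (Ax - b)$ solves the normal equations $A^T W^{-1} A x = A^T W^{-1} b$. The following is a minimal sketch of the textbook baseline, i.e. PCG applied to these normal equations with a simple Jacobi preconditioner. It deliberately factorizes $W$ to apply $W^{-1}$, which is exactly the cost the method of this paper avoids; the function name, preconditioner choice, and tolerances are illustrative assumptions, not the paper's algorithm.

```python
import numpy as np
from scipy.linalg import cho_factor, cho_solve

def gls_pcg_baseline(A, b, W, tol=1e-10, max_iter=None):
    """Textbook PCG on the GLS normal equations
    (A^T W^{-1} A) x = A^T W^{-1} b, W symmetric positive definite.
    Baseline only: uses a Cholesky factorization of W, which the
    paper's variant is designed to avoid."""
    m, n = A.shape
    max_iter = max_iter or 10 * n
    cW = cho_factor(W)                    # W = L L^T (W SPD assumed)
    N = A.T @ cho_solve(cW, A)            # normal matrix A^T W^{-1} A
    rhs = A.T @ cho_solve(cW, b)          # right-hand side A^T W^{-1} b
    Minv = 1.0 / np.diag(N)               # Jacobi preconditioner (illustrative)
    x = np.zeros(n)
    r = rhs - N @ x                       # initial residual
    z = Minv * r                          # preconditioned residual
    p = z.copy()
    rz = r @ z
    for _ in range(max_iter):
        Np = N @ p
        alpha = rz / (p @ Np)             # step length
        x += alpha * p
        r -= alpha * Np
        if np.linalg.norm(r) <= tol * np.linalg.norm(rhs):
            break
        z = Minv * r
        rz_new = r @ z
        p = z + (rz_new / rz) * p         # conjugate search direction
        rz = rz_new
    return x

# Usage on a small synthetic GLS problem with diagonal SPD W:
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 8))
W = np.diag(rng.uniform(0.5, 2.0, 50))
b = A @ rng.standard_normal(8) + 0.01 * rng.standard_normal(50)
x = gls_pcg_baseline(A, b, W)
```

The point of contrast is that this baseline needs two solves with $W$ per setup (and, matrix-free, one per iteration), whereas the abstract's claim is that the proposed variant works with only an $n \times n$ preconditioner $A_1$ and no inverse of $W$ or its submatrices.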