Conjugate Gradient Methods for Constrained Least Squares Problems
Abstract: In 1988, Barlow, Nichols, and Plemmons proposed an order-reducing conjugate gradient algorithm for solving constrained least squares problems. They proved that this method, which we call algorithm BNP (after Barlow, Nichols, and Plemmons), is superior to p-cyclic SOR (successive overrelaxation) in exact arithmetic. Here we continue the study of algorithm BNP. We identify and correct a source of instability in the original algorithm, and develop a parallel version suitable for substructured problems. We prove that BNP is superior to block accelerated overrelaxation (AOR), and establish a connection between BNP and a preconditioned form of the weighting method. We also show that BNP can be viewed as a nullspace method in which a distinguished basis for the nullspace is used but never formed. Finally, we exploit this nullspace characterization to extend BNP, producing a class of algorithms we call implicit nullspace methods. These methods allow great flexibility in the choice of preconditioner, and can be used to solve problems for which BNP is not well suited. Like BNP, the extensions are suitable for parallel implementation on substructured problems. Experiments on structural engineering and Stokes flow models suggest that BNP and its extensions offer a competitive alternative to existing iterative algorithms for solving constrained least squares problems. The appendix describes a mechanism which can cause the breakdown of incomplete QR factorizations. Keywords: BNP, SOR, AOR, Theoretical mathematics, Algorithm, LSE, Numerical analysis, Differential equations, Constrained minimization problem, Constrained gradient methods.
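To make the nullspace characterization concrete: a (direct, non-iterative) nullspace method for the equality-constrained least squares problem min ||Ax - b|| subject to Cx = d forms an explicit basis Z for the nullspace of C and solves a reduced, unconstrained problem. The sketch below, using NumPy, is illustrative only; it is not the BNP algorithm itself, which as the abstract notes uses a distinguished nullspace basis without ever forming it. The function name `lse_nullspace` and the SVD-based basis construction are our own choices, not from the source.

```python
import numpy as np

def lse_nullspace(A, b, C, d, rtol=1e-12):
    """Solve min ||A x - b||_2 subject to C x = d via an explicit
    nullspace basis (illustrative; BNP never forms the basis)."""
    # Any particular solution of the constraints: C x_p = d.
    x_p, *_ = np.linalg.lstsq(C, d, rcond=None)
    # Orthonormal nullspace basis Z of C from the SVD:
    # the right singular vectors beyond rank(C) span null(C).
    _, s, Vt = np.linalg.svd(C)
    rank = int(np.sum(s > rtol * s[0]))
    Z = Vt[rank:].T
    # Every feasible x has the form x = x_p + Z y, so minimize
    # ||A(x_p + Z y) - b|| as an unconstrained problem in y.
    y, *_ = np.linalg.lstsq(A @ Z, b - A @ x_p, rcond=None)
    return x_p + Z @ y
```

The implicit nullspace methods of the paper avoid the cost and fill-in of forming Z explicitly, which is what makes the choice of preconditioner flexible.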