A Residual Replacement Strategy for Improving the Maximum Attainable Accuracy of Communication-Avoiding Krylov Subspace Methods

Abstract: The behavior of conventional Krylov Subspace Methods (KSMs) in finite precision arithmetic is a well-studied problem. The finite precision Lanczos process, which drives convergence of these methods, can lead to a significant deviation between the recursively computed residual and the true residual, b - Ax_k, decreasing the maximum attainable accuracy of the solution. Van der Vorst and Ye [24] have advocated the use of a residual replacement strategy for KSMs to prevent the accumulation of this error, in which the computed residual is replaced by the true residual at specific iterations chosen such that the Lanczos process is undisturbed.

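To make the idea concrete, below is a minimal sketch of residual replacement inside a plain conjugate gradient loop. It is not the algorithm from this paper, and the replacement test is a simplified, assumed stand-in for van der Vorst and Ye's criterion: the drift estimate d and the sqrt(eps) threshold are illustrative only, and the routine name cg_with_residual_replacement is hypothetical.

```python
import numpy as np

def cg_with_residual_replacement(A, b, x0=None, tol=1e-10, maxit=1000):
    """Conjugate gradient with a simplified residual replacement step (sketch)."""
    eps = np.finfo(float).eps
    x = np.zeros_like(b) if x0 is None else x0.copy()
    r = b - A @ x              # recursively updated residual from here on
    p = r.copy()
    rs_old = r @ r
    d = 0.0                    # crude running estimate of the deviation from the true residual
    norm_A = np.linalg.norm(A, 1)
    norm_b = np.linalg.norm(b)
    for k in range(maxit):
        Ap = A @ p
        alpha = rs_old / (p @ Ap)
        x = x + alpha * p
        r = r - alpha * Ap     # cheap recursive update; drifts from b - A @ x in finite precision
        # accumulate a rounding-error bound for the drift (assumed, simplified model)
        d += eps * (norm_A * np.linalg.norm(x) + norm_b)
        # replacement test: reset the residual when the drift estimate is no longer
        # negligible relative to the current residual norm
        if d > np.sqrt(eps) * np.linalg.norm(r):
            r = b - A @ x      # residual replacement with the true residual
            d = 0.0
        rs_new = r @ r
        if np.sqrt(rs_new) <= tol * norm_b:
            break
        p = r + (rs_new / rs_old) * p
        rs_old = rs_new
    return x
```

In the actual strategy advocated by van der Vorst and Ye, replacements are restricted to iterations at which they do not disturb the underlying Lanczos recurrence, so convergence of the recursive residual is preserved; the threshold test above is only a placeholder for that analysis.
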
[1] G. Meurant, The Lanczos and Conjugate Gradient Algorithms, 2008.

[2] Eric de Sturler et al., A Performance Model for Krylov Subspace Methods on Mesh-Based Parallel Computers, Parallel Comput., 1996.

[3] Gerard L. G. Sleijpen et al., Reliable Updated Residuals in Hybrid Bi-CG Methods, Computing, 1996.

[4] L. Reichel et al., A Newton Basis GMRES Implementation, 1994.

[5] Mark Hoemmen et al., Communication-Avoiding Krylov Subspace Methods, 2010.

[6] Lothar Reichel et al., On the Generation of Krylov Subspace Bases, 2012.

[7] Dennis Gannon et al., On the Impact of Communication Complexity on the Design of Parallel Numerical Algorithms, IEEE Transactions on Computers, 1984.

[8] J. Demmel et al., Avoiding Communication in Computing Krylov Subspaces, 2007.

[9] W. Joubert et al., Parallelizable Restarted Iterative Methods for Nonsymmetric Linear Systems. Part I: Theory, 1992.

[10] H. Walker, Implementation of the GMRES Method Using Householder Transformations, 1988.
[11] J. Van Rosendale, Minimizing Inner Product Data Dependencies in Conjugate Gradient Iteration, 1983.

[12] Qiang Ye et al., Analysis of the Finite Precision Bi-Conjugate Gradient Algorithm for Nonsymmetric Linear Systems, Math. Comput., 2000.

[13] Sivan Toledo, Quantitative Performance Modeling of Scientific Computations and Creating Locality in Numerical Algorithms, 1995.

[14] Sivan Toledo et al., Efficient Out-of-Core Algorithms for Linear Relaxation Using Blocking Covers, 2007.

[15] Graham F. Carey et al., Parallelizable Restarted Iterative Methods for Nonsymmetric Linear Systems, PPSC, 1991.

[16] Anthony T. Chronopoulos et al., Parallel Iterative S-Step Methods for Unsymmetric Linear Systems, Parallel Comput., 1996.

[17] Miroslav Rozlozník et al., Modified Gram-Schmidt (MGS), Least Squares, and Backward Stability of MGS-GMRES, SIAM J. Matrix Anal. Appl., 2006.

[18] A. Greenbaum, Estimating the Attainable Accuracy of Recursively Computed Residual Methods, SIAM J. Matrix Anal. Appl., 1997.

[19] Anthony T. Chronopoulos et al., On the Efficient Implementation of Preconditioned s-Step Conjugate Gradient Methods on Multiprocessors with Memory Hierarchy, Parallel Comput., 1989.

[20] H. Walker et al., Note on a Householder Implementation of the GMRES Method, 1986.

[21] James Demmel et al., Minimizing Communication in Sparse Matrix Solvers, Proceedings of the Conference on High Performance Computing Networking, Storage and Analysis, 2009.

[22] Anthony T. Chronopoulos et al., s-Step Iterative Methods for Symmetric Linear Systems, 1989.

[23] Qiang Ye et al., Residual Replacement Strategies for Krylov Subspace Iterative Methods for the Convergence of True Residuals, SIAM J. Sci. Comput., 2000.

[24] Walter Gautschi, The Condition of Polynomials in Power Form, 1979.

[25] James Demmel et al., Avoiding Communication in Two-Sided Krylov Subspace Methods, 2011.

[26] G. Meurant et al., The Lanczos and Conjugate Gradient Algorithms in Finite Precision Arithmetic, Acta Numerica, 2006.