Minimal Residual Method Stronger than Polynomial Preconditioning

This paper compares the convergence behavior of two popular iterative methods for solving systems of linear equations: the $s$-step restarted minimal residual method (commonly implemented by algorithms such as GMRES($s$)) and $(s-1)$-degree polynomial preconditioning. It is known that for normal matrices, and in particular for symmetric positive definite matrices, the convergence bounds for the two methods are the same. In this paper we demonstrate that for matrices unitarily equivalent to an upper triangular Toeplitz matrix, a similar result holds; namely, either both methods converge or both fail to converge. However, we show this result cannot be generalized to all matrices. Specifically, we develop a method, based on convexity properties of the generalized field of values of powers of the iteration matrix, to obtain examples of real matrices for which GMRES($s$) converges for every initial vector, but every $(s-1)$-degree polynomial preconditioning stagnates or diverges for some initial vector.
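The relationship between the two methods can be illustrated numerically. One cycle of GMRES($s$) started from residual $r_0$ produces the residual $p(A)r_0$ with the smallest norm over all polynomials $p$ of degree at most $s$ with $p(0)=1$, whereas $(s-1)$-degree polynomial preconditioning applies one *fixed* such polynomial. The sketch below (not the paper's construction; the matrix, the Richardson-type residual polynomial $(1 - z/\lambda_{\max})^s$, and all names are illustrative assumptions) checks this optimality on a symmetric positive definite, hence normal, matrix.

```python
import numpy as np

def gmres_cycle_residual(A, r0, s):
    """Residual norm after one GMRES(s) cycle starting from residual r0.

    Minimizes ||r0 - A K c|| over the Krylov basis K = [r0, A r0, ..., A^{s-1} r0],
    which ranges over all residuals p(A) r0 with deg p <= s and p(0) = 1.
    """
    n = len(r0)
    K = np.empty((n, s))
    v = r0.copy()
    for j in range(s):
        K[:, j] = v
        v = A @ v
    AK = A @ K
    c, *_ = np.linalg.lstsq(AK, r0, rcond=None)
    return np.linalg.norm(r0 - AK @ c)

def fixed_poly_residual(A, r0, s):
    """Residual norm for one fixed residual polynomial of degree s.

    Illustrative Richardson-type choice: 1 - z q(z) = (1 - z/lam_max)^s,
    i.e. q is a fixed polynomial preconditioner of degree s - 1.
    """
    lam_max = np.linalg.eigvalsh(A).max()
    r = r0.copy()
    for _ in range(s):
        r = r - (A @ r) / lam_max
    return np.linalg.norm(r)

rng = np.random.default_rng(0)
n, s = 50, 4
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)   # symmetric positive definite, hence normal
r0 = rng.standard_normal(n)

res_gmres = gmres_cycle_residual(A, r0, s)
res_poly = fixed_poly_residual(A, r0, s)
# Per-cycle optimality: the GMRES(s) residual cannot exceed that of any
# fixed degree-(s-1) polynomial preconditioner for the same r0.
assert res_gmres <= res_poly + 1e-12
```

This per-cycle inequality is why the convergence bounds coincide in the normal case; the paper's contribution is that over repeated cycles and over all initial vectors the comparison can genuinely separate the two methods for non-normal matrices.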