Backward errors for eigenvalue and singular value decompositions

Summary. We present bounds on the backward errors for the symmetric eigenvalue decomposition and the singular value decomposition in the two-norm and in the Frobenius norm. Through different orthogonal decompositions of the computed eigenvectors we can define different symmetric backward errors for the eigenvalue decomposition. When the computed eigenvectors have a small residual and are close to orthonormal, all backward errors tend to be small. Consequently, it does not matter precisely how a backward error is defined, or precisely how the residual and the deviation from orthogonality are measured. Analogous results hold for the singular vectors. We indicate the effect of our error bounds on implementations for eigenvector and singular vector computation. In a more general context we prove that the distance of an appropriately scaled matrix to its orthogonal QR factor is not much larger than its distance to the closest orthogonal matrix.
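The following minimal sketch, not part of the paper itself, illustrates the quantities the summary refers to under the assumption that NumPy is available: for a computed eigenvector matrix V of a symmetric A, the residual norm, the deviation from orthonormality, and the distance of V to its orthogonal QR factor compared with its distance to the closest orthogonal matrix (the orthogonal polar factor). The variable names and the random test matrix are illustrative choices only.

```python
# Illustrative sketch (assumes NumPy); not the paper's algorithm, only the
# quantities mentioned in the summary, evaluated on a random symmetric matrix.
import numpy as np

rng = np.random.default_rng(0)
n = 50

# Symmetric test matrix and its computed eigendecomposition A V ~= V diag(lam).
A = rng.standard_normal((n, n))
A = (A + A.T) / 2
lam, V = np.linalg.eigh(A)

# Residual and deviation from orthonormality of the computed eigenvectors,
# both measured here in the two-norm.
residual = np.linalg.norm(A @ V - V * lam, 2)
orth_dev = np.linalg.norm(V.T @ V - np.eye(n), 2)

# Orthogonal QR factor of V (normalized so that R has a positive diagonal) ...
Q, R = np.linalg.qr(V)
signs = np.sign(np.diag(R))
signs[signs == 0] = 1
Q = Q * signs

# ... versus the closest orthogonal matrix, i.e. the orthogonal polar factor
# U W^T obtained from the SVD V = U S W^T.
U, _, Wt = np.linalg.svd(V)
P = U @ Wt

dist_qr = np.linalg.norm(V - Q, 2)
dist_polar = np.linalg.norm(V - P, 2)

print(f"residual              : {residual:.2e}")
print(f"orthogonality dev.    : {orth_dev:.2e}")
print(f"dist to QR factor     : {dist_qr:.2e}")
print(f"dist to polar factor  : {dist_polar:.2e}")
```

For eigenvectors computed by a backward-stable routine, all four printed quantities are tiny, and the two distances are of comparable size, which is the behavior the summary's last claim describes for appropriately scaled matrices.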