Extensions of block-projections methods with relaxation parameters to inconsistent and rank-deficient least-squares problems

There exist many classes of block-projections algorithms for approximating solutions of linear least-squares problems. Generally, these methods generate sequences that converge to the minimal-norm least-squares solution only for consistent problems. In the inconsistent case, which usually arises in practice because of approximation or measurement errors, these sequences either no longer converge to a least-squares solution or converge to the minimal-norm solution of a "perturbed" problem. In the present paper, we overcome this difficulty by constructing extensions for almost all of the above classes of block-projections methods. We prove that the sequences generated with these extensions always converge to a least-squares solution and, with a suitable initial approximation, to the minimal-norm solution of the problem. Numerical experiments, described in the last section of the paper, confirm the theoretical results.
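To make the idea concrete, the following is a minimal sketch, not taken from the paper, of a Kaczmarz-type row-projection iteration with a relaxation parameter, extended in the spirit described above: the right-hand side b is iteratively stripped of its component outside the range of A, so that the projections act on a consistent "corrected" system. The function name extended_kaczmarz, the relaxation and sweep parameters, and the small test problem are illustrative assumptions standing in for a general block-projections method.

```python
import numpy as np

def extended_kaczmarz(A, b, x0=None, relaxation=1.0, sweeps=500):
    """Row-projection (Kaczmarz-type) sweeps with relaxation, extended to
    inconsistent systems: z is driven toward the component of b orthogonal
    to range(A), and the x-sweeps use the corrected right-hand side b - z."""
    m, n = A.shape
    x = np.zeros(n) if x0 is None else x0.astype(float).copy()
    z = b.astype(float).copy()
    col_norms = np.sum(A * A, axis=0)
    row_norms = np.sum(A * A, axis=1)
    for _ in range(sweeps):
        # Orthogonalize z against the columns of A (alternating projections
        # onto the hyperplanes orthogonal to each column).
        for j in range(n):
            if col_norms[j] > 0:
                z -= (A[:, j] @ z) / col_norms[j] * A[:, j]
        # Relaxed Kaczmarz sweep on the corrected system A x = b - z.
        for i in range(m):
            if row_norms[i] > 0:
                r = (b[i] - z[i]) - A[i, :] @ x
                x += relaxation * r / row_norms[i] * A[i, :]
    return x

# Usage on an inconsistent, rank-deficient system; with x0 = 0 the iterates
# approach the minimal-norm least-squares solution computed by lstsq.
A = np.array([[1., 0., 1.],
              [0., 1., 1.],
              [1., 1., 2.],
              [1., 0., 1.]])
b = np.array([1., 2., 3.5, 0.5])   # b is not in range(A)
x = extended_kaczmarz(A, b)
x_ref = np.linalg.lstsq(A, b, rcond=None)[0]
print(np.allclose(x, x_ref, atol=1e-6))
```

Starting from x0 = 0 keeps every iterate in the row space of A, which is why the limit is the minimal-norm least-squares solution rather than an arbitrary one; this mirrors the role of the "suitable initial approximation" mentioned in the abstract.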