September 1994
LIDS-P-2267

A HYBRID INCREMENTAL GRADIENT METHOD FOR LEAST SQUARES PROBLEMS 1

The LMS method for linear least squares problems differs from the steepest descent method in that it processes data blocks one-by-one, with intermediate adjustment of the parameter vector under optimization. This mode of operation often leads to faster convergence when far from the eventual limit, and to slower (sublinear) convergence when close to the optimal solution. We embed both LMS and steepest descent, as well as other intermediate methods, within a one-parameter class of algorithms, and we propose a hybrid method that combines the faster early convergence rate of LMS with the faster ultimate linear convergence rate of steepest descent.

1 Research supported by NSF under Grant 9300494-DMI.
2 Department of Electrical Engineering and Computer Science, M.I.T., Cambridge, Mass., 02139.
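The difference between the two update modes described above can be sketched as follows: steepest descent takes one gradient step using all the data, while LMS sweeps over the data blocks and adjusts the parameter vector after each block. The data, block size, and step sizes below are illustrative assumptions, not values from the paper.

```python
import numpy as np

def steepest_descent_step(x, A, b, step):
    # One step using the full gradient of (1/2)||Ax - b||^2 over all data.
    return x - step * A.T @ (A @ x - b)

def lms_sweep(x, blocks, step):
    # One pass over the data: x is adjusted after each block (A_i, b_i),
    # so later blocks see a parameter vector already updated by earlier ones.
    for A_i, b_i in blocks:
        x = x - step * A_i.T @ (A_i @ x - b_i)
    return x

# Synthetic least squares problem split into 10 data blocks (illustrative).
rng = np.random.default_rng(0)
A = rng.standard_normal((100, 5))
b = rng.standard_normal(100)
blocks = [(A[i:i + 10], b[i:i + 10]) for i in range(0, 100, 10)]

x_sd = np.zeros(5)
x_lms = np.zeros(5)
for _ in range(50):
    x_sd = steepest_descent_step(x_sd, A, b, step=1e-3)
    x_lms = lms_sweep(x_lms, blocks, step=1e-3)
```

With a small constant step size both iterations reduce the residual; the paper's point is that the intermediate adjustments of LMS typically help early on, while its convergence near the solution degrades to sublinear unless the step size is diminished.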