Training recurrent networks with a block-diagonal approximated Levenberg-Marquardt algorithm

We propose a block-diagonal matrix to approximate the Hessian matrix in the Levenberg-Marquardt method for training neural networks. Two weight-updating strategies, namely asynchronous and synchronous updating, are investigated. The asynchronous method updates the weights of one block at a time, while the synchronous method updates all weights simultaneously. Variations of these two methods, which involve the determination of the parameters μ and λ, are also examined.
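As an illustrative sketch (not the paper's implementation), the two updating strategies can be contrasted on a toy least-squares problem. The block partition, the linear model, and the damping value `mu` below are assumptions made for the example: the Gauss-Newton Hessian JᵀJ is approximated by its diagonal blocks only, and each block is updated with a damped Newton step.

```python
import numpy as np

# Hypothetical sketch: one block-diagonal Levenberg-Marquardt step on a
# toy least-squares problem. The weights are partitioned into blocks and
# the Gauss-Newton Hessian J^T J is approximated by its diagonal blocks.

rng = np.random.default_rng(0)
X = rng.standard_normal((50, 4))            # toy design matrix (Jacobian)
w_true = np.array([1.0, -2.0, 0.5, 3.0])
y = X @ w_true                              # noiseless targets

blocks = [np.array([0, 1]), np.array([2, 3])]   # assumed weight partition
mu = 1e-3                                       # LM damping parameter

def loss(w):
    r = X @ w - y
    return 0.5 * r @ r

def sync_step(w):
    """Synchronous: every block is updated from the same residual."""
    r = X @ w - y
    g = X.T @ r                                 # full gradient
    w_new = w.copy()
    for idx in blocks:
        Jb = X[:, idx]                          # Jacobian columns of this block
        Hb = Jb.T @ Jb + mu * np.eye(len(idx))  # damped diagonal block of J^T J
        w_new[idx] -= np.linalg.solve(Hb, g[idx])
    return w_new

def async_step(w):
    """Asynchronous: one block at a time, residual refreshed in between."""
    w = w.copy()
    for idx in blocks:
        r = X @ w - y                           # recompute after each block update
        Jb = X[:, idx]
        Hb = Jb.T @ Jb + mu * np.eye(len(idx))
        w[idx] -= np.linalg.solve(Hb, Jb.T @ r)
    return w

w0 = np.zeros(4)
loss_before = loss(w0)
loss_sync = loss(sync_step(w0))
loss_async = loss(async_step(w0))
```

Both steps reduce the loss on this problem; the asynchronous (block Gauss-Seidel style) step uses fresher residual information per block, while the synchronous (block Jacobi style) step computes all block updates from one shared residual.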