Hybrid learning algorithm with low input-to-output mapping sensitivity for iterated time-series prediction

A hybrid backpropagation/Hebbian learning rule has been developed to enforce low input-to-output mapping sensitivity in feedforward neural networks. This functionality is incorporated as additional weak-constraint terms in the cost function. For numerical efficiency and ease of interpretation, these additional cost terms are designed from the first-order derivatives of the hidden-layer neuron activations. The resulting descent term takes the form of a Hebbian learning rule, so the new algorithm combines two popular learning rules: backpropagation and Hebbian learning. In this paper we provide theoretical justification for the hybrid learning algorithm and demonstrate its good performance on iterated time-series prediction problems.
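To make the structure of the hybrid rule concrete, the following is a minimal sketch of one update step for a single-hidden-layer tanh network. It assumes a specific penalty, the squared first-order derivative of each hidden activation, 0.5*(1 - a^2)^2, which is one plausible reading of the abstract; the exact cost terms in the paper may differ. The function name hybrid_step and the hyperparameters lr and lam are illustrative, not from the source.

```python
import numpy as np

def hybrid_step(W1, b1, W2, b2, x, t, lr=0.01, lam=0.1):
    """One hybrid backprop/Hebbian update (illustrative sketch).

    Assumed cost: E = 0.5*||y - t||^2 + lam * 0.5 * sum(g'(h)^2),
    with h = W1 x + b1, a = tanh(h), g'(h) = 1 - a**2.
    Penalizing g'(h) drives hidden units toward saturation, which
    lowers the input-to-output sensitivity dy/dx = W2 diag(g'(h)) W1.
    """
    # forward pass
    h = W1 @ x + b1
    a = np.tanh(h)
    y = W2 @ a + b2

    # standard backpropagation deltas for the squared-error term
    delta2 = y - t
    delta1 = (W2.T @ delta2) * (1 - a**2)

    # gradient of the sensitivity penalty 0.5*(1 - a^2)^2 w.r.t. h:
    #   d/dh [0.5*(1 - a^2)^2] = -2*a*(1 - a^2)^2
    # so gradient descent ADDS a term proportional to a * x, i.e. a
    # Hebbian update (post-synaptic activation times pre-synaptic input)
    hebb = 2.0 * a * (1 - a**2)**2

    # combined updates: backprop descent plus the Hebbian-like term
    W2 -= lr * np.outer(delta2, a)
    b2 -= lr * delta2
    W1 -= lr * (np.outer(delta1, x) - lam * np.outer(hebb, x))
    b1 -= lr * (delta1 - lam * hebb)
    return W1, b1, W2, b2
```

Note how the penalty's descent direction factors into (post-synaptic activation) x (pre-synaptic input), which is why the extra term can be read as Hebbian; for iterated prediction, where the network's outputs are fed back as inputs, such saturation-encouraging penalties help keep small input perturbations from amplifying over iterations.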