Threshold self-adjustment improves convergence in feature-detecting neural nets

Abstract Two cellular mechanisms of memory produce short- and long-term changes at the synapse. The long-term mechanism, implemented as modification of connection weights, is widely used in neural-network-based cognitive models, whereas short-term synaptic adaptation is often overlooked. In this study it was implemented as an independent dynamic threshold in each unit. This mechanism was expected to improve the speed of convergence in feature-detecting winner-take-all (WTA) nets. The results showed that dynamic thresholds increased the speed of learning and made convergence possible for some sets of initial weights that did not converge without them.
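The mechanism described above can be sketched as competitive learning with a per-unit dynamic threshold. The following is a minimal illustrative sketch, not the authors' implementation: the update rule, function name, and learning-rate values are assumptions, and the threshold dynamics follow a conscience-style scheme (raise the winner's threshold, decay the rest) that matches the abstract's description only in spirit.

```python
import numpy as np

rng = np.random.default_rng(0)

def train_wta(X, n_units, epochs=50, lr=0.1, theta_rate=0.05):
    """Competitive (WTA) learning with per-unit dynamic thresholds.

    Long-term memory: connection weights W move toward winning inputs.
    Short-term memory: each unit's threshold rises when it wins and
    decays otherwise, biasing the competition away from recent winners.
    All names and rates are illustrative, not from the paper.
    """
    n_features = X.shape[1]
    W = rng.normal(scale=0.1, size=(n_units, n_features))
    theta = np.zeros(n_units)            # per-unit dynamic thresholds
    for _ in range(epochs):
        for x in X:
            act = W @ x - theta          # threshold handicaps frequent winners
            winner = int(np.argmax(act))
            # long-term change: pull the winner's weights toward the input
            W[winner] += lr * (x - W[winner])
            # short-term change: decay all thresholds, then raise the winner's
            theta *= (1.0 - theta_rate)
            theta[winner] += theta_rate
    return W, theta
```

In this sketch the thresholds prevent a single unit with favorable initial weights from winning every competition, which is one plausible reading of why dynamic thresholds helped convergence for initial-weight sets that otherwise failed.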