Correlated sequence learning in a network of spiking neurons using maximum likelihood

Hopfield networks are an idealised model of distributed computation among non-linear, stochastic units. We consider the learning of correlated temporal sequences via maximum likelihood and derive a simple Hebbian-like learning rule capable of robustly storing multiple sequences of correlated patterns. We argue that the learning rule is optimal in the limit of long temporal sequences and admits a natural stochastic interpretation.
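
To illustrate the general flavour of such a rule, the following is a minimal sketch, not the paper's exact derivation: it assumes Glauber-style stochastic units with sigmoidal transition probabilities and a toy correlated sequence, and performs gradient ascent on the log-likelihood of each next pattern given the current one, which yields a locally computable, Hebbian-like (post-synaptic error times pre-synaptic activity) weight update. All names and parameter values below are illustrative assumptions.

```python
# Hypothetical sketch: maximum-likelihood sequence learning in a network of
# binary (+/-1) stochastic neurons with P(s_i(t+1)=+1 | s(t)) = sigma(2*beta*h_i).
import numpy as np

rng = np.random.default_rng(0)

N = 100        # number of neurons (assumed)
T = 20         # sequence length (assumed)
beta = 1.0     # inverse temperature of the stochastic units
eta = 0.1      # learning rate
epochs = 200

# Toy correlated sequence: each pattern is a noisy copy of its predecessor,
# so consecutive patterns share most of their bits.
xi = np.empty((T, N))
xi[0] = rng.choice([-1.0, 1.0], size=N)
for t in range(1, T):
    flip = rng.random(N) < 0.1
    xi[t] = np.where(flip, -xi[t - 1], xi[t - 1])

W = np.zeros((N, N))

for _ in range(epochs):
    for t in range(T - 1):
        h = W @ xi[t]                # local fields given the current pattern
        m = np.tanh(beta * h)        # expected next state under the model
        # Gradient of the log-likelihood: (target - prediction) x pre-synaptic
        # activity, i.e. a Hebbian-like update driven by the prediction error.
        W += (eta / N) * np.outer(xi[t + 1] - m, xi[t])

# Recall check: drive the noise-free dynamics from the first pattern and
# measure the overlap with the stored sequence.
s = xi[0].copy()
overlaps = []
for t in range(1, T):
    s = np.sign(W @ s)
    overlaps.append(np.mean(s == xi[t]))
print("mean recall overlap:", np.mean(overlaps))
```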