Context quantization and contextual self-organizing maps

Vector quantization consists of finding a discrete approximation of a continuous input. One of the most popular neural algorithms related to vector quantization is the so-called Kohonen map. We generalize vector quantization to temporal data, introducing context quantization. We propose a recurrent network inspired by the Kohonen map, the contextual self-organizing map, which develops near-optimal representations of context. We demonstrate quantitatively that this algorithm outperforms the other neural methods proposed so far.
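To make the vector quantization setting concrete, here is a minimal sketch of a classic one-dimensional Kohonen map quantizing scalar inputs. The network size, learning-rate schedule, and neighbourhood radius below are illustrative assumptions, not parameters taken from the paper, and the sketch covers only plain (non-contextual) quantization.

```python
import random

def train_som(data, n_units=8, epochs=50, lr0=0.5, radius0=2):
    """Train a 1-D Kohonen map on scalar inputs (illustrative sketch)."""
    units = [random.random() for _ in range(n_units)]
    for epoch in range(epochs):
        # Learning rate and neighbourhood radius decay over training,
        # as in the standard Kohonen procedure.
        lr = lr0 * (1 - epoch / epochs)
        radius = max(0, round(radius0 * (1 - epoch / epochs)))
        for x in data:
            # Best matching unit: the codebook value closest to the input.
            bmu = min(range(n_units), key=lambda i: abs(units[i] - x))
            # Pull the BMU and its lattice neighbours toward the input.
            lo, hi = max(0, bmu - radius), min(n_units, bmu + radius + 1)
            for i in range(lo, hi):
                units[i] += lr * (x - units[i])
    return units

def quantize(units, x):
    """Vector quantization: map x to its nearest codebook value."""
    return min(units, key=lambda u: abs(u - x))

random.seed(0)
data = [random.random() for _ in range(500)]
units = train_som(data)
```

After training, `quantize` replaces each continuous input with one of the discrete codebook values, which is the approximation the abstract refers to; the contextual self-organizing map additionally makes the unit selection depend on previously seen inputs.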