Learning Data Representations with Sparse Coding Neural Gas

We consider the problem of learning an unknown (overcomplete) basis from data that are unknown sparse linear combinations of its elements. Introducing the "sparse coding neural gas" algorithm, we show how to combine the original neural gas algorithm with Oja's rule in order to learn a simple sparse code that represents each training sample by a multiple of a single basis vector. We then generalise this algorithm using orthogonal matching pursuit, so that each training sample is represented by a linear combination of k basis elements. We show that this method can be used to learn artificial sparse overcomplete codes.
