A comparison of some connectionist compression schemes

In this paper we consider connectionist compression schemes based on auto-associative networks. We demonstrate the advantages gained by imposing two different constraints on the allowed network weights, and compare the results with pruning of the unconstrained auto-associative network.
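The body of the paper is not reproduced here, so the two weight constraints themselves are not shown. As a minimal sketch of the unconstrained baseline only, an auto-associative network can be illustrated as a single-hidden-layer autoencoder trained to reproduce its input, with the bottleneck activations serving as the compressed code. All sizes, the learning rate, and the random data below are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed sizes: 8x8 image blocks (64 inputs) compressed to 16 hidden units.
n_in, n_hidden = 64, 16
lr = 0.01

# Random data standing in for image blocks (values in [0, 1]).
X = rng.random((1000, n_in))

# Encoder and decoder weights (biases omitted for brevity).
W1 = rng.normal(0, 0.1, (n_in, n_hidden))
W2 = rng.normal(0, 0.1, (n_hidden, n_in))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for epoch in range(200):
    H = sigmoid(X @ W1)   # hidden layer: the compressed code
    Y = sigmoid(H @ W2)   # reconstruction of the input
    err = Y - X
    # Backpropagation of the squared reconstruction error
    # through the two sigmoid layers.
    dY = err * Y * (1 - Y)
    dH = (dY @ W2.T) * H * (1 - H)
    W2 -= lr * H.T @ dY
    W1 -= lr * X.T @ dH

# Compression keeps only the hidden activations; decoding reconstructs them.
code = sigmoid(X @ W1)
recon = sigmoid(code @ W2)
print("mean squared reconstruction error:", np.mean((recon - X) ** 2))
```

Against this baseline, the schemes the abstract describes would restrict the values W1 and W2 are allowed to take, while the pruning comparison would instead remove hidden units from the unconstrained network.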
