A self-creating network effective for learning vector quantization, called the Representation-burden Conservation Network (RCN), is developed. Each neuron in RCN is characterized by a measure of representation burden. Conservation is achieved by bounding the summed representation burden of all neurons at a constant value of 1 as the representation-burden values are updated after each input presentation. We show that RCN effectively fulfills the conscience principle [1] and achieves a biologically plausible self-development capability. In addition, conservation of representation burden facilitates the systematic derivation of learning parameters, including an adaptive learning-rate control that accelerates convergence and improves node utilization. Because its growth is smooth and incremental, RCN overcomes the stability-plasticity dilemma. Simulation results show that RCN outperforms other competitive learning networks in minimizing the quantization error.
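The abstract gives no equations, but the described mechanism can be illustrated with a minimal sketch. Everything below is an assumption for illustration: the conscience-style winner selection, the form of the burden-derived learning rate, and the burden increment are hypothetical choices consistent with the abstract's description (summed burden held at 1, over-used neurons penalized), not the paper's actual update rules.

```python
import numpy as np

def rcn_vq(data, n_codewords, epochs=10, seed=0):
    """Hypothetical sketch of burden-conserving vector quantization.

    The burden update, the conscience-weighted winner selection, and the
    burden-derived learning rate are illustrative assumptions; the paper's
    actual derivations are not reproduced here.
    """
    rng = np.random.default_rng(seed)
    # Initialize the codebook from random training samples.
    codebook = data[rng.choice(len(data), n_codewords, replace=False)].astype(float)
    # Each neuron starts with equal representation burden; the total is 1.
    burden = np.full(n_codewords, 1.0 / n_codewords)
    for _ in range(epochs):
        for x in data:
            d = np.linalg.norm(codebook - x, axis=1)
            # Conscience-style selection (assumed form): scale distances by
            # relative burden so over-used neurons win less often.
            winner = int(np.argmin(d * burden * n_codewords))
            # Burden-derived adaptive learning rate (assumed form): heavily
            # burdened neurons move less.
            lr = 1.0 - burden[winner]
            codebook[winner] += lr * (x - codebook[winner])
            # Shift burden toward the winner, then renormalize so the
            # summed burden is conserved at exactly 1.
            burden[winner] += 0.01
            burden /= burden.sum()
    return codebook, burden
```

After training, the burden vector still sums to 1, and codewords with higher burden correspond to more frequently winning neurons, which is the node-utilization signal the adaptive learning rate exploits.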
[1] T. Kohonen et al., Self-Organization and Associative Memory, 3rd ed., 1989.
[2] R. M. Gray et al., "An algorithm for vector quantizer design," IEEE Trans. Commun., 1980.
[3] Y. Matsuyama, "Harmonic competition: a self-organizing multiple criteria optimization," IEEE Trans. Neural Networks, 1996.
[4] S.-H. Park et al., "Self-creating and organizing neural networks," IEEE Trans. Neural Networks, 1994.
[5] S. C. Ahalt et al., "Codeword distribution for frequency sensitive competitive learning with one-dimensional input data," IEEE Trans. Neural Networks, 1996.
[6] S. C. Ahalt et al., "Competitive learning algorithms for vector quantization," Neural Networks, 1990.