The CHIR Algorithm for Feed Forward Networks with Binary Weights
A new learning algorithm, Learning by Choice of Internal Representations (CHIR), was recently introduced. Whereas many algorithms reduce the learning process to minimizing a cost function over the weights, our method treats the internal representations as the fundamental entities to be determined. The algorithm combines a search procedure in the space of internal representations with a cooperative adaptation of the weights (e.g. by using the perceptron learning rule). Since the introduction of its basic, single-output version, the CHIR algorithm has been generalized to train any feed forward network of binary neurons. Here we present the generalized version of the CHIR algorithm, and further demonstrate its versatility by describing how it can be modified to train networks with binary (±1) weights. Preliminary tests of this binary version on the random teacher problem are also reported.
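The abstract describes an alternation between choosing internal representations (the hidden-layer activities for each training pattern) and fitting the weights of each layer with a perceptron-style rule. The sketch below is a minimal, hypothetical illustration of such a CHIR-style loop, not the paper's exact procedure: the greedy hidden-bit-flip search, the clipped integer weights used to emulate a perceptron rule with ±1 weights, and all function names (`train_chir`, `perceptron_fit_binary`) are assumptions made for illustration.

```python
# A minimal CHIR-style sketch (assumed, not the authors' exact update rules).
# Network: +/-1 inputs, one hidden layer of binary neurons, one output, +/-1 weights.
import numpy as np

rng = np.random.default_rng(0)

def sign(x):
    return np.where(x >= 0, 1, -1)

def perceptron_fit_binary(X, targets, n_units, epochs=50):
    """Fit a layer of +/-1 weights so that sign(X @ W) approximates `targets`.
    An integer 'internal' weight is updated perceptron-style and clipped to +/-1
    for the forward pass (one common way to handle binary weights; an assumption)."""
    n_in = X.shape[1]
    W_int = rng.integers(-3, 4, size=(n_in, n_units))
    for _ in range(epochs):
        W = sign(W_int)                      # binary weights actually used
        err = targets - sign(X @ W)          # 0 where correct, +/-2 where wrong
        W_int += (X.T @ err) // 2            # perceptron-style correction
    return sign(W_int)

def train_chir(X, y, n_hidden=4, outer_iters=20):
    """Alternate between (a) fitting both weight layers to the current internal
    representations and (b) searching for better internal representations."""
    P = X.shape[0]
    H = sign(rng.integers(0, 2, size=(P, n_hidden)) * 2 - 1)   # initial internal reps
    for _ in range(outer_iters):
        W1 = perceptron_fit_binary(X, H, n_hidden)             # input -> hidden
        W2 = perceptron_fit_binary(H, y[:, None], 1)           # hidden -> output
        H_actual = sign(X @ W1)
        out = sign(H_actual @ W2)[:, 0]
        if np.all(out == y):
            break
        # Search step: for each misclassified pattern, flip the hidden bit whose
        # change moves the output field toward the target (a greedy stand-in for
        # the representation search described in the paper).
        H = H_actual.copy()
        for p in np.where(out != y)[0]:
            gain = -y[p] * H[p] * W2[:, 0]
            H[p, np.argmax(gain)] *= -1
    return W1, W2

# Toy usage: parity (XOR) of two +/-1 inputs.
X = np.array([[-1, -1], [-1, 1], [1, -1], [1, 1]])
y = np.array([-1, 1, 1, -1])
W1, W2 = train_chir(X, y)
print(sign(sign(X @ W1) @ W2)[:, 0], "target:", y)
```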