Two-stage parallel partial retraining scheme for defective multi-layer neural networks
We address a high-speed defect compensation method for multi-layer neural networks implemented in hardware. To compensate stuck-at defects of neurons and weights, we have proposed a partial retraining scheme that uses the backpropagation (BP) algorithm to adjust only the weights, between two layers, of the neurons affected by a stuck defect. Because defect compensation is realized with the learning circuits already present on the chip, the scheme saves chip area; because the number of weights to be adjusted is small, it also makes defect compensation fast. In this paper we propose a two-stage partial retraining scheme to compensate stuck defects of input units. Our simulation results show that the two-stage partial retraining scheme can be about 100 times faster than retraining the whole network with the BP algorithm.
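The paper itself gives no code, but the core idea of partial retraining can be illustrated concretely. The following is a minimal NumPy sketch, assuming a stuck-at-0 defect on one hidden unit that is compensated by BP-retraining only the weights of the layer following the defect. The network size, the XOR task, the learning rate, and the single-stage restriction are illustrative assumptions, not the paper's setup; the paper's two-stage scheme extends this idea to input-unit defects by retraining two successive layers.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Toy task: XOR with a 2-4-1 network (illustrative, not the paper's benchmark).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(0, 1, (2, 4))   # input -> hidden weights
W2 = rng.normal(0, 1, (4, 1))   # hidden -> output weights

def forward(X, W1, W2, stuck=None):
    H = sigmoid(X @ W1)
    if stuck is not None:            # simulate a stuck-at defect on a hidden unit
        idx, value = stuck
        H[:, idx] = value
    Y = sigmoid(H @ W2)
    return H, Y

# --- Baseline: whole-network BP training on the fault-free network. ---
eta = 0.5
for _ in range(5000):
    H, Y = forward(X, W1, W2)
    dY = (Y - T) * Y * (1 - Y)       # output-layer delta
    dH = (dY @ W2.T) * H * (1 - H)   # hidden-layer delta
    W2 -= eta * H.T @ dY
    W1 -= eta * X.T @ dH

# --- Inject a stuck-at-0 defect on hidden unit 0. ---
stuck = (0, 0.0)
_, Y_def = forward(X, W1, W2, stuck)
print("MSE after defect:", np.mean((Y_def - T) ** 2))

# --- Partial retraining: adjust only the weights (W2) of the layer
#     following the defective unit; W1 stays frozen. Far fewer weights
#     are updated than in whole-network retraining.
for _ in range(2000):
    H, Y = forward(X, W1, W2, stuck)
    dY = (Y - T) * Y * (1 - Y)
    W2 -= eta * H.T @ dY             # W1 is left untouched

_, Y_fix = forward(X, W1, W2, stuck)
print("MSE after partial retraining:", np.mean((Y_fix - T) ** 2))
```

Since only the fan-out layer's weights are updated rather than every weight in the network, each compensation step is proportionally cheaper, which is the source of the reported speedup over whole-network retraining.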