Optimally generalizing neural networks with the ability to recover from single stuck-at r faults
Handling faults is an important issue for hardware implementation of hierarchical neural networks. This paper presents a method for completely recovering the prefault input/output relations by modifying only the normal connection weights when a single stuck-at r fault occurs. We derive the necessary and sufficient conditions on the number of intermediate elements and on the intermediate element function for complete restoration of prefault functionality. We then develop a concrete method for constructing an intermediate element function that satisfies these conditions, and a method for modifying the connection weights when a fault occurs. © 2002 Wiley Periodicals, Inc. Syst Comp Jpn, 33(7): 114–123, 2002; Published online in Wiley InterScience (www.interscience.wiley.com). DOI 10.1002/scj.1147
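The idea of complete recovery can be illustrated with a small numerical sketch. The abstract does not give the paper's construction, so the example below makes an assumption in its place: a hidden (intermediate) unit is duplicated so that, when one copy sticks at the constant value r, a least-squares readjustment of the output-side weights alone reproduces the prefault input/output relation exactly. This is a generic illustration of the recovery goal, not the paper's method for building intermediate element functions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Small feedforward net: 2 inputs -> 4 tanh hidden units -> 1 output.
# Redundancy assumption: unit 3 is an exact duplicate of unit 2.
X = rng.normal(size=(200, 2))
W1 = rng.normal(size=(2, 4))
b1 = rng.normal(size=4)
W1[:, 3], b1[3] = W1[:, 2], b1[2]   # duplicated intermediate element
w2 = rng.normal(size=4)             # output-side connection weights

H = np.tanh(X @ W1 + b1)            # healthy hidden activations
y_ref = H @ w2                      # prefault input/output relation

# Single stuck-at r fault: hidden unit 2 outputs the constant r
# regardless of the input.
r, k = 0.7, 2
H_fault = H.copy()
H_fault[:, k] = r

# Recovery by modifying only the normal connection weights: fit the
# remaining (and faulty-constant) hidden columns to the prefault outputs.
# Because unit 3 spans the lost unit's function, the fit is exact.
w2_new, *_ = np.linalg.lstsq(H_fault, y_ref, rcond=None)
residual = np.max(np.abs(H_fault @ w2_new - y_ref))
```

With the duplicated unit, the recovered weights shift the stuck unit's contribution onto its healthy copy, so `residual` is at numerical zero; without such redundancy, only an approximate recovery would be possible, which is why conditions on the number of intermediate elements matter.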