Complete and partial fault tolerance of feedforward neural nets

A method is proposed to estimate the fault tolerance (FT) of feedforward artificial neural nets (ANNs) and to synthesize robust nets. The fault model abstracts a variety of failure modes as permanent stuck-at faults. A procedure is developed to build FT ANNs by replicating the hidden units; it exploits the weighted summation intrinsically performed by the processing units to overcome faults. Metrics are devised to quantify FT as a function of redundancy, and a lower bound on the redundancy required to tolerate all possible single faults is analytically derived: less than triple modular redundancy (TMR) cannot provide complete FT for all possible single faults. The actual redundancy needed to synthesize a completely FT net is specific to the problem at hand and is usually much higher than this general lower bound, so the conventional TMR scheme of triplication and majority voting is the best way to achieve complete FT in most ANNs. Although the redundancy needed for complete FT is substantial, ANNs exhibit good partial FT to begin with and degrade gracefully; the first replication yields the largest enhancement in partial FT, with diminishing returns from successive replications. For large nets, exhaustive testing of all possible single faults is prohibitive, so a small, randomly chosen fraction of the links is tested instead. This yields partial FT estimates very close to those obtained by exhaustive testing, and when the fraction of links tested is held fixed, the accuracy of the random estimate improves as the net size grows.
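The replication idea above can be illustrated with a minimal sketch (this is not the paper's code). Assuming a two-layer tanh net, each hidden unit is copied R times and its outgoing weights are divided by R, so the weighted summation at the output recombines the copies into the original function; a single stuck-at-0 fault on one hidden-to-output link then corrupts only 1/R of that unit's contribution. The function names (`forward`, `replicate_hidden`) are illustrative:

```python
import numpy as np

def forward(x, W1, b1, W2, b2):
    """Two-layer net: tanh hidden layer, linear output layer."""
    h = np.tanh(W1 @ x + b1)
    return W2 @ h + b2

def replicate_hidden(W1, b1, W2, R):
    """Replicate each hidden unit R times; scale outgoing weights by 1/R
    so the output's weighted sum reproduces the original function."""
    W1r = np.repeat(W1, R, axis=0)       # each hidden unit duplicated R times
    b1r = np.repeat(b1, R)
    W2r = np.repeat(W2, R, axis=1) / R   # split each unit's contribution R ways
    return W1r, b1r, W2r

rng = np.random.default_rng(0)
W1 = rng.normal(size=(4, 3)); b1 = rng.normal(size=4)
W2 = rng.normal(size=(2, 4)); b2 = rng.normal(size=2)
x = rng.normal(size=3)

y = forward(x, W1, b1, W2, b2)
W1r, b1r, W2r = replicate_hidden(W1, b1, W2, R=3)
yr = forward(x, W1r, b1r, W2r, b2)
assert np.allclose(y, yr)  # replication leaves the fault-free output unchanged

# Inject a single permanent stuck-at-0 fault on one hidden->output link,
# in the replicated net and in the original net.
W2f = W2r.copy(); W2f[0, 0] = 0.0
yf = forward(x, W1r, b1r, W2f, b2)
W2o = W2.copy(); W2o[0, 0] = 0.0
yo = forward(x, W1, b1, W2o, b2)
# The output error under the fault is 1/R of the unreplicated net's error.
```

For large nets the same fault-injection loop would be run over a random sample of links rather than all of them, which is the random-testing strategy the abstract describes.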
