The importance of input variables to a neural network fault-diagnostic system for nuclear power plants

This thesis explores safety enhancement for nuclear power plants. Emergency response systems currently in use depend mainly on automatic systems that engage when certain parameters exceed pre-specified safety limits. Often the operator has little or no opportunity to react, since a fast scram signal shuts down the reactor smoothly and efficiently. These accidents are of interest to technical support personnel, since examining the conditions that gave rise to them helps determine causality. In many other cases, an automated fault-diagnostic advisor would be a valuable tool for assisting technicians and operators in determining what happened and why.
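The title concerns ranking the importance of input variables to a neural-network fault-diagnostic system. As a minimal illustrative sketch (not the thesis's actual method), one common way to score input importance is perturbation sensitivity: nudge each input slightly and measure how much the network's output changes. All names, weights, and data below are invented for demonstration; a real diagnostic would use trained weights and recorded plant-status signals.

```python
import math
import random

random.seed(0)  # reproducible demo weights and data

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

class TinyNet:
    """One-hidden-layer network with random (untrained) weights, for illustration."""
    def __init__(self, n_in, n_hidden):
        self.w1 = [[random.gauss(0.0, 1.0) for _ in range(n_hidden)]
                   for _ in range(n_in)]
        self.w2 = [random.gauss(0.0, 1.0) for _ in range(n_hidden)]

    def forward(self, x):
        # Hidden activations, then a single sigmoid output in (0, 1).
        h = [sigmoid(sum(xi * self.w1[i][j] for i, xi in enumerate(x)))
             for j in range(len(self.w2))]
        return sigmoid(sum(hj * wj for hj, wj in zip(h, self.w2)))

def input_importance(net, samples, eps=1e-3):
    """Mean absolute output change per unit perturbation of each input variable."""
    n_in = len(samples[0])
    scores = []
    for i in range(n_in):
        total = 0.0
        for x in samples:
            base = net.forward(x)
            xp = list(x)
            xp[i] += eps  # perturb only input i
            total += abs(net.forward(xp) - base) / eps
        scores.append(total / len(samples))
    return scores

# Demo: 4 hypothetical plant-status inputs, 50 random sample vectors.
net = TinyNet(n_in=4, n_hidden=6)
samples = [[random.gauss(0.0, 1.0) for _ in range(4)] for _ in range(50)]
scores = input_importance(net, samples)
ranking = sorted(range(4), key=lambda i: scores[i], reverse=True)
print("importance scores:", scores)
print("inputs ranked most to least important:", ranking)
```

Inputs whose perturbation barely moves the output are candidates for removal, shrinking the network and the instrumentation it depends on; gradient-based or weight-pruning measures are common alternatives to this finite-difference approach.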