Intelligent Alarm Handling

Most of the actions taken within today's power plants are directed by control systems, which are usually computerised and located in a central control room within the plant. In normal states, communication between the control system and the operators is satisfactory, with alarms occurring only infrequently. When large disturbances occur, however, communication becomes problematic: instead of being aided by the messages, the operators are swamped by the amount of information and often have to make more or less informed guesses about what causes the abnormal situation. It is therefore of great importance that the control system can discriminate between normal and abnormal situations, be less sensitive to insignificant messages, and give priority to alarms that must be sent to the operators. In order for the system to make such analyses, processes for diagnosis and for decision making regarding the reliability and importance of the information are needed. This paper shows how machine learning algorithms can be combined with decision theory with respect to vague and numerically imprecise background information, by using classifiers. An ensemble is a classifier created by combining the predictions of multiple component classifiers. We present a new method for combining classifiers into an ensemble based on a simple estimation of each classifier's competence. The purpose is to develop a filter for handling complex alarm situations. Decision situations are evaluated using fast algorithms developed particularly for solving these kinds of problems. The presented framework has been developed in co-operation with one of the main actors in the Swedish power plant industry.
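The competence-based combination described above can be illustrated with a minimal sketch. The paper does not specify its exact estimation procedure; the code below assumes, for illustration only, that each component classifier's competence is its accuracy on a held-out validation set, and that predictions are combined by competence-weighted voting. All names, thresholds, and labels here are hypothetical.

```python
# Sketch of a competence-weighted ensemble (illustrative assumptions only):
# competence = accuracy on a held-out validation set,
# combination = competence-weighted voting over predicted labels.
from collections import defaultdict

def competence(classifier, validation):
    """Fraction of validation examples the classifier labels correctly."""
    correct = sum(1 for x, y in validation if classifier(x) == y)
    return correct / len(validation)

def ensemble_predict(classifiers, weights, x):
    """Return the label with the largest competence-weighted vote."""
    votes = defaultdict(float)
    for clf, w in zip(classifiers, weights):
        votes[clf(x)] += w
    return max(votes, key=votes.get)

# Toy component classifiers over a numeric sensor reading,
# labelling it "normal" or "alarm" (labels chosen for illustration).
clf_a = lambda x: "alarm" if x > 5 else "normal"
clf_b = lambda x: "alarm" if x > 8 else "normal"
clf_c = lambda x: "normal"  # a deliberately weak classifier

validation = [(2, "normal"), (6, "alarm"), (9, "alarm"), (4, "normal")]
classifiers = [clf_a, clf_b, clf_c]
weights = [competence(c, validation) for c in classifiers]

print(weights)                                   # per-classifier competence
print(ensemble_predict(classifiers, weights, 7))  # combined prediction
```

A competence weight of zero would silence a classifier entirely, so in practice such a scheme would typically smooth or threshold the estimates; the sketch omits that refinement.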
