Empirical Models with Self-Assessment Capabilities for On-Line Industrial Applications

Self-assessment capabilities are critical for the longevity of on-line empirical models in industrial settings. The paper proposes a generic structure for an on-line model supervisor consisting of a within-the-range indicator, a confidence-of-prediction estimate, a performance indicator, a novelty/outlier detector, and a model fault detector. Several methods for calculating confidence limits are explored: ensembles of analytic neural networks and of symbolic regression models generated by genetic programming, linearized models based on transforms derived by genetic programming, and a strangeness measure based on support vector machines for regression. Their performance is compared in a case study of an on-line emission estimation model. Selected self-assessment capabilities for detecting unacceptable on-line performance as well as model and process faults are illustrated with industrial applications in the chemical industry.
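
To make the supervisor structure concrete, the following is a minimal Python sketch of two of the described components: a within-the-range indicator based on the training-data bounds and a confidence estimate taken from the spread of an ensemble of predictors. The class and method names are illustrative assumptions, not from the paper, and the ensemble members are assumed to expose a scikit-learn-style predict method.

```python
import numpy as np

class OnlineModelSupervisor:
    """Illustrative sketch of an on-line model supervisor with a
    within-the-range indicator and an ensemble-spread confidence estimate.
    Names and interfaces are hypothetical, not from the paper."""

    def __init__(self, ensemble, x_train):
        self.ensemble = ensemble              # list of fitted predictors (e.g., GP symbolic models)
        self.x_min = x_train.min(axis=0)      # per-input lower bound of the training range
        self.x_max = x_train.max(axis=0)      # per-input upper bound of the training range

    def within_range(self, x):
        # Within-the-range indicator: flag inputs outside the training domain.
        return bool(np.all((x >= self.x_min) & (x <= self.x_max)))

    def predict_with_confidence(self, x):
        # Confidence of prediction from the disagreement among ensemble members:
        # return the ensemble mean and the standard deviation of the predictions.
        preds = np.array([m.predict(x.reshape(1, -1))[0] for m in self.ensemble])
        return preds.mean(), preds.std()
```

In on-line use, a new sample would first be checked with within_range; if it falls inside the training domain, predict_with_confidence returns the estimate together with an uncertainty band whose width can trigger the performance indicator or fault detector when it grows beyond an application-specific threshold.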
