An Incremental Learning Ensemble Strategy for Industrial Process Soft Sensors

With the continuous improvement of automation in industrial production, process data often arrives as a continuous stream. The ability to handle large amounts of data incrementally and efficiently is therefore indispensable for modern machine learning (ML) algorithms. Motivated by the characteristics of industrial production processes, we propose an incremental learning ensemble strategy (ILES) that extracts information efficiently from constantly incoming data. ILES aggregates multiple sub-learning machines with different weights to improve accuracy. When a new data set arrives, a new sub-learning machine is trained and added to the ensemble soft sensor model according to its weight, while the weights of the existing sub-learning machines are updated at the same time, yielding an updated ensemble soft sensor model. The weight-updating rules are designed according to the prediction accuracy of the sub-learning machines on the newly arrived data, so the update can track changes in the data and capture new information efficiently. A sizing-percentage soft sensor model is established to learn from production data of the industrial sizing process and to evaluate the performance of ILES, with the extreme learning machine (ELM) selected as the sub-learning machine. The new method is compared with a single ELM, AdaBoost.R ELM, and OS-ELM, and its extensions are further tested on three benchmark test functions. The experimental results demonstrate that the soft sensor model based on ILES achieves the best accuracy and online updating capability.
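
The sketch below illustrates the general idea described in the abstract: an ELM sub-learning machine is trained on each arriving data chunk, all members are re-weighted by their accuracy on the newest chunk, and predictions are the weighted combination of the members. The class names and the inverse-RMSE weighting are illustrative assumptions, not the paper's exact update rules.

```python
# Illustrative sketch only: the paper's actual weight-update rules are not
# reproduced here; inverse-RMSE weighting on the newest chunk is an assumption.
import numpy as np

class ELM:
    """Basic single-hidden-layer extreme learning machine for regression."""
    def __init__(self, n_hidden=50, rng=None):
        self.n_hidden = n_hidden
        self.rng = rng or np.random.default_rng(0)

    def fit(self, X, y):
        n_features = X.shape[1]
        # Random input weights and biases, fixed after initialization.
        self.W = self.rng.normal(size=(n_features, self.n_hidden))
        self.b = self.rng.normal(size=self.n_hidden)
        H = np.tanh(X @ self.W + self.b)
        # Output weights solved analytically by least squares.
        self.beta = np.linalg.pinv(H) @ y
        return self

    def predict(self, X):
        return np.tanh(X @ self.W + self.b) @ self.beta

class IncrementalEnsemble:
    """Weighted ensemble that grows by one ELM per arriving data chunk."""
    def __init__(self, n_hidden=50):
        self.n_hidden = n_hidden
        self.members, self.weights = [], []

    def update(self, X_new, y_new):
        # Train a new sub-learning machine on the incoming chunk.
        self.members.append(ELM(self.n_hidden).fit(X_new, y_new))
        # Re-weight all members by their accuracy on the new chunk
        # (inverse RMSE here; an assumed stand-in for the paper's rule).
        errors = [np.sqrt(np.mean((m.predict(X_new) - y_new) ** 2)) + 1e-12
                  for m in self.members]
        w = 1.0 / np.asarray(errors)
        self.weights = w / w.sum()

    def predict(self, X):
        preds = np.column_stack([m.predict(X) for m in self.members])
        return preds @ self.weights

# Usage: feed chunks as they arrive, then predict with the updated ensemble.
# ens = IncrementalEnsemble(n_hidden=30)
# for X_chunk, y_chunk in stream:
#     ens.update(X_chunk, y_chunk)
# y_hat = ens.predict(X_test)
```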
