A boosting method for process fault detection with detection delay reduction and label denoising

In this paper we propose a novel fault detection algorithm for process control and maintenance that builds an ensemble of classifiers using a modified AdaBoost technique. While seeking the best fault detection accuracy, our algorithm also concentrates on reducing detection delay, which ensures safety and timely equipment service. In addition, the new algorithm can simultaneously detect and remove class-label noise in process data. Training is performed by iteratively optimizing an exponential cost function. The cost function also changes adaptively at each iteration, such that (1) the importance of fault transition periods is increased to reduce detection delay and (2) noisy samples are removed from the training data. The algorithm was tested on a well-known benchmark problem, the Tennessee Eastman Process (TEP), and compared to a baseline AdaBoost ensemble fault detector that pays no specific attention to detection-delay minimization or noise removal.
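The two modifications to the boosting loop described above — upweighting fault-transition samples and discarding suspected noisy labels — can be sketched roughly as follows. This is a toy Python illustration under assumptions of our own, not the authors' exact algorithm: the `delay_boost` multiplier, the `noise_cut` weight threshold, and the decision-stump base learner are all hypothetical stand-ins.

```python
import numpy as np

def train_stump(X, y, w):
    """Fit a one-feature threshold classifier minimizing weighted error."""
    best = (np.inf, 0, 0.0, 1)  # (error, feature, threshold, polarity)
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            for pol in (1, -1):
                pred = np.where(pol * (X[:, j] - t) >= 0, 1, -1)
                err = w[pred != y].sum()
                if err < best[0]:
                    best = (err, j, t, pol)
    return best

def stump_predict(X, stump):
    j, t, pol = stump
    return np.where(pol * (X[:, j] - t) >= 0, 1, -1)

def boost(X, y, transition, n_rounds=10, delay_boost=2.0, noise_cut=0.5):
    """AdaBoost-style loop with two hypothetical modifications:
    samples flagged as fault-transition get their weights multiplied by
    `delay_boost`, and samples whose normalized weight ever exceeds
    `noise_cut` are deactivated as suspected label noise."""
    n = len(y)
    w = np.full(n, 1.0 / n)
    active = np.ones(n, dtype=bool)
    ensemble = []
    for _ in range(n_rounds):
        w[~active] = 0.0
        w = w / w.sum()
        err, j, t, pol = train_stump(X[active], y[active], w[active])
        err = min(max(err, 1e-10), 1 - 1e-10)
        if err >= 0.5:
            break
        alpha = 0.5 * np.log((1 - err) / err)
        ensemble.append((alpha, (j, t, pol)))
        # Standard exponential-loss weight update ...
        w = w * np.exp(-alpha * y * stump_predict(X, (j, t, pol)))
        # ... plus extra emphasis on fault-transition samples
        w[transition] *= delay_boost
        w = w / w.sum()
        # Deactivate samples whose weight grew suspiciously large
        active &= w < noise_cut
    return ensemble

def predict(X, ensemble):
    score = sum(alpha * stump_predict(X, s) for alpha, s in ensemble)
    return np.where(score >= 0, 1, -1)

# Toy run: one sensor, fault starts at sample 3 (the "transition" sample)
X = np.arange(6, dtype=float).reshape(-1, 1)
y = np.array([-1, -1, -1, 1, 1, 1])
transition = np.zeros(6, dtype=bool)
transition[3] = True
ensemble = boost(X, y, transition, n_rounds=5)
y_hat = predict(X, ensemble)
```

In this sketch the transition flag simply marks the samples immediately after a fault onset; driving their weight up forces subsequent base learners to classify the onset region correctly, which is one plausible way to trade overall accuracy for a shorter detection delay.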