MUSE-RNN: A Multilayer Self-Evolving Recurrent Neural Network for Data Stream Classification

In this paper, we propose MUSE-RNN, a multilayer self-evolving recurrent neural network model for real-time classification of streaming data. Unlike existing approaches, MUSE-RNN explicitly captures the temporal aspects of the data stream through a novel recurrent learning approach based on the teacher forcing policy. The novelty is twofold. First, in contrast to traditional RNN models, MUSE-RNN has the intrinsic ability to self-adjust its capacity by growing and pruning hidden nodes as well as layers, allowing it to handle the ever-changing characteristics of the data stream. Second, MUSE-RNN adopts a unique scoring-based layer adaptation mechanism, which enables it to recall prior tasks with minimal use of network parameters. The performance of MUSE-RNN is evaluated against a number of state-of-the-art techniques on seven popular data streams and continual learning problems under the prequential test-then-train protocol. Experimental results demonstrate the effectiveness of MUSE-RNN in stream classification scenarios.
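To make the self-evolving idea concrete, the following is a minimal sketch of a hidden layer that grows and prunes nodes in response to a streaming error signal. The growth and pruning rules shown here (error-threshold growth, low-activation pruning), the thresholds, and the class name are hypothetical illustrations, not the specific criteria derived for MUSE-RNN.

```python
import numpy as np


class EvolvingLayer:
    """Toy hidden layer whose width adapts while the stream is processed."""

    def __init__(self, n_in: int, n_hidden: int = 4, seed: int = 0):
        self.rng = np.random.default_rng(seed)
        self.W = self.rng.normal(scale=0.1, size=(n_hidden, n_in))

    def forward(self, x: np.ndarray) -> np.ndarray:
        # Plain tanh layer; the recurrent connections are omitted to keep
        # the sketch short.
        return np.tanh(self.W @ x)

    def maybe_grow(self, running_error: float, grow_threshold: float = 0.5) -> None:
        # Assumed rule: add one hidden node when the running error is high,
        # i.e. the current capacity no longer fits the incoming concept.
        if running_error > grow_threshold:
            new_row = self.rng.normal(scale=0.1, size=(1, self.W.shape[1]))
            self.W = np.vstack([self.W, new_row])

    def maybe_prune(self, x: np.ndarray, prune_threshold: float = 1e-2) -> None:
        # Assumed rule: drop nodes whose activation magnitude is negligible
        # for the current input, keeping at least one node in the layer.
        contribution = np.abs(self.forward(x))
        keep = contribution > prune_threshold
        if keep.any():
            self.W = self.W[keep]


# Example: grow on a high-error step, then prune near-silent nodes.
layer = EvolvingLayer(n_in=3)
layer.maybe_grow(running_error=0.8)            # width: 4 -> 5
layer.maybe_prune(np.array([0.2, -0.1, 0.4]))  # remove negligible nodes
```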

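The evaluation follows the prequential test-then-train protocol, in which every arriving sample is first used to test the current model and only afterwards used to update it. Below is a minimal sketch of that loop; the classifier interface (predict / partial_fit) and the stream source are hypothetical placeholders rather than the MUSE-RNN API, and the model is assumed to be able to predict from its initial state.

```python
from typing import Iterable, Tuple
import numpy as np


def prequential_accuracy(model, stream: Iterable[Tuple[np.ndarray, int]]) -> float:
    """Run the test-then-train loop and return the overall prequential accuracy."""
    correct, seen = 0, 0
    for x, y in stream:
        # 1) Test: predict before the true label is revealed.
        y_pred = model.predict(x.reshape(1, -1))[0]
        correct += int(y_pred == y)
        seen += 1
        # 2) Train: update the model with the now-revealed ground truth.
        model.partial_fit(x.reshape(1, -1), np.array([y]))
    return correct / seen if seen else 0.0
```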