Time series classification using diversified Ensemble Deep Random Vector Functional Link and Resnet features

Abstract The Random Vector Functional Link (RVFL) network is popular in many areas of machine learning because it delivers good performance with relatively little training time. Recent works extend RVFL into deep and ensemble versions. However, RVFL lacks the feature extraction mechanisms commonly used in time series classification, which leads to poor performance on such tasks; deep RVFL is also a relatively new and evolving area of research. In this paper, we present a framework that extracts features from Residual Networks (Resnet) and trains Ensemble Deep Random Vector Functional Link (edRVFL) networks on them. We use the features extracted from every residual block to train an ensemble of edRVFLs, and we propose the following enhancements to edRVFL. Firstly, we diversify the structure of edRVFL and the direct-link features to encourage diversity among ensemble members. Secondly, we build an ensemble of edRVFLs using the two best-performing activation functions. Thirdly, we use two-stage tuning to save computational cost. Lastly, we compute a weighted average of the decisions made by every edRVFL. Experiments on the 55 largest UCR datasets show that using features extracted from all residual blocks improves performance, and each of the proposed enhancements improves either classification accuracy or computational effort. Consequently, the proposed framework outperforms the traditional and deep learning-based time series classification methods it is compared against.
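To make the edRVFL side of the framework concrete, the sketch below is a minimal NumPy implementation (not the authors' code) of an ensemble deep RVFL classifier trained on pre-extracted features: each depth uses random, untrained hidden weights, a closed-form ridge-regression output layer over the hidden activations concatenated with the direct-link input features, and the per-depth decisions are combined with a weighted average. The class name `EdRVFL`, the ReLU activation, and the default hyperparameters are illustrative assumptions rather than values taken from the paper.

```python
# Minimal edRVFL sketch, assuming pre-extracted feature matrices (NumPy only).
import numpy as np

def one_hot(y, n_classes):
    Y = np.zeros((len(y), n_classes))
    Y[np.arange(len(y)), y] = 1.0
    return Y

class EdRVFL:
    """Ensemble deep RVFL: stacked random hidden layers, one ridge-regression
    output layer per depth, decisions averaged (optionally weighted) over depths."""
    def __init__(self, n_layers=4, n_hidden=256, reg=1e-2, seed=0):
        self.n_layers, self.n_hidden, self.reg = n_layers, n_hidden, reg
        self.rng = np.random.default_rng(seed)

    def fit(self, X, y):
        n_classes = int(y.max()) + 1
        Y = one_hot(y, n_classes)
        self.W_rand, self.betas = [], []
        H = X  # input to the current hidden layer
        for _ in range(self.n_layers):
            # Random, untrained hidden weights (the RVFL idea); +1 column for bias.
            W = self.rng.standard_normal((H.shape[1] + 1, self.n_hidden))
            A = np.maximum(np.hstack([H, np.ones((H.shape[0], 1))]) @ W, 0.0)  # ReLU
            D = np.hstack([A, X])  # direct link: concatenate the raw input features
            # Closed-form ridge regression for this depth's output weights.
            beta = np.linalg.solve(D.T @ D + self.reg * np.eye(D.shape[1]), D.T @ Y)
            self.W_rand.append(W)
            self.betas.append(beta)
            H = A  # hidden activations feed the next layer
        return self

    def predict_proba(self, X, layer_weights=None):
        scores, H = [], X
        for W, beta in zip(self.W_rand, self.betas):
            A = np.maximum(np.hstack([H, np.ones((H.shape[0], 1))]) @ W, 0.0)
            scores.append(np.hstack([A, X]) @ beta)
            H = A
        w = np.ones(len(scores)) if layer_weights is None else np.asarray(layer_weights, float)
        # Weighted average of the per-depth decision scores.
        return np.tensordot(w / w.sum(), np.stack(scores), axes=1)

    def predict(self, X, layer_weights=None):
        return self.predict_proba(X, layer_weights).argmax(axis=1)
```

In the full framework, one such model would be trained on the features taken after each residual block of the Resnet, and the per-block predictions would in turn be combined with a weighted average.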
