DyReg-FResNet: Unsupervised Feature Space Amplified Dynamic Regularized Residual Network for Time Series Classification

Time Series Classification (TSC) is a challenging problem owing to practical constraints: training examples are scarce, and each training instance contains few sample points. To build a robust model under these constraints, we propose DyReg-FResNet, a dynamically regularized Residual Network (ResNet) amplified by unsupervised feature space training. We generate signal processing, information theoretic, and statistical features to augment the representation learning of the ResNet. These unsupervised features capture the morphological and structural characteristics of the time series signals, while our proposed dynamic regularizer trades off by substantially reducing variance without perturbing the bias. The regularization factor is a function of the signal dynamics; in effect, it differs across training sets. DyReg-FResNet learns through residual mapping to mitigate exploding and vanishing gradients; the unsupervised features amplify the representation space by introducing lower-level representations that guide learning toward the basins of attraction of minima; and the dynamic regularizer minimizes the generalization error. We experiment extensively with publicly available UCR time series datasets. DyReg-FResNet consistently outperforms both the existing benchmark results and current state-of-the-art algorithms.
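As an illustration of the kind of unsupervised, hand-crafted features the abstract describes, the sketch below computes a few statistical and information-theoretic descriptors of a univariate series. This is a minimal, hypothetical example: the paper does not list its exact feature set, so the specific features (moments plus histogram entropy) and the function name are assumptions.

```python
import math
import statistics

def unsupervised_features(series, n_bins=8):
    """Illustrative statistical and information-theoretic features for a
    univariate time series (a sketch; not the paper's exact feature set)."""
    mean = statistics.fmean(series)
    std = statistics.pstdev(series)
    # Skewness: third standardized moment, a morphological/statistical cue.
    # Guard against a constant series (std == 0).
    skew = (sum((x - mean) ** 3 for x in series) / len(series)) / (std ** 3 or 1.0)
    # Shannon entropy of an amplitude histogram: an information-theoretic cue.
    lo, hi = min(series), max(series)
    width = (hi - lo) / n_bins or 1.0
    counts = [0] * n_bins
    for x in series:
        idx = min(int((x - lo) / width), n_bins - 1)
        counts[idx] += 1
    probs = [c / len(series) for c in counts if c]
    entropy = -sum(p * math.log2(p) for p in probs)
    return [mean, std, skew, entropy]
```

Such a feature vector would be concatenated with the ResNet's learned representation before the classification layer, augmenting the deep features with the lower-level signal characteristics the abstract refers to.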
