In recent years, researchers have extensively studied the application of recurrent neural networks (RNNs) to tasks that require the detection of long-term dependencies. This paper proposes an original architecture, termed the Recurrent Multiscale Network (RMN), to address this class of problems. Its key property is that it retains conventional RNNs' capability for storing information while reducing their typical drawback under gradient-descent training, namely the vanishing gradient effect. To this end, the RMN preprocesses the original signal with a suitable DSP tool that separates information at different temporal scales, handles each scale with an autonomous recurrent architecture, and produces the final output through a nonlinear reconstruction section. In time series prediction tasks involving long-range dependencies, this network has shown markedly better generalization performance than conventional RNNs.
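As a concrete illustration of this three-stage pipeline (multiscale decomposition, one recurrent network per scale, nonlinear reconstruction), the following is a minimal sketch in Python, not the authors' exact RMN. It assumes PyTorch for the per-scale RNNs and PyWavelets for the decomposition; a wavelet transform is one plausible choice for the abstract's "adequate DSP tool", and the class name RMNSketch, the db4 wavelet, and all hyperparameters are illustrative.

```python
# Sketch of the multiscale idea: decompose the series with a discrete
# wavelet transform, run one small RNN per scale, and fuse the per-scale
# summaries with a nonlinear readout. All names here are illustrative.
import numpy as np
import pywt
import torch
import torch.nn as nn

class RMNSketch(nn.Module):
    def __init__(self, n_scales, hidden=16):
        super().__init__()
        # one autonomous recurrent network per temporal scale
        self.rnns = nn.ModuleList(
            [nn.RNN(input_size=1, hidden_size=hidden, batch_first=True)
             for _ in range(n_scales)]
        )
        # nonlinear reconstruction section: fuse per-scale features
        self.readout = nn.Sequential(
            nn.Linear(n_scales * hidden, hidden), nn.Tanh(),
            nn.Linear(hidden, 1)
        )

    def forward(self, scale_signals):
        # scale_signals: list of (batch, T_i, 1) tensors, one per scale;
        # lengths T_i may differ because scales are processed independently
        feats = []
        for rnn, x in zip(self.rnns, scale_signals):
            _, h = rnn(x)        # h: (num_layers, batch, hidden)
            feats.append(h[-1])  # final hidden state summarizes the scale
        return self.readout(torch.cat(feats, dim=1))

# Toy usage: decompose a noisy sinusoid into 3 wavelet levels (db4),
# giving [cA3, cD3, cD2, cD1], then predict a single scalar.
series = np.sin(np.linspace(0, 20 * np.pi, 512)) + 0.1 * np.random.randn(512)
coeffs = pywt.wavedec(series, "db4", level=3)

model = RMNSketch(n_scales=len(coeffs))
inputs = [torch.tensor(c, dtype=torch.float32).view(1, -1, 1) for c in coeffs]
pred = model(inputs)
print(pred.shape)  # torch.Size([1, 1])
```

Because each recurrent subnetwork sees only one scale, the slowly varying (coarse) coefficients reach their RNN as a much shorter sequence, which is one intuition for why such a decomposition can mitigate the vanishing gradient effect on long-range structure.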