Recurrent Neural Networks Applied to GNSS Time Series for Denoising and Prediction

Global Navigation Satellite Systems (GNSS) continuously acquire data and provide position time series. Many monitoring applications are based on GNSS data, and their effectiveness depends on the ability of the time series analysis to characterize the signal content and/or to predict incoming coordinates. In this work we propose a suitable network architecture, based on Long Short-Term Memory (LSTM) recurrent neural networks, to solve two main tasks in GNSS time series analysis: denoising and prediction. We first carry out an analysis on a synthetic time series, then we examine two different real case studies and evaluate the results. We develop a non-deep network that removes almost 50% of the scatter from real GNSS time series and achieves coordinate prediction with a mean squared error of 1.1 millimeters.

2012 ACM Subject Classification: General and reference → General conference proceedings; Mathematics of computing → Time series analysis; Computing methodologies → Supervised learning by regression; Information systems → Global positioning systems
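To make the setup concrete, below is a minimal sketch (not the authors' code) of a non-deep LSTM network applied to a 1-D GNSS coordinate series for denoising or one-step prediction. The window length, layer size, and synthetic data are assumptions chosen only for illustration; the abstract does not specify them.

```python
# Illustrative sketch: a small, non-deep LSTM model for denoising or
# one-step prediction of a 1-D GNSS coordinate series.
# Layer size, window length, and the synthetic data are assumptions.
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense

WINDOW = 30  # number of past epochs fed to the network (assumed)

def make_windows(series, window=WINDOW):
    """Slice a 1-D series into (samples, window, 1) inputs and next-value targets."""
    X = np.stack([series[i:i + window] for i in range(len(series) - window)])
    y = series[window:]
    return X[..., np.newaxis], y

# Synthetic daily coordinate series: trend + annual signal + white noise (mm).
t = np.arange(2000)
clean = 0.01 * t + 3.0 * np.sin(2 * np.pi * t / 365.25)
noisy = clean + np.random.normal(scale=2.0, size=t.size)

X, y = make_windows(noisy)
y_clean = clean[WINDOW:]  # for denoising, targets are the clean signal

# Non-deep network: one LSTM layer followed by a linear output unit.
model = Sequential([
    LSTM(32, input_shape=(WINDOW, 1)),
    Dense(1),
])
model.compile(optimizer="adam", loss="mse")

# Train for denoising (targets = clean signal); swap in y instead of y_clean
# to train the same architecture as a one-step-ahead predictor.
model.fit(X, y_clean, epochs=20, batch_size=64, validation_split=0.1, verbose=0)

pred = model.predict(X, verbose=0).ravel()
print("denoised MSE (mm^2):", float(np.mean((pred - y_clean) ** 2)))
```

The sliding-window formulation and the single LSTM layer reflect the "non-deep" framing of the abstract; with real GNSS data the clean target is unknown, so the denoising variant would have to be trained on synthetic or filtered series instead.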
