A Comparison of ARIMA and LSTM in Forecasting Time Series
Abstract: Forecasting time series data is an important subject in economics, business, and finance. Traditionally, several techniques are used to forecast the next lag of a time series, such as the univariate autoregressive (AR) model, the univariate moving average (MA) model, simple exponential smoothing (SES), and, most notably, the autoregressive integrated moving average (ARIMA) model with its many variations. In particular, the ARIMA model has demonstrated strong precision and accuracy in predicting the next lags of a time series. With the recent advances in computational power and, more importantly, the development of more sophisticated machine learning approaches such as deep learning, new algorithms have been developed to analyze and forecast time series data. The research question investigated in this article is whether, and to what extent, newly developed deep learning-based algorithms for forecasting time series data, such as Long Short-Term Memory (LSTM), are superior to the traditional algorithms. The empirical studies conducted and reported in this article show that deep learning-based algorithms such as LSTM outperform traditional algorithms such as the ARIMA model. More specifically, LSTM reduced the average error rate by 84 to 87 percent compared with ARIMA, indicating the superiority of LSTM over ARIMA. Furthermore, the number of training passes over the data, known as "epochs" in deep learning, had no observable effect on the performance of the trained forecast model and exhibited truly random behavior.
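For readers who want to reproduce the kind of comparison the abstract describes, the following is a minimal Python sketch (not the authors' original code): it fits an ARIMA model and an LSTM network to the same univariate series and compares one-step-ahead forecast errors by RMSE. The synthetic random-walk series, the ARIMA order (5, 1, 0), the lag window of 5, and the LSTM layout are illustrative assumptions, not values taken from the paper.

# Minimal ARIMA-vs-LSTM comparison sketch (illustrative assumptions throughout)
import numpy as np
from statsmodels.tsa.arima.model import ARIMA
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense

rng = np.random.default_rng(0)
series = np.cumsum(rng.normal(size=300))          # synthetic random-walk series
train, test = series[:250], series[250:]

# ARIMA: rolling one-step-ahead forecasts, refitting as each observation arrives
history = list(train)
arima_preds = []
for obs in test:
    fit = ARIMA(history, order=(5, 1, 0)).fit()
    arima_preds.append(fit.forecast(steps=1)[0])
    history.append(obs)

# LSTM: frame the series as supervised learning with a sliding window of lags
def make_windows(data, lag):
    X, y = [], []
    for i in range(lag, len(data)):
        X.append(data[i - lag:i])
        y.append(data[i])
    return np.array(X)[..., None], np.array(y)    # shape (samples, lag, 1 feature)

lag = 5
X_train, y_train = make_windows(train, lag)
X_test, y_test = make_windows(series[250 - lag:], lag)   # targets align with the test set

model = Sequential([LSTM(32, input_shape=(lag, 1)), Dense(1)])
model.compile(optimizer="adam", loss="mse")
model.fit(X_train, y_train, epochs=20, verbose=0)
lstm_preds = model.predict(X_test, verbose=0).ravel()

def rmse(pred, true):
    return float(np.sqrt(np.mean((np.asarray(pred) - np.asarray(true)) ** 2)))

print("ARIMA RMSE:", rmse(arima_preds, test))
print("LSTM  RMSE:", rmse(lstm_preds, y_test))

On real (non-stationary financial) data, the relative RMSE of the two models is what the paper's 84 to 87 percent error-reduction figure summarizes; the sketch above only shows the mechanics of producing such a comparison.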