Exploiting statistical dependencies of time series with hierarchical correlation reconstruction

While we usually focus on forecasting future values of time series, it is often valuable to additionally predict their entire probability distributions, e.g. to evaluate risk or to run Monte Carlo simulations. Using the example of a time series of $\approx$ 30000 Dow Jones Industrial Average values, we present an application of hierarchical correlation reconstruction for this purpose: MSE estimation of a polynomial as the joint density of (current value, context), where the context is, for example, a few previous values. Substituting the currently observed context and normalizing the density to integrate to 1, we obtain a predicted probability distribution for the current value. In contrast to standard machine learning approaches such as neural networks, the optimal polynomial coefficients here are given by an inexpensive direct formula, have controllable accuracy, are unique and calculated independently, and each has a specific cumulant-like interpretation; such an approximation can asymptotically approach a complete description of any real joint distribution, providing a universal tool to quantitatively describe and exploit statistical dependencies in time series, systematically enhancing ARMA/ARCH-like approaches, also with distributions other than Gaussian, which turns out to be improper for daily log returns. We also discuss applications to non-stationary time series, such as calculating a linear time trend of coefficients or adapting coefficients to the local statistical behavior.
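To make the described procedure concrete, below is a minimal numerical sketch (not taken from the paper) of this kind of conditional density estimation, assuming a degree-3 rescaled Legendre basis on [0,1], empirical-CDF normalization of the sample, and synthetic Student-t log returns standing in for the Dow Jones series; all names, parameters, and data are illustrative.

```python
import numpy as np

# Orthonormal (rescaled Legendre) polynomials on [0,1]; f_0 is the constant 1.
def f(j, x):
    if j == 0: return np.ones_like(x)
    if j == 1: return np.sqrt(3) * (2*x - 1)
    if j == 2: return np.sqrt(5) * (6*x**2 - 6*x + 1)
    if j == 3: return np.sqrt(7) * (20*x**3 - 30*x**2 + 12*x - 1)

def to_quantiles(series):
    """Empirical-CDF normalization of a 1D sample to ~uniform on [0,1]."""
    ranks = np.argsort(np.argsort(series))
    return (ranks + 0.5) / len(series)

# Hypothetical data: heavy-tailed log returns; context = one previous value (d = 2).
rng = np.random.default_rng(0)
returns = rng.standard_t(df=4, size=10000) * 0.01
u = to_quantiles(returns)
X = np.column_stack([u[1:], u[:-1]])   # (current, context) pairs in [0,1]^2

m = 4  # basis size per variable (degrees 0..3)
# MSE-optimal coefficients have a direct formula: each is a sample average
# a[j, k] = mean of f_j(current) * f_k(context), computed independently.
a = np.array([[np.mean(f(j, X[:, 0]) * f(k, X[:, 1])) for k in range(m)]
              for j in range(m)])

def conditional_density(x0, context):
    """Predicted density of the current value x0 given the observed context (both in [0,1])."""
    rho = sum(a[j, k] * f(j, x0) * f(k, context)
              for j in range(m) for k in range(m))
    # Normalize over x0: by orthonormality, the integral of every non-constant
    # term in x0 vanishes, so the normalizer is sum_k a[0, k] * f_k(context).
    norm = sum(a[0, k] * f(k, context) for k in range(m))
    return max(rho / norm, 0.0)  # polynomial densities can dip slightly below 0; clip

print(conditional_density(0.5, 0.9))
```

Each coefficient being an independent sample average is what makes the estimation inexpensive and the accuracy controllable: adding higher-degree basis functions refines the description without changing the coefficients already computed.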