A degeneracy in DRW modelling of AGN light curves

Individual light curves of active galactic nuclei (AGNs) are nowadays successfully modelled with the damped random walk (DRW) stochastic process, characterized by a power-exponential covariance matrix of the signal with the power $\beta=1$. Using Monte Carlo simulations, we generate mock AGN light curves described by non-DRW stochastic processes ($0.5\leq\beta\leq 1.5$ and $\beta\neq1$) and show that they can be well modelled as a single DRW process, with comparable goodness of fit. A good DRW fit therefore does not imply that DRW is the true process underlying the variability, and it cannot be used as proof of it. When comparing the input (non-DRW) and recovered (DRW) process parameters, the recovered time scale (amplitude) increases (decreases) with increasing input $\beta$. In practice, this means that the recovered DRW parameters may lead to biased (or even non-existent) correlations between the variability and physical parameters of AGNs if the true AGN variability is caused by non-DRW stochastic processes. The proper way to identify the process underlying the variability is through model-independent structure functions and/or power spectral densities, and then to use this information on the covariance matrix of the signal in light-curve modelling.
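The setup can be illustrated with a minimal sketch (the parameter values, sampling, and maximum-likelihood fit below are illustrative assumptions, not the actual simulation code of this work): a mock light curve is drawn from a Gaussian process with the power-exponential covariance $S_{ij}=\sigma^{2}\exp\left[-\left(|t_i-t_j|/\tau\right)^{\beta}\right]$ and $\beta\neq1$, and is then fitted with a DRW model ($\beta=1$).

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(42)

def powexp_cov(t, sigma, tau, beta):
    """Power-exponential covariance sigma^2 * exp(-(|dt|/tau)^beta); beta=1 is the DRW."""
    dt = np.abs(t[:, None] - t[None, :])
    return sigma**2 * np.exp(-(dt / tau)**beta)

# Mock non-DRW light curve (all numbers below are illustrative assumptions)
n, sigma_true, tau_true, beta_true = 200, 0.2, 100.0, 1.5   # mag, days
t = np.sort(rng.uniform(0.0, 2000.0, n))                    # irregular sampling epochs
err = 0.01                                                   # photometric uncertainty (mag)
cov_true = powexp_cov(t, sigma_true, tau_true, beta_true) + err**2 * np.eye(n)
y = rng.multivariate_normal(np.zeros(n), cov_true)

def neg_log_like(params):
    """Gaussian-process negative log-likelihood of a DRW fit (beta fixed at 1)."""
    sigma, tau = np.exp(params)
    C = powexp_cov(t, sigma, tau, 1.0) + err**2 * np.eye(n)
    _, logdet = np.linalg.slogdet(C)
    return 0.5 * (logdet + y @ np.linalg.solve(C, y) + n * np.log(2.0 * np.pi))

res = minimize(neg_log_like, x0=[np.log(0.1), np.log(50.0)], method="Nelder-Mead")
sigma_fit, tau_fit = np.exp(res.x)
print(f"input : beta={beta_true}, sigma={sigma_true} mag, tau={tau_true} d")
print(f"DRW fit: sigma={sigma_fit:.3f} mag, tau={tau_fit:.1f} d")
```

The printed comparison shows how a DRW fit maps a non-DRW input onto a single pair of $(\sigma,\tau)$ values, which is the parameter bias discussed above.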