Quasi-Monte Carlo algorithms for solving linear algebraic equations
In this article, the quasi-Monte Carlo (QMC) method is applied to the solution of linear algebraic equations. In particular, a finite-difference analogue of the five-dimensional Laplace equation is examined. The error distribution is studied for linear systems and for some high-dimensional integrals. A modification of the QMC method for linear systems is suggested that considerably reduces the constructive dimension of the algorithm.
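As a rough illustration of the underlying scheme (not the specific algorithm studied in the article), the sketch below estimates one component of the solution of x = Hx + f by truncated Neumann-series random walks whose transition choices are driven by the coordinates of a Halton point, so that the walk length plays the role of the constructive dimension that the article's modification aims to reduce. The matrix, parameters, and helper names here are assumptions made for the example only.

```python
# Minimal sketch: QMC estimate of one solution component of x = H x + f
# (spectral radius of H assumed < 1), using truncated Neumann-series walks
# driven by Halton points.  Illustrative only; not the article's algorithm.
import numpy as np

def van_der_corput(n, base):
    """n-th element of the van der Corput sequence in the given base."""
    q, denom = 0.0, 1.0
    while n > 0:
        denom *= base
        n, rem = divmod(n, base)
        q += rem / denom
    return q

def halton_point(n, bases):
    """n-th Halton point; one coordinate per walk step (constructive dimension)."""
    return np.array([van_der_corput(n, b) for b in bases])

def qmc_solve_component(H, f, i0, n_walks=4096, walk_len=8):
    """Estimate component i0 of the solution of x = H x + f."""
    absH = np.abs(H)
    row_sums = absH.sum(axis=1)
    primes = [2, 3, 5, 7, 11, 13, 17, 19, 23, 29][:walk_len]
    estimate = 0.0
    for n in range(1, n_walks + 1):
        u = halton_point(n, primes)          # one QMC coordinate per step
        state, weight, contrib = i0, 1.0, f[i0]
        for step in range(walk_len):
            if row_sums[state] == 0.0:       # absorbing row: walk ends
                break
            nz = np.nonzero(absH[state])[0]  # sample only reachable states
            p = absH[state, nz] / row_sums[state]
            k = int(np.searchsorted(np.cumsum(p), u[step]))
            k = min(k, len(nz) - 1)
            nxt = int(nz[k])
            weight *= H[state, nxt] / p[k]   # importance weight for H^m f term
            state = nxt
            contrib += weight * f[state]
        estimate += contrib
    return estimate / n_walks

if __name__ == "__main__":
    # Small contractive test system (assumed example, not from the article).
    H = np.array([[0.1, 0.2, 0.0],
                  [0.0, 0.1, 0.3],
                  [0.2, 0.0, 0.1]])
    f = np.array([1.0, 2.0, 3.0])
    exact = np.linalg.solve(np.eye(3) - H, f)
    approx = [qmc_solve_component(H, f, i) for i in range(3)]
    print("exact:", exact)
    print("qmc  :", np.round(approx, 3))
```

Each walk consumes one coordinate of the low-discrepancy point per step, so shortening the walk (or reusing coordinates, as in the kind of modification the abstract describes) directly lowers the dimension of the point set that has to be well distributed.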