All-mode renormalization for tensor network with stochastic noise

In conventional (non-stochastic) tensor network calculations, the truncated singular value decomposition (SVD) is often used to approximate a tensor, which causes systematic errors. By introducing stochastic noise into the approximation, however, one can avoid such systematic errors at the expense of statistical errors, which can be straightforwardly controlled. Therefore, in principle, exact results can be obtained even at finite bond dimension, up to the statistical errors. A previous study of the unbiased method implemented in the tensor renormalization group (TRG) algorithm, however, showed that the statistical errors in physical quantities are not negligible and, furthermore, that the computational cost is linearly proportional to the system volume. In this paper, we introduce a new way of injecting stochastic noise such that the statistical error is suppressed, and, in order to reduce the computational cost, we propose the common noise method, whose cost is proportional to the logarithm of the volume. We find that the method provides better accuracy for the free energy than the truncated SVD when applied to TRG for the Ising model on the square lattice. Although the common noise method introduces a systematic error originating from correlations among the noises, we show that this error is described by a simple functional form in terms of the number of noise vectors, so that it can be straightforwardly controlled in an actual analysis. We also apply the method to the graph-independent local truncation algorithm and show that the accuracy is further improved.
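As an illustrative sketch of the general principle (not the paper's actual implementation), the bias-versus-variance trade-off can be seen by replacing an exact identity insertion with an average of random Rademacher outer products, using E[ηηᵀ/N] = I: a contraction estimated with N noise vectors is then exact on average, with a purely statistical error, whereas a truncated SVD of the same rank carries a fixed systematic error. All names and the toy matrices below are hypothetical, chosen only to demonstrate the unbiasedness.

```python
import numpy as np

# Illustrative sketch only; the setup is a toy matrix trace, not TRG.
rng = np.random.default_rng(0)
d, N = 50, 8                      # matrix dimension d; N noise vectors
A = rng.normal(size=(d, d))
B = rng.normal(size=(d, d))
exact = np.trace(A @ B)

# Truncated SVD keeps only the largest chi modes -> systematic error.
chi = 8
U, s, Vt = np.linalg.svd(A)
A_trunc = (U[:, :chi] * s[:chi]) @ Vt[:chi, :]
svd_estimate = np.trace(A_trunc @ B)

# Stochastic insertion: A @ B = A @ I @ B, with I = E[eta @ eta.T / N]
# for Rademacher noise eta (i.i.d. +/-1 entries).
def noisy_estimate():
    eta = rng.choice([-1.0, 1.0], size=(d, N))
    projector = eta @ eta.T / N   # equals the identity on average
    return np.trace(A @ projector @ B)

samples = np.array([noisy_estimate() for _ in range(2000)])
# samples.mean() converges to `exact` (no systematic bias); the spread
# is the statistical error, controlled by N and the number of samples.
```

Averaging over samples recovers `exact` up to a statistical error that shrinks as the number of noise vectors or samples grows, while `svd_estimate` retains a fixed bias no matter how often it is recomputed.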