An Efficient Approximation for Gaussian Process Regression

Gaussian processes are a powerful tool for regression problems. Besides computing regression curves from predictive mean values, the uncertainty of the estimates can be quantified in terms of predictive variances. However, the complexity of learning and testing the model is often too high for practical use. We present an efficient approximation of the Gaussian process regression framework that reduces both runtime and memory complexity. The idea is to approximate the Gaussian process predictive mean and variance using a special diagonal matrix in place of the full kernel matrix. We show that this simple diagonal matrix approximation of the Gaussian process predictive variance is a true upper bound on the exact variance. Experimental results are presented for a standard regression task.
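The idea can be illustrated with a small sketch. The exact GP predictor uses the regularized kernel matrix K + sigma^2*I, at O(n^3) cost; the sketch below replaces it with a diagonal matrix D. The abstract does not specify which diagonal matrix the paper uses, so as an illustrative assumption we take the Gershgorin-style choice d_i = sum_j |K_ij|, which guarantees D ⪰ K + sigma^2*I in the positive-semidefinite order and hence makes the approximate variance an upper bound on the exact one; the kernel, length scale, and noise level are likewise placeholders.

```python
import numpy as np

def rbf_kernel(A, B, length_scale=1.0):
    """Squared-exponential kernel matrix between row-vector sets A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / length_scale ** 2)

def gp_predict_exact(X, y, x_star, noise=0.1):
    """Exact GP predictive mean and variance at one test point (O(n^3))."""
    K = rbf_kernel(X, X) + noise * np.eye(len(X))
    k_star = rbf_kernel(X, x_star[None, :])[:, 0]
    mean = k_star @ np.linalg.solve(K, y)
    var = 1.0 - k_star @ np.linalg.solve(K, k_star)  # k(x*,x*) = 1 for RBF
    return mean, var

def gp_predict_diag(X, y, x_star, noise=0.1):
    """Diagonal approximation (illustrative choice, not the paper's exact D):
    d_i = sum_j |K_ij| makes D - K diagonally dominant, hence PSD, so
    D >= K in the PSD order and the approximate variance upper-bounds
    the exact one. Prediction costs O(n) once the diagonal is formed."""
    K = rbf_kernel(X, X) + noise * np.eye(len(X))
    d = np.abs(K).sum(axis=1)
    k_star = rbf_kernel(X, x_star[None, :])[:, 0]
    mean = k_star @ (y / d)
    var = 1.0 - k_star @ (k_star / d)
    return mean, var
```

Because D ⪰ K + sigma^2*I implies D^{-1} ⪯ (K + sigma^2*I)^{-1}, the subtracted term shrinks and the approximate variance can only overestimate the exact predictive uncertainty, which is the conservative direction for confidence estimates.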