Expanding Gaussian kernels for multivariate conditional density estimation

We propose a new method to estimate the multivariate conditional density f(m|x), a density over the output space m conditioned on any given input x. In particular, we are interested in cases where the available training data are relatively sparse in the input space x. We start from a priori considerations and establish desirable characteristics for kernel functions used in conditional density estimation. We find that Gaussian kernels with expanding covariances, covariances that grow as we move away from the kernel's data point, satisfy these a priori considerations. We combine these expanding Gaussian kernels (EGK) using Bayesian techniques. We compare the EGK with standard Gaussian kernel (SGK) methods, and find that the EGK avoids spurious multimodality, assigns diminishing confidence farther from the training points, performs better asymptotically, and performs better with respect to the Kullback-Leibler criterion.
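The core idea can be illustrated with a minimal sketch. The code below estimates f(m|x) as a mixture of one Gaussian per training pair (x_i, m_i), where each kernel's bandwidth expands with the distance from the query x to that kernel's data point x_i. The linear expansion rule `h_i = base_h * (1 + alpha * |x - x_i|)` and the parameter names are illustrative assumptions, not the paper's exact construction:

```python
import numpy as np

def egk_conditional_density(x_query, m_grid, X, M, base_h=0.3, alpha=0.5):
    """Sketch of a conditional density f(m|x) built from expanding Gaussian kernels.

    Each kernel i is centered at (X[i], M[i]). Its bandwidth h_i grows with the
    distance |x_query - X[i]|, so kernels far from the query contribute broad,
    low-confidence mass. The linear growth rule below is an assumed stand-in
    for the paper's expansion scheme.
    """
    dist = np.abs(x_query - X)                 # distance to each kernel's data point
    h = base_h * (1.0 + alpha * dist)          # expanding bandwidths
    w = np.exp(-0.5 * (dist / h) ** 2) / h     # Gaussian weight of each kernel in x
    w = w / w.sum()                            # normalize mixture weights
    # mixture of 1-D Gaussians over m, one component per training point
    dens = np.zeros_like(m_grid, dtype=float)
    for wi, mi, hi in zip(w, M, h):
        dens += wi * np.exp(-0.5 * ((m_grid - mi) / hi) ** 2) / (hi * np.sqrt(2 * np.pi))
    return dens
```

Because the mixture weights sum to one and each component is a normalized Gaussian, the result integrates to one over m; far from the training points the bandwidths grow, so the density flattens rather than developing the spurious modes a fixed-bandwidth estimator can produce.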