Quasi-interpolation and outlier removal

In this work, we present a method for removing outliers from a data set. We are given measurements of a function $f = g + e$ on a set of sample points $X \subset \mathbb{R}^{d}$, where $g \in C^{M+1}(\mathbb{R}^{d})$ is the function of interest and $e$ is the deviation from $g$. We say that a sample point $x \in X$ is an outlier if the difference $e(x) = f(x) - g(x)$ is large. We show that by analyzing the approximation errors on the sample set $X$, we can predict which of the sample points are outliers. Furthermore, we can identify outliers with very small deviations as well as ones with large deviations.
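
To make the residual-based idea concrete, here is a minimal sketch under assumptions of our own: a Shepard-type (normalized-kernel) quasi-interpolant stands in for the approximation operator, with a Gaussian weight of bandwidth `h`, evaluated in leave-one-out fashion so that a point's own value does not mask its residual, and with a median/MAD threshold of factor `k` for flagging. None of these choices (the operator, `h`, `k`, or the names `quasi_interpolant_loo` and `flag_outliers`) come from the paper; they only illustrate the detect-by-approximation-error principle.

```python
import numpy as np

def quasi_interpolant_loo(x, f, h):
    """Leave-one-out Shepard-type quasi-interpolant (illustrative stand-in).

    For each sample point x[i], approximate g(x[i]) from the *other*
    samples, so the residual at x[i] is not masked by f(x[i]) itself.
    The Gaussian weight and the bandwidth h are assumptions of this sketch.
    """
    d = np.abs(x[:, None] - x[None, :])    # pairwise distances (1-D case)
    w = np.exp(-(d / h) ** 2)              # Gaussian weights
    np.fill_diagonal(w, 0.0)               # leave-one-out
    return (w @ f) / w.sum(axis=1)

def flag_outliers(x, f, h, k=3.0):
    """Flag points whose residual exceeds k robust standard deviations.

    The median/MAD threshold rule is an assumption of this sketch; the
    paper's own criterion may differ.
    """
    r = f - quasi_interpolant_loo(x, f, h)  # approximation errors at X
    mad = np.median(np.abs(r - np.median(r)))
    scale = 1.4826 * mad                    # MAD -> std under normality
    return np.abs(r - np.median(r)) > k * scale

# Tiny demo: smooth g, small noise, two injected outliers.
rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0.0, 1.0, 200))
f = np.sin(2 * np.pi * x) + 0.01 * rng.standard_normal(x.size)
f[[40, 150]] += [0.5, -0.08]               # one large, one small deviation
print(np.flatnonzero(flag_outliers(x, f, h=0.02)))
```

Because the quasi-interpolant at a sample point is built only from neighboring samples, the residual there is close to $e(x)$ plus the operator's smoothing error; whenever that smoothing error is small relative to $|e(x)|$, even a small deviation becomes detectable, which is the behavior claimed above.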