Degree of Approximation by Neural and Translation Networks with a Single Hidden Layer

Let s ≥ d ≥ 1 be integers and 1 ≤ p < ∞. We investigate the degree of approximation of 2π-periodic functions in L^p[−π, π]^s (resp. C[−π, π]^s) by finite linear combinations of translates and (matrix) dilates of a 2π-periodic function in L^p[−π, π]^d (resp. C[−π, π]^d). Applications to the theory of neural networks and to radial basis approximation of functions that are not necessarily periodic are also discussed. In particular, we estimate the order of approximation by radial basis functions in terms of the number of translates involved in the approximating function.
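As an illustrative sketch of the kind of approximant studied here, a 2π-periodic target can be fit by a finite linear combination of translates of a single periodic kernel via least squares. The Gaussian-type kernel, its width, and the uniform placement of centers below are assumptions made for the example only, not the paper's construction (which also allows matrix dilates):

```python
import numpy as np

def rbf_periodic_fit(target, n_centers, n_samples=200, width=0.5):
    """Least-squares fit of a 2*pi-periodic target on [-pi, pi] by a
    finite linear combination sum_k a_k * phi(x - x_k) of translates
    of one periodic kernel phi. Returns the sup-norm error on the grid.
    Illustrative only; kernel and center placement are assumptions."""
    x = np.linspace(-np.pi, np.pi, n_samples, endpoint=False)
    centers = np.linspace(-np.pi, np.pi, n_centers, endpoint=False)
    # Wrapped (circular) distance between sample points and centers.
    d = np.angle(np.exp(1j * (x[:, None] - centers[None, :])))
    design = np.exp(-(d / width) ** 2)  # columns = kernel translates
    coeffs, *_ = np.linalg.lstsq(design, target(x), rcond=None)
    return np.max(np.abs(design @ coeffs - target(x)))

# A smooth 2*pi-periodic target; the error shrinks as the number of
# translates in the approximating combination grows.
f = lambda x: np.sin(2 * x) + 0.5 * np.cos(3 * x)
errs = [rbf_periodic_fit(f, n) for n in (4, 8, 16, 32)]
```

The decay of `errs` with the number of centers is the quantity the abstract refers to: the order of approximation measured against the number of translates used.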