Regularised Least-Squares Regression with Infinite-Dimensional Output Space

We present learning-theory results on regularised least-squares regression in a reproducing kernel Hilbert space (RKHS), in the setting where the output space is itself an infinite-dimensional Hilbert space.
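As a concrete illustration of the setting, here is a minimal NumPy sketch of regularised least-squares (kernel ridge) regression with vector-valued outputs. It assumes the separable operator-valued kernel K(x, x') = k(x, x') · Id, so the infinite-dimensional output is approximated by a finite vector of basis coefficients; the function names, the Gaussian kernel choice, and the truncation to d output coordinates are all illustrative assumptions, not the paper's construction.

```python
import numpy as np

def gaussian_kernel(X1, X2, gamma=1.0):
    # Gaussian (RBF) scalar kernel k(x, x') = exp(-gamma * ||x - x'||^2),
    # evaluated between all pairs of rows of X1 and X2.
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * d2)

def fit_krr(X, Y, lam=1e-2, gamma=1.0):
    """Regularised least squares in the RKHS of a scalar kernel k, used with
    the separable operator-valued kernel K(x, x') = k(x, x') * Id.
    Y has one row per sample; each row holds the (truncated) coefficients of
    the Hilbert-space-valued output in some orthonormal basis (an assumption
    made here to keep the example finite-dimensional).
    Closed form: A = (K + n * lam * I)^{-1} Y."""
    n = X.shape[0]
    K = gaussian_kernel(X, X, gamma)
    return np.linalg.solve(K + n * lam * np.eye(n), Y)

def predict_krr(A, X_train, X_new, gamma=1.0):
    # f(x) = sum_i k(x, x_i) * A_i, a vector-valued prediction per input.
    return gaussian_kernel(X_new, X_train, gamma) @ A
```

For example, fitting targets Y whose two columns sample sin and cos of the input recovers both coordinates simultaneously with a single solve, which is the computational appeal of the separable-kernel case.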
