A Non-linear Function Approximation from Small Samples Based on Nadaraya-Watson Kernel Regression

Solving a function approximation problem means finding an appropriate relationship between a dependent variable and one or more independent variables. Function approximation algorithms normally require a sufficient number of samples to approximate a function, and insufficient samples can lead to unsatisfactory predictions from any such algorithm, because the algorithm cannot fill the information gap left by the very limited available samples. In this study, a function approximation algorithm based on Nadaraya-Watson Kernel Regression (NWKR) is proposed for approximating a non-linear function from small samples. The Gaussian function is chosen as the kernel function. The results show that NWKR is effective when the target function is non-linear and the given training sample is small. The performance of NWKR is compared with that of other existing function approximation algorithms, such as artificial neural networks.
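For concreteness, the Nadaraya-Watson estimator predicts the target at a query point x as a kernel-weighted average of the observed targets: y_hat(x) = sum_i K((x - x_i)/h) * y_i / sum_j K((x - x_j)/h), where K is the kernel (here Gaussian) and h is a bandwidth. The sketch below is a minimal illustration of this estimator, not the paper's implementation; the function name nwkr_predict, the default bandwidth, and the ten-point noisy-sine data set are assumptions chosen for the demo.

```python
import numpy as np

def nwkr_predict(x_train, y_train, x_query, h=0.5):
    """Nadaraya-Watson kernel regression with a Gaussian kernel.

    x_train, y_train: 1-D arrays of training inputs and targets.
    x_query: 1-D array of points at which to predict.
    h: kernel bandwidth (illustrative default; tune per data set).
    """
    # Pairwise squared distances between query and training points.
    d2 = (x_query[:, None] - x_train[None, :]) ** 2
    # Gaussian kernel weights for each (query, training) pair.
    w = np.exp(-d2 / (2.0 * h ** 2))
    # Prediction: weighted average of training targets per query point.
    return (w @ y_train) / w.sum(axis=1)

# Small-sample demo on a non-linear target (hypothetical data).
rng = np.random.default_rng(0)
x = np.sort(rng.uniform(-3.0, 3.0, size=10))       # only 10 samples
y = np.sin(x) + 0.1 * rng.standard_normal(10)      # noisy sine target
xq = np.linspace(-3.0, 3.0, 200)
yq = nwkr_predict(x, y, xq, h=0.5)
```

Because every prediction is simply a weighted average of the observed targets, NWKR requires no iterative training, which helps explain why it can remain usable when only a handful of samples are available; the main design decision is the choice of bandwidth h.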
