A general formulation of non-linear least squares regression using multi-layered perceptrons

Non-linear regression and non-linear approximation are widely used for data analysis. In many applications, the aim is to build a model linking the observations and the parameters of a physical system. Two cases of increasing complexity have been studied: the case of deterministic inputs with noisy output data, and the case of noisy input and output data. We present in this paper a general formulation of non-linear regression using multi-layered perceptrons, and derive regression algorithms for both cases. In particular, a generalized learning rule is proposed to deal with noisy input and output data. The algorithm enables one not only to build an accurate model but also to refine the learning data set. The algorithms are tested on two real-world problems in geophysics. The good results suggest that multi-layered perceptrons can emerge as an efficient non-linear regression model for a wide range of applications.
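The first setting the abstract mentions (deterministic inputs, noisy outputs) is standard non-linear least squares with a multi-layered perceptron. The sketch below is an illustrative assumption, not the paper's exact algorithm: a one-hidden-layer tanh network fit by batch gradient descent on the mean-squared-error cost, recovering a noiseless function from noisy output samples.

```python
import numpy as np

rng = np.random.default_rng(0)

# Training data: noisy outputs of a deterministic input grid, y = sin(x) + noise
x = np.linspace(-3.0, 3.0, 200).reshape(-1, 1)
y = np.sin(x) + 0.1 * rng.standard_normal(x.shape)

# One-hidden-layer perceptron: 1 input -> 10 tanh units -> 1 linear output
W1 = 0.5 * rng.standard_normal((1, 10)); b1 = np.zeros(10)
W2 = 0.5 * rng.standard_normal((10, 1)); b2 = np.zeros(1)

lr, n = 0.1, len(x)
for _ in range(5000):
    # Forward pass
    h = np.tanh(x @ W1 + b1)
    y_hat = h @ W2 + b2
    err = y_hat - y                      # residuals of the least-squares cost
    # Backward pass: gradients of the mean squared error
    dW2 = h.T @ err / n
    db2 = err.mean(axis=0)
    dh = (err @ W2.T) * (1.0 - h ** 2)   # tanh derivative
    dW1 = x.T @ dh / n
    db1 = dh.mean(axis=0)
    # Gradient-descent update
    W2 -= lr * dW2; b2 -= lr * db2
    W1 -= lr * dW1; b1 -= lr * db1

# Compare the fitted model against the noiseless underlying function
mse = float(np.mean((np.tanh(x @ W1 + b1) @ W2 + b2 - np.sin(x)) ** 2))
```

In practice the plain gradient step would be replaced by a second-order or adaptive optimizer, and the noisy-input case requires the generalized learning rule the abstract refers to, in which the inputs themselves are adjusted during training.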
