Automatic Closed Modeling of Multiple Variable Systems Using Soft Computation

One of the most interesting goals in engineering and the sciences is the mathematical representation of physical, social and other kinds of complex phenomena. This goal has been pursued and, lately, achieved with different machine learning (ML) tools. ML owes much of its present appeal to the fact that it makes it possible to model complex phenomena without explicitly defining the form of the model. Neural networks and support vector machines exemplify such methods. However, in most cases these methods yield “black box” models, i.e. input and output correspond to the phenomenon under scrutiny but it is very difficult (or outright impossible) to discern the interrelation of the input variables involved. In this paper we address this problem with the explicit aim of obtaining models which are closed in nature, i.e. the aforementioned relation between the variables is explicit. To this end, the only assumption regarding the data, in general, is that the underlying system be approximately continuous. In such cases it is possible to represent the system with polynomial expressions. To do so, one must define the number of monomials, the degree of every variable in every monomial, and the associated coefficients. We model sparse data systems with an algorithm that minimizes the minimax (L∞) norm. From mathematical and experimental evidence we are able to bound the number of terms and the degrees of the approximating polynomials. Thereafter, a genetic algorithm (GA) identifies the coefficients corresponding to the terms and degrees so defined.
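
A minimal sketch of the overall idea, in Python, is given below. The data set, the fixed monomial structure (EXPONENTS), and all GA parameters (population size, elitism fraction, one-point crossover, Gaussian mutation) are illustrative assumptions, not the authors' actual configuration; only the fitness criterion, the minimax (L∞) norm of the residuals, reflects the approach described above.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative data: samples from a hypothetical two-variable system.
X = rng.uniform(-1.0, 1.0, size=(200, 2))
y = 1.0 + 2.0 * X[:, 0] - 0.5 * X[:, 0] * X[:, 1] + 0.25 * X[:, 1] ** 2

# Assumed monomial structure: each row gives the degree of every variable
# in one monomial (here: 1, x1, x2, x1*x2, x2^2).
EXPONENTS = np.array([[0, 0], [1, 0], [0, 1], [1, 1], [0, 2]])

def design_matrix(X, exponents):
    """Evaluate every monomial at every sample point."""
    return np.prod(X[:, None, :] ** exponents[None, :, :], axis=2)

PHI = design_matrix(X, EXPONENTS)

def minimax_error(coeffs):
    """L-infinity (minimax) norm of the residuals."""
    return np.max(np.abs(PHI @ coeffs - y))

# A deliberately plain generational GA over coefficient vectors.
POP, GENS, MUT_SIGMA = 80, 300, 0.1
pop = rng.normal(0.0, 1.0, size=(POP, EXPONENTS.shape[0]))

for gen in range(GENS):
    fitness = np.array([minimax_error(ind) for ind in pop])
    pop = pop[np.argsort(fitness)]        # best individuals first
    elite = pop[: POP // 4]               # elitism: keep the top quarter
    children = []
    while len(children) < POP - len(elite):
        a, b = elite[rng.integers(len(elite), size=2)]
        cut = rng.integers(1, EXPONENTS.shape[0])          # one-point crossover
        child = np.concatenate([a[:cut], b[cut:]])
        child += rng.normal(0.0, MUT_SIGMA, size=child.shape)  # mutation
        children.append(child)
    pop = np.vstack([elite, children])

fitness = np.array([minimax_error(ind) for ind in pop])
best = pop[np.argmin(fitness)]
print("coefficients:", np.round(best, 3))
print("minimax error:", minimax_error(best))
```

Because the best individuals survive each generation unchanged, the minimax error of the reported solution is monotonically non-increasing across generations; in this sketch only the coefficients are evolved, whereas the number of terms and the degrees are fixed in advance, mirroring the bounds established before the GA is run.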
