Shape functional optimization with restrictions boosted with machine learning techniques

Shape optimization is a widely used technique in the design phase of a product. Current continuous-improvement policies require a product to satisfy a series of conditions relating to mechanical resistance, fatigue, natural frequency, impact resistance, and so on. All these conditions translate into equality or inequality constraints that must be satisfied during the optimization process used to determine the optimal shape. This article describes a new method for shape optimization that admits any regular shape as a candidate, thereby improving on traditional methods limited to straight profiles or profiles established a priori. Our approach is functional: the shape of the object is represented by functions belonging to a finite-dimensional functional space. To solve the resulting problem, the article proposes an optimization method that uses machine learning techniques for functional data, both to represent the boundary of the set of feasible functions and to speed up the evaluation of the constraints at each iteration of the algorithm. The results demonstrate that the functional approach produces better results in the shape optimization process, and that accelerating the algorithm with machine learning techniques ensures the approach does not degrade design-process response times.
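The idea described in the abstract can be illustrated with a minimal, self-contained sketch. It is not the authors' method: where the paper uses functional SVMs or neural networks, this sketch swaps in a plain logistic-regression classifier, and all names, the Fourier parametrization, the area constraint, and the pattern-search loop are illustrative assumptions. A closed profile is represented by the coefficients of a truncated Fourier series (a point in a finite-dimensional functional space); a surrogate classifier is trained on shapes labeled by the "expensive" feasibility check (here, a minimum enclosed area) and then used as a cheap stand-in for that check inside a pattern-search optimizer that minimizes the perimeter:

```python
import math
import random

# Shape representation: a closed profile r(theta) given by a truncated
# Fourier series with K harmonics (coefficients a0, a1, b1, ..., aK, bK).
K = 2
DIM = 1 + 2 * K
N = 256                     # angular samples for numerical integration
A_MIN = math.pi             # feasibility constraint: enclosed area >= A_MIN

def radius(c, t):
    r = c[0]
    for k in range(1, K + 1):
        r += c[2 * k - 1] * math.cos(k * t) + c[2 * k] * math.sin(k * t)
    return r

def area(c):
    # A = (1/2) * integral of r(theta)^2; trapezoid rule on a periodic grid.
    return 0.5 * sum(radius(c, 2 * math.pi * i / N) ** 2
                     for i in range(N)) * (2 * math.pi / N)

def perimeter(c):
    pts = [(radius(c, 2 * math.pi * i / N) * math.cos(2 * math.pi * i / N),
            radius(c, 2 * math.pi * i / N) * math.sin(2 * math.pi * i / N))
           for i in range(N)]
    return sum(math.dist(pts[i], pts[(i + 1) % N]) for i in range(N))

def features(c):
    # The enclosed area is linear in the squared Fourier coefficients, so a
    # linear model on this feature map can represent the feasibility boundary.
    return [ci * ci for ci in c] + [1.0]

def train_surrogate(n_samples=400, epochs=300, lr=0.1, seed=0):
    # Label random candidate shapes with the "expensive" feasibility check,
    # then fit a logistic-regression classifier as a cheap stand-in for it.
    rng = random.Random(seed)
    data = []
    for _ in range(n_samples):
        c = [rng.uniform(-1.5, 1.5) for _ in range(DIM)]
        data.append((features(c), 1.0 if area(c) >= A_MIN else 0.0))
    w = [0.0] * (DIM + 1)
    for _ in range(epochs):
        for x, y in data:
            z = max(-30.0, min(30.0, sum(wj * xj for wj, xj in zip(w, x))))
            g = 1.0 / (1.0 + math.exp(-z)) - y
            for j in range(DIM + 1):
                w[j] -= lr * g * x[j]
    return w

def surrogate_feasible(w, c):
    return sum(wj * xj for wj, xj in zip(w, features(c))) >= 0.0

def optimize_shape(w, start, step=0.2, tol=1e-3):
    # Coordinate pattern search: minimize the perimeter, consulting the
    # surrogate (instead of the expensive check) inside the penalty term.
    def cost(c):
        return perimeter(c) + (0.0 if surrogate_feasible(w, c) else 1e3)
    c, best = list(start), cost(start)
    while step > tol:
        improved = False
        for j in range(DIM):
            for d in (step, -step):
                cand = list(c)
                cand[j] += d
                v = cost(cand)
                if v < best:
                    c, best, improved = cand, v, True
        if not improved:
            step *= 0.5
    return c
```

With `A_MIN = math.pi` the true optimum is the unit circle (perimeter 2π), so the search, starting from a feasible wiggly profile, should end near a circle whose true area sits close to the constraint boundary, up to the surrogate's approximation error. The point of the construction is that the optimizer's many constraint queries hit the cheap classifier rather than the expensive evaluation (in the paper's setting, a finite-element analysis).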
