Active Learning for Feasible Region Discovery

At the start of an engineering design process, the design specifications of the system are often not completely known. Usually, however, some physical constraints are known in advance; they define a region of interest in the design space that is called feasible. These constraints often have no analytical form and must be characterized through expensive simulations or measurements. It is therefore important that the feasible region can be modeled sufficiently accurately using only a limited number of samples. Active learning techniques address this by selecting the samples that are most informative for the quantity being modeled. Most active learning strategies target either classification models, with classification accuracy in mind, or regression models, with regression accuracy in mind. In this work, regression models of the constraints are used, but only the (in)feasibility is of interest. To tackle this problem, an information-theoretic sampling strategy is constructed to discover the feasible region. The proposed method is tested on two synthetic examples and one engineering example and is shown to outperform the current state of the art.

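The abstract only describes the approach at a high level, so the following is a minimal sketch of the general idea of entropy-driven feasible region discovery, not the paper's exact acquisition function. It assumes a Gaussian process regression model of a single black-box constraint g(x) <= 0; the toy `constraint` function, the `feasibility_entropy` helper, the threshold of 0, the candidate pool, and all hyperparameters are illustrative choices. At each step, the candidate whose predicted feasibility is most uncertain, measured by the binary entropy of the feasibility probability under the GP posterior, is evaluated next.

```python
# Sketch: active learning for feasible region discovery with a GP regression
# model of the constraint. Feasibility is defined as g(x) <= 0 (assumed here).
# The acquisition picks the candidate whose predicted feasibility is most
# ambiguous, i.e. maximizes the entropy of the Bernoulli feasibility probability.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)


def constraint(x):
    """Toy stand-in for an expensive black-box constraint; feasible where g(x) <= 0."""
    return np.sin(3.0 * x[:, 0]) + 0.5 * x[:, 1] ** 2 - 0.5


def feasibility_entropy(model, X_cand, threshold=0.0):
    """Binary entropy of P[g(x) <= threshold] under the GP posterior."""
    mu, sigma = model.predict(X_cand, return_std=True)
    p = norm.cdf((threshold - mu) / np.maximum(sigma, 1e-12))
    p = np.clip(p, 1e-12, 1.0 - 1e-12)
    return -(p * np.log(p) + (1.0 - p) * np.log(1.0 - p))


# Small initial design, then sequentially add the most informative candidate.
X = rng.uniform(-1.0, 1.0, size=(5, 2))
y = constraint(X)
X_cand = rng.uniform(-1.0, 1.0, size=(2000, 2))    # dense candidate pool

for _ in range(20):                                 # sampling budget
    gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5),
                                  normalize_y=True).fit(X, y)
    scores = feasibility_entropy(gp, X_cand)
    x_next = X_cand[np.argmax(scores)][None, :]     # most ambiguous point
    X = np.vstack([X, x_next])
    y = np.append(y, constraint(x_next))

# Refit on all collected samples and report the estimated feasible fraction.
gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5),
                              normalize_y=True).fit(X, y)
print("Estimated feasible fraction of the pool:",
      np.mean(gp.predict(X_cand) <= 0.0))
```

The argmax over a random candidate pool keeps the sketch short; in practice the acquisition would be maximized with a proper optimizer, multiple constraints would be handled jointly, and the plain binary-entropy criterion above would be replaced by the information-theoretic measure the paper itself proposes.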