Data-Driven Design via Scalable Gaussian Processes for Multi-Response Big Data With Qualitative Factors
Scientific and engineering problems often require an inexpensive surrogate model to aid understanding and the search for promising designs. While Gaussian processes (GP) stand out as easy-to-use and interpretable learners in surrogate modeling, they have difficulty accommodating big datasets, qualitative inputs, and multi-type responses obtained from different simulators, which has become a common challenge for a growing number of data-driven design applications. In this paper, we propose a GP model that utilizes latent variables and functions obtained through variational inference to address the aforementioned challenges simultaneously. The method is built upon the latent variable Gaussian process (LVGP) model, in which qualitative factors are mapped into a continuous latent space to enable GP modeling of mixed-variable datasets. By extending variational inference to LVGP models, the large training dataset is replaced by a small set of inducing points to address the scalability issue. Output response vectors are represented by a linear combination of independent latent functions, forming a flexible kernel structure to handle multi-type responses. Comparative studies demonstrate that the proposed method scales well for large datasets with over 10^4 data points, while outperforming state-of-the-art machine learning methods without requiring much hyperparameter tuning. In addition, an interpretable latent space is obtained to draw insights into the effect of qualitative factors, such as those associated with “building blocks” of architectures and element choices in metamaterial and materials design. Our approach is demonstrated for machine learning of ternary oxide materials and topology optimization of a multiscale compliant mechanism with aperiodic microstructures and multiple materials.
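To make the three ingredients concrete, the following is a minimal sketch (not the authors' implementation) of how they can be combined in GPyTorch: a learnable embedding stands in for the LVGP mapping of a qualitative factor into a continuous latent space, sparse variational inference replaces the full training set with inducing points, and an LMC variational strategy mixes independent latent functions into a multi-response output. All names and sizes (latent_dim, num_inducing, num_latent_fns, the toy data) are illustrative assumptions.

# Minimal sketch, assuming GPyTorch: latent embedding for a qualitative factor,
# inducing-point (sparse) variational inference, and an LMC mixture of
# independent latent functions for a two-response output.
import torch
import gpytorch


class LatentVariableMultitaskSVGP(gpytorch.models.ApproximateGP):
    def __init__(self, num_levels, num_quant, latent_dim=2,
                 num_inducing=64, num_latent_fns=3, num_tasks=2):
        input_dim = num_quant + latent_dim
        # Inducing points live in the joint (quantitative + latent) input space.
        inducing = torch.randn(num_latent_fns, num_inducing, input_dim)
        var_dist = gpytorch.variational.CholeskyVariationalDistribution(
            num_inducing, batch_shape=torch.Size([num_latent_fns]))
        # Independent latent GPs are linearly mixed into the multi-response output.
        strategy = gpytorch.variational.LMCVariationalStrategy(
            gpytorch.variational.VariationalStrategy(
                self, inducing, var_dist, learn_inducing_locations=True),
            num_tasks=num_tasks, num_latents=num_latent_fns, latent_dim=-1)
        super().__init__(strategy)
        # Each level of the qualitative factor gets a point in a continuous latent space.
        self.embedding = torch.nn.Embedding(num_levels, latent_dim)
        self.mean_module = gpytorch.means.ConstantMean(
            batch_shape=torch.Size([num_latent_fns]))
        self.covar_module = gpytorch.kernels.ScaleKernel(
            gpytorch.kernels.RBFKernel(batch_shape=torch.Size([num_latent_fns])),
            batch_shape=torch.Size([num_latent_fns]))

    def forward(self, x):
        return gpytorch.distributions.MultivariateNormal(
            self.mean_module(x), self.covar_module(x))

    def embed_inputs(self, x_quant, x_qual):
        # Concatenate quantitative inputs with the learned latent coordinates.
        return torch.cat([x_quant, self.embedding(x_qual)], dim=-1)


# Toy usage: 2 quantitative inputs, one 4-level qualitative factor, 2 responses.
n = 500
x_quant = torch.rand(n, 2)
x_qual = torch.randint(0, 4, (n,))
y = torch.stack([x_quant.sum(-1) + 0.1 * x_qual, x_quant.prod(-1)], dim=-1)

model = LatentVariableMultitaskSVGP(num_levels=4, num_quant=2)
likelihood = gpytorch.likelihoods.MultitaskGaussianLikelihood(num_tasks=2)
mll = gpytorch.mlls.VariationalELBO(likelihood, model, num_data=n)
opt = torch.optim.Adam(list(model.parameters()) + list(likelihood.parameters()), lr=0.05)

model.train()
likelihood.train()
for _ in range(200):
    opt.zero_grad()
    output = model(model.embed_inputs(x_quant, x_qual))
    loss = -mll(output, y)  # negative stochastic variational ELBO
    loss.backward()
    opt.step()

Note that the LVGP formulation in the paper estimates the latent coordinates of qualitative levels with identifiability constraints; the plain embedding above is only meant to illustrate the joint continuous space in which the inducing points and kernel operate.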