Bayesian Convolutional Neural Networks as probabilistic surrogates for the fast prediction of stress fields in structures with microscale features

Finite Element Analysis (FEA) for stress prediction in structures with microstructural features is computationally expensive because those features are much smaller than the other geometric features of the structure. Accurate prediction of the additional stress generated by such microstructural features therefore requires a very fine FE mesh. Omitting or averaging the effect of the microstructural features in FEA models is standard practice, resulting in faster calculations of global stress fields, which, assuming some degree of scale separability, may then be complemented by local defect analyses. The purpose of this work is to train an Encoder-Decoder Convolutional Neural Network (CNN) to automatically add local fine-scale stress corrections to coarse stress predictions around defects. We wish to understand to what extent such a framework may provide reliable stress predictions inside and outside the training set, i.e., for unseen coarse-scale geometries and stress distributions and/or unseen defect geometries. Ultimately, we aim to develop efficient offline data generation and online data acquisition methods to maximise the domain of validity of the CNN predictions. To achieve these ambitious goals, we deploy a Bayesian approach providing not point estimates but credible intervals of the fine-scale stress field, as a means to evaluate the uncertainty of the predictions. The uncertainty quantified by the network automatically encompasses the lack of knowledge due to unseen macro- and micro-scale features, as well as the lack of knowledge due to a potential lack of scale separability. This uncertainty is used in a Selective Learning framework to reduce the data requirements of the network. In this work we investigate stress prediction in 2D composite structures with randomly distributed circular pores.
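As an illustration of the Bayesian prediction step described above, the sketch below shows how Monte Carlo dropout can turn a deterministic surrogate into one that outputs pixelwise credible intervals: dropout is kept active at inference, repeated stochastic forward passes sample an approximate predictive distribution, and empirical percentiles give the interval. The network here is a hypothetical stand-in (a two-layer map with placeholder weights `W1`, `W2`), not the paper's actual encoder-decoder architecture; all values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for a trained encoder-decoder surrogate:
# a two-layer map with placeholder "trained" weights.
W1 = rng.normal(size=(32, 16))
W2 = rng.normal(size=(16, 32))

def stochastic_forward(x, p_drop=0.5):
    """One Monte Carlo dropout forward pass (dropout active at inference)."""
    h = np.maximum(x @ W1, 0.0)              # encoder layer + ReLU
    mask = rng.random(h.shape) > p_drop      # Bernoulli dropout mask
    h = h * mask / (1.0 - p_drop)            # inverted-dropout scaling
    return h @ W2                            # decoder layer -> stress field

# Coarse-scale input stress field (flattened), illustrative values.
x = rng.normal(size=32)

# T stochastic passes sample the approximate Bayesian predictive distribution.
T = 200
samples = np.stack([stochastic_forward(x) for _ in range(T)])

mean = samples.mean(axis=0)                            # point prediction
lo, hi = np.percentile(samples, [2.5, 97.5], axis=0)   # 95% credible interval
width = hi - lo                                        # pixelwise uncertainty
```

In a Selective Learning loop, a quantity such as `width` (or the sample variance) would rank candidate inputs, so that new fine-scale FEA data is generated only where the surrogate is least certain.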
