Migrating Knowledge between Physical Scenarios based on Artificial Neural Networks

Deep learning is known to be data-hungry, which hinders its application in many areas of science when datasets are small. Here, we propose to use transfer learning methods to migrate knowledge between different physical scenarios and significantly improve the prediction accuracy of artificial neural networks trained on a small dataset. This method can help reduce the demand for expensive data by making use of additional inexpensive data. First, we demonstrate that, in predicting the transmission of multilayer photonic films, the relative error rate is reduced by 46.8% (26.5%) when the source data come from 10-layer (8-layer) films and the target data come from 8-layer (10-layer) films. Second, we show that the relative error rate is decreased by 22% when knowledge is transferred between two very different physical scenarios: transmission through multilayer films and scattering from multilayer nanoparticles. Finally, we propose a multi-task learning method that improves performance across different physical scenarios simultaneously, where each task has only a small dataset.
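A minimal sketch of the transfer step described above, assuming a fully connected regression network (PyTorch) that maps layer thicknesses to a transmission spectrum; the data shapes, hidden sizes, learning rates, and the choice to re-initialize only the input layer are illustrative assumptions, not details taken from the paper.

```python
# Sketch: pretrain on a large source dataset, then fine-tune on a small target set.
# All shapes and hyperparameters below are placeholders (assumptions).
import torch
import torch.nn as nn

def make_mlp(n_in, n_out, hidden=(256, 256, 256)):
    """Fully connected network: layer thicknesses -> transmission spectrum."""
    layers, d = [], n_in
    for h in hidden:
        layers += [nn.Linear(d, h), nn.ReLU()]
        d = h
    layers.append(nn.Linear(d, n_out))
    return nn.Sequential(*layers)

def train(model, x, y, epochs, lr):
    """Full-batch MSE regression training loop."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        opt.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()
        opt.step()
    return model

# 1) Pretrain on the large, inexpensive source dataset (e.g. 10-layer films).
#    Random tensors stand in for the actual simulated data.
src_x, src_y = torch.randn(50_000, 10), torch.randn(50_000, 200)
model = make_mlp(n_in=10, n_out=200)
train(model, src_x, src_y, epochs=200, lr=1e-3)

# 2) Swap the input layer to match the target geometry (8-layer films),
#    keep the remaining pretrained weights, and fine-tune on the small target set.
model[0] = nn.Linear(8, model[0].out_features)
tgt_x, tgt_y = torch.randn(500, 8), torch.randn(500, 200)
train(model, tgt_x, tgt_y, epochs=100, lr=1e-4)
```

The multi-task variant mentioned in the abstract could analogously share the hidden layers across tasks while attaching a separate output head per physical scenario, though the specific architecture used in the paper is not stated here.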
