Incorporating Prior Domain Knowledge into Deep Neural Networks

In recent years, the abundance of labeled data has steered research, particularly in deep neural networks, toward using minimal domain knowledge. However, in many situations data is limited and of poor quality. Can domain knowledge be useful in such a setting? In this paper, we propose domain adapted neural networks (DANN) to explore how domain knowledge can be integrated into model training for deep networks. In particular, we incorporate loss terms for knowledge available as monotonicity constraints and approximation constraints. We evaluate our model on both synthetic data generated using the popular Bohachevsky function and a real-world dataset for predicting oxygen solubility in water. In both settings, we find that our DANN model outperforms its domain-agnostic counterpart, yielding an overall mean performance improvement of 19.5%, with worst- and best-case performance improvements of 4% and 42.7%, respectively.
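The domain-knowledge loss terms described above can be sketched as hinge-style penalties added to the usual data loss. The following is a minimal NumPy illustration, not the paper's exact formulation: the function names, the finite-difference check via `delta`, and the tolerance `tol` are all illustrative assumptions.

```python
import numpy as np

def monotonicity_penalty(model, x, delta=1e-2):
    """Penalize violations of a non-decreasing constraint f(x) <= f(x + delta).

    `model` is any callable mapping inputs to predictions; the finite
    difference `delta` is a hypothetical choice for checking monotonicity.
    """
    diff = model(x) - model(x + delta)       # positive where monotonicity is violated
    return np.mean(np.maximum(diff, 0.0))    # zero when the constraint holds

def approximation_penalty(preds, approx, tol=0.5):
    """Penalize predictions straying more than `tol` from a known approximation."""
    excess = np.abs(preds - approx) - tol
    return np.mean(np.maximum(excess, 0.0))

# Usage: fold the penalties into training as
#   total_loss = data_loss + lam1 * monotonicity_penalty(...) + lam2 * approximation_penalty(...)
x = np.linspace(0.0, 1.0, 100)
increasing = lambda z: 2.0 * z   # satisfies the non-decreasing constraint
decreasing = lambda z: -2.0 * z  # violates it
print(monotonicity_penalty(increasing, x))  # 0.0
print(monotonicity_penalty(decreasing, x) > 0.0)  # True
```

Because both penalties are zero wherever the constraints are satisfied, they only shape the model in regions where its predictions conflict with the prior knowledge.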
