Mixed Effect Composite RNN-GP: A Personalized and Reliable Prediction Model for Healthcare

We present a personalized and reliable prediction model for healthcare, which can provide individually tailored medical services such as diagnosis, disease treatment, and prevention. Our framework aims to make reliable predictions from time-series data, such as Electronic Health Records (EHR), by modeling two complementary components: i) a shared component that captures the global trend across diverse patients, and ii) a patient-specific component that models the idiosyncratic variability of each patient. To this end, we propose a composite model that combines a deep recurrent neural network (RNN), which exploits the expressive power of RNNs to estimate global trends from a large number of patients, with Gaussian Processes (GP), which probabilistically model each individual time series given a relatively small number of time points. We evaluate our model on diverse and heterogeneous tasks on EHR datasets. The results show that it significantly outperforms baselines such as a plain RNN, demonstrating a clear advantage over existing models when working with noisy medical data.
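
As a rough illustration of the composite idea described above, the sketch below is a minimal mock-up, not the authors' released implementation: an LSTM supplies the shared (population-level) mean trajectory, and a per-patient GP with an RBF kernel models the residuals between that mean and the patient's own observations. All names such as `GlobalRNN`, `patient_gp_posterior`, and the kernel/noise settings are hypothetical choices made for this sketch.

```python
import torch
import torch.nn as nn


class GlobalRNN(nn.Module):
    """Shared component: an LSTM that learns the population-level trend."""
    def __init__(self, n_features, hidden=32):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):                     # x: (batch, time, n_features)
        h, _ = self.lstm(x)
        return self.head(h).squeeze(-1)       # shared mean per time step


def rbf_kernel(t1, t2, lengthscale=1.0, variance=1.0):
    """Squared-exponential kernel over time stamps."""
    d = t1.unsqueeze(-1) - t2.unsqueeze(-2)
    return variance * torch.exp(-0.5 * (d / lengthscale) ** 2)


def patient_gp_posterior(t_obs, resid_obs, t_new, noise=0.1):
    """Patient-specific component: exact GP regression on the residuals
    left after subtracting the shared RNN mean."""
    K = rbf_kernel(t_obs, t_obs) + noise * torch.eye(len(t_obs))
    K_s = rbf_kernel(t_new, t_obs)
    K_ss = rbf_kernel(t_new, t_new)
    alpha = torch.linalg.solve(K, resid_obs)
    mean = K_s @ alpha
    cov = K_ss - K_s @ torch.linalg.solve(K, K_s.T)
    return mean, cov                           # predictive mean and uncertainty


def predict(rnn, x_obs, y_obs, t_obs, x_new, t_new):
    """Composite prediction for one patient: shared RNN trend + personal GP."""
    mu_shared_obs = rnn(x_obs).squeeze(0)      # global trend at observed times
    mu_shared_new = rnn(x_new).squeeze(0)      # global trend at query times
    resid = y_obs - mu_shared_obs.detach()     # patient-specific deviation
    gp_mean, gp_cov = patient_gp_posterior(t_obs, resid, t_new)
    return mu_shared_new + gp_mean, gp_cov     # personalized mean + variance
```

The returned covariance is what makes the prediction "reliable" in the sense of the abstract: the GP supplies per-patient uncertainty around the shared RNN trend, which a full implementation would train jointly rather than in the two detached stages shown here.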
