Modelling traditional Chinese medicine therapy planning with POMDP

During traditional Chinese medicine (TCM) treatment, a patient's manifestations can be observed, but the patient's underlying health state and TCM diagnosis remain uncertain. Real-world TCM therapy planning is therefore a typical problem of dynamic decision making under uncertainty. The partially observable Markov decision process (POMDP) is a powerful mathematical model for planning under uncertainty and is well suited to TCM therapy planning. In this paper, we apply a POMDP to the TCM therapy planning problem, with all model dynamics inferred from TCM clinical data on type 2 diabetes treatment. The resulting POMDP model contains 55 health states, 67 observation variables, and 414 actions, and it can recommend prescriptions for patients with type 2 diabetes. The results demonstrate that the POMDP model for TCM therapy planning is reasonable and helpful in clinical practice.
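The paper itself gives no code, so the following is only a minimal Python sketch of the belief-update step at the core of any POMDP-based therapy planner of this kind. It makes two simplifying assumptions that are not from the paper: the 67 observation variables are collapsed into a single discrete observation per step, and the transition and observation probabilities (which the paper infers from TCM clinical data) are replaced by random placeholders.

```python
import numpy as np

# Dimensions taken from the paper's model: 55 health states, 414 actions
# (prescriptions), 67 observation variables. For this sketch the 67
# observation variables are simplified to one discrete observation.
N_STATES = 55
N_ACTIONS = 414
N_OBS = 67

rng = np.random.default_rng(0)

# Hypothetical placeholders for the clinically inferred dynamics:
# T[a, s, s'] = P(s' | s, a), O[a, s', o] = P(o | s', a).
T = rng.dirichlet(np.ones(N_STATES), size=(N_ACTIONS, N_STATES))
O = rng.dirichlet(np.ones(N_OBS), size=(N_ACTIONS, N_STATES))

def belief_update(belief, action, observation):
    """Bayesian belief update: b'(s') ∝ O(o | s', a) * Σ_s T(s' | s, a) b(s)."""
    predicted = T[action].T @ belief              # Σ_s T(s' | s, a) b(s)
    updated = O[action][:, observation] * predicted
    norm = updated.sum()
    if norm == 0.0:
        raise ValueError("Observation has zero probability under the model.")
    return updated / norm

# Example: start from a uniform belief over health states, apply one
# (arbitrary) prescription action, and observe symptom index 3.
b0 = np.full(N_STATES, 1.0 / N_STATES)
b1 = belief_update(b0, action=0, observation=3)
print(b1.sum())  # ≈ 1.0: the updated belief is still a probability distribution
```

A planner for this model would repeat this update after every prescription and observed manifestation, and choose the next action by evaluating a policy over the resulting belief (e.g., via point-based value iteration); that policy computation is omitted here.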
