Modeling Medical Treatment Using Markov Decision Processes
Andrew J. Schaefer | Matthew D. Bailey | Mark S. Roberts | Steven M. Shechter