Use of real-world evidence in economic assessments of pharmaceuticals in the United States

BACKGROUND: Despite increasing interest in expanding the use of real-world evidence (RWE) in economic assessments of pharmaceuticals, decision makers face uncertainty about how RWE should be used. OBJECTIVE: To assess the use of RWE in economic assessments of drugs by the Institute for Clinical and Economic Review (ICER). METHODS: We reviewed cost-effectiveness and budget impact analyses in final evidence reports of pharmaceuticals published by ICER. We calculated the total number of RWE uses and the proportion of model inputs informed by RWE per report. We classified model inputs into 15 categories based on their attributes and then examined which category each use of RWE informed in order to classify the reason for RWE use. Finally, we characterized RWE by study design, data source, and sponsor type. RESULTS: We identified 33 reports, all of which used RWE; the mean number of RWE uses per report was 12 (range = 4-26). The average proportion of model inputs informed by RWE per report was 32.7%, although this proportion varied widely across reports (range = 4.1%-76.9%). RWE was most commonly used to inform disease progression inputs (28.7%) and health care resource utilization and costs (21.1%), but was rarely used for drug-specific clinical outcomes such as effectiveness (1.5%), adverse drug event rates (0.5%), and discontinuation rates (1.2%). The most frequently used study design was the retrospective cohort (56.6%), and the most frequently used data source was registry data (41.4%). About a third (30.2%) of RWE was industry-sponsored. CONCLUSIONS: RWE has been commonly used to inform pharmaceutical value assessments conducted by ICER. However, RWE has seen relatively limited use in informing drug-specific effectiveness, despite calls for greater inclusion of RWE on real-world drug effectiveness in value assessments.
