Use of PRECIS-2 Ratings to Advance Understanding of Pragmatic Trial Design Domains: NIH Funded Trials (USA). Paula Darby Lipman, Ph.D., Westat, Rockville, Maryland, USA


Use of PRECIS-2 Ratings to Advance Understanding of Pragmatic Trial Design Domains: NIH Funded Trials (USA) Paula Darby Lipman, Ph.D. Westat Rockville, Maryland, USA Workshop: PRECIS-2 - Precisely how can this tool help investigators design trials to achieve practical answers to “real-world” questions? Society for Clinical Trials Annual Meeting Liverpool, United Kingdom May 8, 2017

Presentation Objectives
- Briefly introduce the National Institutes of Health (NIH) Pragmatic Trials Collaborative Project
- Describe how we incorporated PRECIS-2
- Present results of a study using PRECIS-2 ratings to advance understanding of pragmatic trial design domains

Acknowledgments
- Pragmatic Trials Collaborative Project investigators and research staff
- National Heart, Lung, and Blood Institute (NHLBI) and National Institute on Aging (NIA) program officers
- Co-investigator: Sean Tunis, MD (Center for Medical Technology Policy)
- NHLBI Program Officer: Kate Stoney, PhD
Supported by NHLBI Grant 1R01HL125114-01 to Westat, a health research organization in Rockville, Maryland.

RFA-HL-14-019: Low-Cost, Pragmatic, Patient-Centered Randomized Controlled Intervention Trials
Purpose: To encourage investigators to use existing resources to conduct low-cost, pragmatic, patient-centered randomized controlled trials of interventions targeted at patients, families, health care providers, communities, or health care systems, through the integration of RCTs into existing clinical practice settings.
Award budget: $2.3M total direct cost (5 years)
Project kick-off: November 14, 2014

2013: Statement of the Problem
Only a small part of routine medical therapy provided to individual patients is based on the highest level of evidence. Why?
- Studies with highly restrictive inclusion/exclusion criteria
- Limited testing of health care delivery strategies
- High cost of RCTs as they are currently conducted
Conclusion: To meet the increasing demand for high-quality evidence in the health care arena, efficient, low-cost strategies for the conduct of large-scale, high-impact RCTs are needed.

Awardees of RFA-HL-14-019 (Principal Investigator(s) / Study Title / Agency)
- Michael Simon Avidan, MD, Washington University: Electroencephalograph Guidance of Anesthesia to Alleviate Geriatric Syndromes (NIA)
- Daniel Buysse, MD, University of Pittsburgh: Pragmatic Trials of Behavioral Interventions for Insomnia in Hypertensive Patients (NHLBI)
- Michelle Gong, MD, Einstein College of Medicine, and Ognjen Gajic, MD, Mayo Clinic: Prevention of Severe Acute Respiratory Failure in Patients with PROOFCheck, an Electronic Checklist to Prevent Organ Failure
- Scott David Halpern, MD, PhD, University of Pennsylvania: Default Palliative Care Consultation for Seriously Ill Hospitalized Patients
- Henry Wang, MD, University of Alabama at Birmingham: Pragmatic Trial of Airway Management in Out-of-Hospital Cardiac Arrest
Studies are very different; these teams are early adopters in this arena.

Westat’s Role
- Monitored achievement of trial planning milestones
- Held quarterly calls with awardees to review progress
- Planned agendas for annual in-person meetings, which bring together PIs, members of their research teams, and NIH program officers
- Supported collaborative activities, including innovative use of the Pragmatic-Explanatory Continuum Indicator Summary (PRECIS-2) instrument

Pragmatic Trials Project: PRECIS-2 Study Objectives
- Assess the degree to which the Pragmatic Trials were pragmatic at the planning phase
- Study whether and how trial design changed from the planning phase to the implementation phase
- Identify domains that may be more challenging, and why (similar in design to Johnson et al., 2016, from the NIH Collaboratory)

Methods
Design: Mixed methods [quan → QUAL], with sequential collection and analysis of quantitative data (PRECIS-2 ratings) and qualitative data (interviews with trial PIs).
Quantitative data collection:
- Time 1 (planning) – following a brief training (Feb '15)
- Time 2 (implementation) – after a refresher session (Apr '16)
Qualitative data collection:
- Interviews with each PI (Summer '16)
- Primary goals: discuss experiences and impressions of the tool, and discuss domains with a change in rating

Findings: Experience with the Tool
Two of the PIs had used the original PRECIS tool to assess the design of their protocols; the other three PIs were not familiar with the tool (or its predecessor) prior to the project. All PIs agreed that the tool would have been helpful at the design phase of their trials.
How pragmatic were the trials at the planning phase?

PRECIS-2 Results: Planning Phase

Findings (continued)
As reflected in rating changes, how did trial design change over time?
- Total of 45 paired ratings (5 trials x 9 domains)
- Of these, 24 had a rating change; these were the focus of the qualitative data collection and analysis
- One or more rating shifts occurred for each trial and across all nine domains
- Flexibility (Adherence) and Follow-up accounted for the most shifts
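The paired-rating comparison described above can be sketched as a small computation: for each trial and each of the nine PRECIS-2 domains, compare the Time 1 (planning) score with the Time 2 (implementation) score and collect the pairs that differ. The ratings below are invented placeholders for illustration only, not the study's actual data.

```python
# The nine PRECIS-2 domains, each scored 1 (very explanatory) to 5 (very pragmatic).
DOMAINS = [
    "Eligibility", "Recruitment", "Setting", "Organization",
    "Flexibility (Delivery)", "Flexibility (Adherence)",
    "Follow-up", "Primary Outcome", "Primary Analysis",
]

# Hypothetical ratings: ratings[trial][domain] = (time1_score, time2_score).
# These values are made up for the sketch; a real dataset would hold
# 45 pairs (5 trials x 9 domains).
ratings = {
    "Trial A": {"Eligibility": (4, 3), "Follow-up": (5, 5)},
    "Trial B": {"Eligibility": (5, 5), "Follow-up": (4, 2)},
}

def rating_shifts(ratings):
    """Return (trial, domain, t1, t2) for every paired rating that changed."""
    shifts = []
    for trial, by_domain in ratings.items():
        for domain, (t1, t2) in by_domain.items():
            if t1 != t2:
                shifts.append((trial, domain, t1, t2))
    return shifts

shifts = rating_shifts(ratings)
```

In the study, the 24 changed pairs identified this way became the focus of the follow-up PI interviews.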

Findings (continued)
What did the shifts in rating reflect?
- Only 3 represented a true change in trial design
- The other 21 reflected a change in the raters' understanding of the PRECIS-2 tool rather than a change in design
Three domains required clarification: Eligibility, Flexibility (Adherence), and Follow-up.

Eligibility #1 PI: “It’s a little more work to figure out patients who are…excluded…I just thought it was going to be very, very easy and you don’t have to think about it. But it turns out actually I have to have my staff validate it.” Less pragmatic due to additional effort needed to identify appropriate patients and validate correct identification. Eligibility refers to the extent to which the trial population matches the population intended for the intervention. The issue of effort to engage participants is more relevant to Recruitment Path, which addresses whether effort to recruit participants is greater than for patient engagement in usual care.

Eligibility #2 PI: “Once we started applying the criteria, we recognized… there are some people who we’ve excluded and I think they’re for good reasons. We haven’t changed the criteria, it’s just that as we’ve been applying them, we realized that it excludes a larger percentage of people perhaps than we thought.” Rated less pragmatic due to a higher proportion of patients excluded than originally anticipated. However, the volume of patient exclusion is not in itself a consideration for pragmatism, though it does bear on how representative the trial participants are of most patients. For the Eligibility domain, one should consider the extent to which trial participants are similar to those who would receive the intervention if it were part of usual care.

Flexibility - Adherence #1 PI: “When we were in the planning phase, [it wasn’t] clear to us exactly how we were going to notify the clinicians…” Rated more pragmatic because clinician notification [part of the intervention] was more automated than anticipated (requiring less effort). However, resource requirements are addressed under Organization, and this domain should not have been rated at all, as there is no monitoring of patient adherence in this trial.

Flexibility - Adherence #2 PI: “We’re not excluding anybody based on adherence, but we are encouraging adherence and are providing feedback on adherence.” Rated more pragmatic because no participants are excluded due to adherence, or adherence is minimal. This domain addresses how flexibly participants in the trial are monitored and encouraged compared to usual care.

Flexibility - Adherence #3 PI: “Our intervention really is executed and then it’s done, so the adherence of it is actually very minimal and the remainder of care given thereafter is just standard of care.” Rated more pragmatic because no participants are excluded due to adherence, or adherence is minimal. For surgical trials or trauma care, where there is no compliance issue after consent has been given, the domain is not applicable and should be left blank (unrated).

Follow-up #1 PI: “In clinical care, one would not necessarily seek out follow-up on patients, such as their primary outcome or survival… what made us think that it was less pragmatic was the manner by which you seek out that information.” Rated less pragmatic because data needed from medical records are not routinely collected. However, that reasoning does not apply to this domain, which is concerned only with the burden of follow-up on the participants, not with whether the follow-up data are routinely collected.

Follow-up #2 PI: “I have to apply in a separate IRB to a statistics department to get that long term follow-up. And that requires linking of the patient’s data. So that’s just a little less automatic…more work for me. For patients it’s the same.” Rated less pragmatic because collection of follow-up data is less automatic than anticipated. However, this domain is concerned only with the burden of follow-up on the participants, not the burden on the research team or the effort needed to collect the follow-up data.

Conclusions
- Not surprising to find that PRECIS-2 tool definitions and instructions were sometimes misunderstood or misapplied
- Positive response to using PRECIS-2 to guide conversations around trial design
- Provided valuable feedback to inform future trainings on the tool
- The conclusion that trial design decisions may be relatively stable needs further examination
- PRECIS-2 provides a valuable framework for discussion of the critical elements underlying design decisions, including helping investigators communicate more effectively with each other and with their trial stakeholders

Poster Session 3 - #298 Framing the Conversation: Use of PRECIS-2 ratings to advance understanding of pragmatic trial design domains 10:00 a.m. Tuesday, May 9

Thank you!