Russell E. Glasgow, PhD (and Bridget Gaglio, PhD, PCORI)


Evaluation for D&I: Models, Pragmatism, Relevance, RE-AIM, Resources, and Reporting
Russell E. Glasgow, PhD (and Bridget Gaglio, PhD, PCORI)
Director, Dissemination and Implementation Science Program, www.ucdenver.edu/accords/implementation
Co-lead, Evaluation Hub, Department of Family Medicine, www.evaluationhub.org
University of Colorado School of Medicine

Conflicts of Interest
No conflicts of interest to disclose. Some ideas presented come from grants funded by the National Institutes of Health (NIH), the Agency for Healthcare Research and Quality (AHRQ), and the Robert Wood Johnson Foundation (RWJF).

Overview
- Evaluation: formative, iterative, and summative
- Pragmatic research and the PRECIS planning model (exercise)
- Selecting an evaluation model: tips and options
- RE-AIM
- Concrete example (exercise)
- Resources required
- Reporting and replication (StaRI and the ultimate use question)
- Discussion: Q and A

Evaluation
General: Evaluation is the systematic collection of information about the activities, characteristics, and results of programs in order to make judgments about the program, improve or further develop program effectiveness, inform decisions about future programming, and/or increase understanding.
For D&I: This includes stakeholder-based assessment of multi-level and contextual influences on processes and on the outcomes related to producing broad, population-based impact, and understanding of how, why, and under what conditions these results were obtained.

Definition and Opportunities
- Formative: more than focus groups; design for dissemination
- Iterative: merge/interface with QI; little done; a new area for D&I
- Summative: many models to guide evaluation
It is never too early to begin evaluation, designing for dissemination, and considering "ultimate use."

Ultimate Use Question (the take-away from this talk)
"What program/policy components are most effective for producing what outcomes, for which populations/recipients, when implemented by what type of persons, under what conditions, with how many resources, and how/why do these results come about?"
It is NOT possible to address all of these issues in any one study, BUT each of them should be considered pragmatically and transparently; then select and report those most relevant.

A Different Approach: Pragmatic Research
External validity/pragmatic criteria (often ignored):
- Participant representativeness
- Setting representativeness
- Multi-level context and setting
- Community/setting engagement
- Adaptation/change in intervention and implementation strategies
- Sustainability
- Costs/feasibility of treatment
- Comparison conditions

The 5 Rs to Enhance Pragmatism and Likelihood of Translation
Research that is:
- Relevant
- Rapid and recursive
- Redefines rigor
- Reports resources required
- Replicable
Peek CJ, et al. The 5 R's: An emerging bold standard for conducting relevant research in a changing world. Ann Fam Med 2014;12(5):447-455. doi:10.1370/afm.1688.
deGruy FV, et al. A plan for useful and timely family medicine and primary care research. Fam Med 2015;47(8):636-642.

“The significant problems we face cannot be solved by the same level of thinking that created them.” A. Einstein

Enhancing Pragmatic Research
"If we want more evidence-based practice, we need more practice-based evidence."
Green LW. Public health asks of systems science: to advance our evidence-based practice, can you help us get more practice-based evidence? Am J Public Health 2006;96(3):406-409.

External Validity/Pragmatic Criteria (often ignored)
- Participant representativeness
- Setting representativeness
- Multi-level context and setting
- Community/setting engagement
- Adaptation/change
- Sustainability
- Costs, feasibility of program or policy
- Comparison conditions
In our own work and a reference provided in our materials, we have found that the CONSORT PRECIS criteria are helpful but not completely comprehensive. We also felt the dimensions above were important, and found that these dimensions, like the PRECIS ones, could be reliably rated after brief training.

Key Differences Between Traditional Randomized Controlled Trials (RCTs) and Pragmatic Controlled Trials (PCTs)
A traditional RCT tests a hypothesis under ideal conditions; a PCT compares treatments under everyday clinical conditions.
GOALS         | RCT: to determine causes and effects of treatment | PCT: to improve practice and inform clinical and policy decisions
DESIGN        | RCT: tests the intervention against placebo using rigid study protocols and minimal variation | PCT: tests two or more real-world treatments using flexible protocols and local customization
PARTICIPANTS  | RCT: highly defined and carefully selected | PCT: more representative, because eligibility criteria are less strict
MEASURES      | RCT: require data collection outside routine clinical care | PCT: brief and designed so data can be easily collected in clinical settings
RESULTS       | RCT: rarely relevant to everyday practice | PCT: useful in everyday practice, especially clinical decision making

The Pragmatic-Explanatory Continuum Indicator Summary (PRECIS-2) Planning Tool
How pragmatic (vs. explanatory, or efficacy-oriented) is your study? A tool and graphic to help in planning and reporting (see next slide).
HANDOUT AND HOMEWORK: Complete the next figure.
Loudon K, Treweek S, Sullivan F, Donnan P, Thorpe KE, Zwarenstein M. The PRECIS-2 tool: Designing trials that are fit for purpose. BMJ 2015;350:h2147.
Gaglio B, et al. How pragmatic is it? Lessons learned using PRECIS and RE-AIM for determining pragmatic characteristics of research. Implement Sci 2014;9:1.
Thorpe KE, et al. A pragmatic-explanatory continuum indicator summary (PRECIS). CMAJ 2009;180(10):E47-E57.

https://www.precis-2.org/

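To make the rating exercise concrete, here is a minimal sketch (in Python; not the official PRECIS-2 tool) of scoring a trial on the nine PRECIS-2 domains from Loudon et al., each rated from 1 (very explanatory) to 5 (very pragmatic). The example ratings are hypothetical.

```python
# A minimal sketch of PRECIS-2-style scoring. Domain names follow
# Loudon et al. (2015); the example ratings below are hypothetical.

PRECIS2_DOMAINS = [
    "eligibility", "recruitment", "setting", "organization",
    "flexibility (delivery)", "flexibility (adherence)",
    "follow-up", "primary outcome", "primary analysis",
]

def summarize_precis2(ratings: dict) -> None:
    """Print each domain rating (1-5) and a simple overall mean."""
    for domain in PRECIS2_DOMAINS:
        score = ratings[domain]
        assert 1 <= score <= 5, f"{domain}: ratings must be 1-5"
        print(f"{domain:26s} {'*' * score:5s} ({score})")
    mean = sum(ratings.values()) / len(ratings)
    print(f"\nMean pragmatism score: {mean:.1f} / 5")

# Hypothetical ratings for a mostly pragmatic trial:
summarize_precis2({
    "eligibility": 4, "recruitment": 4, "setting": 5, "organization": 3,
    "flexibility (delivery)": 4, "flexibility (adherence)": 5,
    "follow-up": 4, "primary outcome": 5, "primary analysis": 4,
})
```

In practice the nine scores are displayed on the PRECIS-2 wheel rather than averaged; the mean here is only a rough summary.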

Achieving the 5 Rs: RE-AIM Framework www.re-aim.org

Other Models: At Least 90
- 91 frameworks: http://dissemination-implementation.org/index.aspx
- Most common at NIH: RE-AIM and DOI
- Many commonalities across models and theories

Achieving the 5 Rs: RE-AIM Framework (www.re-aim.org)
Focus on enhancing:
- Reach: participation rates and representativeness
- Effectiveness: breadth (quality of life), including negative or unintended effects
- Adoption: setting and staff participation rate and representativeness
- Implementation: consistency, adaptation, and costs of the program
- Maintenance: extent to which the program and its effects are sustained
Gaglio B, et al. The RE-AIM framework: A systematic review of use over time. Am J Public Health 2013;103(6):e38-e46.
Kessler RS, et al. What does it mean to "employ" the RE-AIM model? Eval Health Prof 2012;36(1):44-66.

Why Is This Important? Impact of Loss at Each RE-AIM Step
Example of translation of interventions into practice:
Dissemination Step                        | RE-AIM Concept   | % Impact
50% of settings use intervention          | Adoption         | 50.0%
50% of staff take part                    | Adoption (staff) | 25.0%
50% of patients identified, accept        | Reach            | 12.5%
50% follow regimen correctly              | Implementation   | 6.2%
50% benefit from the intervention         | Effectiveness    | 3.2%
50% continue to benefit after six months  | Maintenance      | 1.6%
Source: re-aim.org
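The arithmetic behind the table is simply multiplicative: each 50% loss halves the remaining population impact. A minimal Python sketch (illustrative only) reproduces the cascade:

```python
# Compounding loss across RE-AIM steps: impact at each step is the
# product of all retention fractions so far. With 50% retained at each
# of six steps, population impact falls to 0.5**6 = ~1.6%.

steps = [
    ("50% of settings use intervention",         "Adoption (settings)"),
    ("50% of staff take part",                   "Adoption (staff)"),
    ("50% of patients identified, accept",       "Reach"),
    ("50% follow regimen correctly",             "Implementation"),
    ("50% benefit from the intervention",        "Effectiveness"),
    ("50% continue to benefit after six months", "Maintenance"),
]

impact = 1.0
for description, concept in steps:
    impact *= 0.50  # each step retains half of the previous step
    print(f"{description:42s} {concept:20s} {impact:6.1%}")
# Exact values for the middle steps are 6.25% and 3.125%;
# the slide rounds them to 6.2% and 3.2%.
```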

What is the moral of this story?

EXAMPLE: My Own Health Report (MOHR) Study
A cluster-randomized pragmatic trial of a web-based brief health behavior and mental health assessment and intervention in 9 diverse pairs of primary care practices, testing whether they could implement My Own Health Report (MOHR). Outcomes included:
- Reach of the MOHR program across patients
- Whether practices would adopt MOHR
- How practices would implement MOHR
- Effectiveness of the MOHR program


Overall Reach: 1768 of 3591 patients (49.2%)
Mailed (patient complete):
- Site 1: 100 of 344 (29.1%)
- Site 3: 11 of 420 (2.6%)
- Site 4: 138 of 444 (31.3%)
- Site 5: 115 of 248 (46.4%)
Phone (nurse complete):
- Site 3: 291 of 453 (64.2%)
Lobby (patient + MD complete):
- Site 2: 192 of 437 (43.9%)
Lobby (MA or coordinator):
- Site 6: 265 of 287 (92.3%)
- Site 7: 211 of 306 (69.0%)
- Site 8: 247 of 323 (76.5%)
- Site 9: 198 of 329 (60.2%)
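The reach figures above are simple proportions: completers divided by eligible patients, per site and overall. A minimal Python sketch using the counts from this slide (the list structure and names are just illustrative):

```python
# Reach = patients who completed MOHR / eligible patients.
# Counts are taken from the slide above.

rows = [  # (site, administration mode, completed, eligible)
    ("Site 1", "Mailed (patient complete)",     100, 344),
    ("Site 3", "Mailed (patient complete)",      11, 420),
    ("Site 4", "Mailed (patient complete)",     138, 444),
    ("Site 5", "Mailed (patient complete)",     115, 248),
    ("Site 3", "Phone (nurse complete)",        291, 453),
    ("Site 2", "Lobby (patient + MD complete)", 192, 437),
    ("Site 6", "Lobby (MA or coordinator)",     265, 287),
    ("Site 7", "Lobby (MA or coordinator)",     211, 306),
    ("Site 8", "Lobby (MA or coordinator)",     247, 323),
    ("Site 9", "Lobby (MA or coordinator)",     198, 329),
]

for site, mode, completed, eligible in rows:
    print(f"{site} [{mode}]: {completed}/{eligible} = {completed / eligible:.1%}")

total_completed = sum(r[2] for r in rows)
total_eligible = sum(r[3] for r in rows)
print(f"Overall reach: {total_completed}/{total_eligible} "
      f"= {total_completed / total_eligible:.1%}")  # 1768/3591 = 49.2%
```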

Weekly MOHR Reach

Adoption
- 18 practices agreed to adopt MOHR out of 30 practices approached (adoption 60%)
- At 7 of 9 sites, the first practices approached were recruited
- Decliners were participating in other studies, or were worried about workload or about doing HRAs
- Participating practices represented a diverse spectrum of primary care (see Table 1)

Implementation
- Practices used one or more of four main implementation strategies: web at home (n=3), called patients (n=1), completed in office on paper (n=1), or electronically in office (n=4)
- 4 had patients and 5 had staff complete MOHR with patients
- 8 needed research team or health system help
- 8 had clinicians counsel patients, 4 had some follow-up, 1 had no counseling or follow-up
- Delivering MOHR took ~28 minutes (range 16-31), including assessment and feedback; administering the assessment took about 10 minutes, with most of the time spent counseling on results

Number of Health Risk Factors by Site

Did anyone help you set a goal?

Have you made any positive changes?

Moral of this example?

RE-AIM Precision (Personalized) Medicine Questions
Determine:
- What percent and types of patients are Reached;
- For whom among them is the intervention Effective, in improving what outcomes, with what unanticipated consequences;
- In what percent and types of settings and staff is this approach Adopted;
- How consistently are different parts of it Implemented, at what cost to different parties;
- And how well are the intervention components and their effects Maintained?

Evolution of RE-AIM (All Models Are WRONG)
- Reviews documenting use over time
- Applicability to many different content areas
- Used for both planning and evaluation
- Underreporting of key components: setting-level factors (e.g., adoption) are reported much less often, and maintenance (sustainability) is reported least often
NEW AREAS:
- Health policy
- Multilevel community interventions
- Built environment
- Understanding adaptations to programs
- Mixed methods
- Focus on context!
Gaglio B, et al. The RE-AIM framework: A systematic review of use over time. Am J Public Health 2013;103(6):e38-e46.

HANDOUT AND HOMEWORK: Key Translation and Pragmatic Questions to Consider for the RE-AIM Dimensions
Key pragmatic questions to consider and answer:
- Reach: WHO is (was) intended to benefit, and who actually participates or is exposed to the intervention?
- Effectiveness: WHAT are (were) the most important benefits you are trying to achieve, and what is (was) the likelihood of negative outcomes?
- Adoption: WHERE is (was) the program or policy applied, and WHO applied it?
- Implementation: HOW consistently is (was) the program or policy delivered, HOW will it be (was it) adapted, HOW much will (did) it cost, and HOW/WHY will (did) the results come about?
- Maintenance: WHEN will (did) the initiative become operational; how long will (was) it be sustained (setting level); and how long are the results sustained (individual level)?

Reporting Resources Required
Reporting on cost and other resources in a standardized manner is useful in demonstrating value and in promoting rigor, transparency, and relevance to stakeholders.
- Present from the perspective of stakeholders and decision makers
- Simple is fine; sophisticated economic analyses are not needed
- Report the costs of conducting or replicating interventions
- Beyond money, costs can include clinician and staff time, training, infrastructure, startup costs, and opportunity costs
Ritzwoller DP, et al. Costing behavioral interventions: A practical guide to enhance translation. Ann Behav Med 2009;37(2):218-227.
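As a concrete illustration of "simple is fine," here is a minimal Python sketch of the kind of resource-by-resource cost tally this slide describes. Every line item, hourly rate, and hour count below is hypothetical, included only to show the shape of such a report.

```python
# A minimal, stakeholder-facing cost tally (no formal economic analysis).
# All line items, rates, and hours are hypothetical.

time_costs = [  # (resource, hours, hourly_rate_usd)
    ("Clinician counseling time", 120, 95.0),
    ("Medical assistant time",     80, 30.0),
    ("Staff training",             16, 40.0),
]
fixed_costs = [  # (resource, one_time_cost_usd)
    ("Web hosting / infrastructure (startup)", 1500.0),
]

total = 0.0
for resource, hours, rate in time_costs:
    cost = hours * rate
    total += cost
    print(f"{resource:42s} {hours:4.0f} h x ${rate:5.2f}/h = ${cost:9,.2f}")
for resource, cost in fixed_costs:
    total += cost
    print(f"{resource:42s} {'':20s} ${cost:9,.2f}")
print(f"{'Total replication cost':42s} {'':20s} ${total:9,.2f}")
```

Even a rough table like this lets decision makers judge what replicating the intervention would demand of their own clinicians, staff, and budget.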

Standards for Reporting Implementation Studies (StaRI)
27 reporting criteria identified by systematic review and international experts, INCLUDING:
- both implementation strategies and intervention effects
- related to, but an expansion of, the CONSORT criteria
- reporting of context, including SDOH and policy issues
- cost and economic outcomes
- fidelity and adaptations
- harms and unintended effects
- representativeness and estimated scalability
Pinnock H, Barwick M, Carpenter CR, Eldridge S, Grandes G, et al. Standards for reporting implementation studies (StaRI) statement. BMJ 2017;356:i6795.

Ultimate Use Question
"What program/policy components are most effective for producing what outcomes, for which populations/recipients, when implemented by what type of persons, under what conditions, with how many resources, and how/why do these results come about?"
It is NOT possible to address all of these issues in any one study, BUT each of them should be considered pragmatically and transparently; then select and report those most relevant.

Questions? Comments? I'm all ears.

Resources (see handout)
Brownson RC, Colditz GA, Proctor EK. Dissemination and Implementation Research in Health: Translating Science to Practice. Oxford University Press, 2012 (new edition expected November 2017).
www.re-aim.org
www.ucdenver.edu/accords/implementation
www.Dissemination-Implementation.org

Extended CONSORT Diagram
[Figure: expanded CONSORT diagram for D&I research, from re-aim.org: https://www.re-aim.hnfe.vt.edu/resources_and_tools/figures_and_tables/consort.pdf]

Moore GF, et al. Process evaluation of complex interventions: Medical Research Council guidance. BMJ 2015;350:h1258.