
1 National Cancer Institute U.S. DEPARTMENT OF HEALTH AND HUMAN SERVICES National Institutes of Health Dissemination & Implementation Research: Study Designs David Chambers, DPhil & Wynne Norton, PhD Division of Cancer Control and Population Sciences, NCI CPCRN Spring Meeting May 24, 2016

2 Outline Research Question to Study Design; Range of Study Designs; Questions, Comments; Practical Group Exercise

3 Framing Your Study Design: Key Questions What is my primary question? Where am I looking to answer it? How could it BEST be answered? How could it FEASIBLY be answered? What do I have control over? What data are currently available? What data do I need to gather?

4 Key Questions What is my primary question? –Can you describe your primary question in 1-3 sentences? Evaluation plan (design) and measurement must flow from clear question(s) Consider significance: Why is this question important and how does it fill an important research gap? Where am I looking to answer the question? –Who will use the outcomes of your study (identify stakeholders early)? –How does this drive your selection of setting and population (consider representativeness)?

5 Key Questions How could it BEST be answered? How could it FEASIBLY be answered? What do I have control over? What data are currently available? What data do I need to collect? –Rigorous design that accounts for context, complexity, practical considerations, and external validity –Data sources: What is needed to answer primary aim? (informed by your conceptual model!) –Who will use the outcomes of your study? How does this inform your selection of study design and measurement?

6 Example #1 What is the impact of a natural experiment to implement an evidence-based intervention (EBI) to improve cancer screening within an HMO’s primary care clinics? What is my primary research question? –Does the EBI get implemented, how does implementation vary, and what happens as a result (IS + Effectiveness)? Where am I looking to answer it? –Multiple primary care clinics (external validity?) How could it OPTIMALLY be answered? –Randomized Comparison Group How could it FEASIBLY be answered? –Stepped-wedge, non-randomized “matched” comparison sites, other?

7 Example #1 (cont’d) What is the impact of a natural experiment to implement an evidence-based intervention (EBI) within an HMO’s primary care clinics? What do I have control over? –HMO is willing to do a phase-in roll-out What data are currently available? –EHR, claims, pharmacy data What data do I need to gather? –How the EBI was delivered (implementation strategies/processes); patient outcomes; provider outcomes; organizational processes/outcomes

8 Example #2 What is the comparative effectiveness of two strategies to disseminate evidence-based guidelines for diet and exercise to schools? What is my primary question? –Is one strategy better than the other? Where am I looking to answer it? –Schools How could it OPTIMALLY be answered? –Matched-pair cluster randomized (personnel, student diversity, size, SES) How could it FEASIBLY be answered? –Same

9 Example #2 (cont’d) What is the comparative effectiveness of two strategies to disseminate evidence-based guidelines for diet and exercise to schools? What do I have control over? –Dissemination strategy, timeframe, data collection What data are currently available? –Unsure: Explore availability of curriculum outlines (health, science, or PE class), cafeteria purchasing data (student or school level), student fitness measures from PE What data do I need to gather? –Teacher behavior, student outcomes, organizational variables, other?
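The matched-pair cluster randomization in Example #2 can be sketched in a few lines of Python. The school names, enrollment figures, and arm labels below are hypothetical, and a real trial would match on several variables (personnel, student diversity, size, SES) rather than a single one:

```python
import random

def matched_pair_randomize(clusters, key, seed=0):
    """Sort clusters on a matching variable, pair adjacent clusters,
    and randomly assign one member of each pair to each arm."""
    rng = random.Random(seed)
    ordered = sorted(clusters, key=key)
    assignment = {}
    for i in range(0, len(ordered) - 1, 2):
        a, b = ordered[i], ordered[i + 1]
        # Coin flip decides which member of the pair gets which strategy
        first, second = (a, b) if rng.random() < 0.5 else (b, a)
        assignment[first["name"]] = "strategy_A"
        assignment[second["name"]] = "strategy_B"
    return assignment

# Hypothetical schools matched on enrollment size
schools = [
    {"name": "Lincoln", "enrollment": 320},
    {"name": "Washington", "enrollment": 510},
    {"name": "Roosevelt", "enrollment": 300},
    {"name": "Jefferson", "enrollment": 495},
]
arms = matched_pair_randomize(schools, key=lambda s: s["enrollment"])
```

Because randomization happens within each matched pair, the two dissemination strategies are balanced on the matching variable by construction.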

10 D&I Characteristics and Implications for Study Design Glasgow, R. E. & Chambers, D. Clin Transl Sci. 2012;5:48-55.

12 Range of Study Designs in IS Observational: Neither manipulation nor random assignment –Cohort, Cross-sectional Experimental: Randomization and manipulation –Randomized controlled trials (RCTs), pragmatic RCTs (pRCTs), cluster RCTs, stepped-wedge cluster RCTs Quasi-Experimental: Manipulation but no randomization –Interrupted Time Series (ITS), Regression Discontinuity Design, Non-Equivalent Control Group Design

13 Range of Study Designs in IS Effectiveness-Implementation Hybrid Designs: Dual focus a priori in assessing effectiveness and implementation –Type 1, Type 2, Type 3 Mixed Methods: Collection and integration of qualitative and quantitative data –Embedded, explanatory, exploratory Simulation/Modeling –System dynamics, network analysis, agent-based modeling
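The simulation/modeling bullet above can be illustrated with a toy agent-based model. The clinics, the one-random-peer contact rule, and the adoption probability below are invented for illustration, not drawn from the presentation:

```python
import random

def simulate_adoption(n_clinics, n_steps, p_adopt=0.3, seed=1):
    """Toy agent-based model of EBI spread: each non-adopting clinic
    contacts one random peer per step and adopts with probability
    p_adopt if that peer has already adopted."""
    rng = random.Random(seed)
    adopted = [False] * n_clinics
    adopted[0] = True                       # seed one early-adopter clinic
    counts = [1]                            # adopters after each step
    for _ in range(n_steps):
        for i in range(n_clinics):
            if not adopted[i]:
                peer = rng.randrange(n_clinics)
                if adopted[peer] and rng.random() < p_adopt:
                    adopted[i] = True
        counts.append(sum(adopted))
    return counts

trajectory = simulate_adoption(n_clinics=20, n_steps=30)
```

Running the model under different contact structures or adoption probabilities is the kind of "what if" question simulation designs answer before (or instead of) a costly field trial.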

14 Experimental: pRCTs Loudon et al. (2015). The PRECIS-2 tool: Designing trials that are fit for purpose. BMJ.

15 Experimental: Stepped-Wedge Cluster RCT Brown & Lilford (2006). The stepped wedge trial design: A systematic review. BMC Med Res Methodol.
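A stepped-wedge cluster RCT of the kind Brown & Lilford review can be sketched as a cluster-by-period exposure matrix. This minimal version assumes one cluster crosses over per period, which is only one of several possible rollout schedules:

```python
import random

def stepped_wedge_schedule(n_clusters, seed=0):
    """Build a stepped-wedge design matrix: every cluster starts in
    control (0); one randomly ordered cluster crosses to intervention
    (1) at each period, so with C clusters there are C+1 periods and
    all clusters are exposed by the end."""
    rng = random.Random(seed)
    order = list(range(n_clusters))
    rng.shuffle(order)                      # randomize the crossover order
    n_periods = n_clusters + 1
    matrix = {}
    for step, cluster in enumerate(order):
        crossover = step + 1                # period at which this cluster switches
        matrix[cluster] = [1 if t >= crossover else 0 for t in range(n_periods)]
    return matrix

schedule = stepped_wedge_schedule(4)
```

The phase-in roll-out the HMO in Example #1 is willing to do maps directly onto a matrix like this: every site eventually receives the EBI, and the staggered start times provide within- and between-cluster comparisons.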

16 Quasi-Experimental Designs: ITS Flodgren & Odgaard-Jensen for Effective Practice and Organization of Care (2013). Interrupted time series analyses. Cochrane.
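The standard ITS analysis is a segmented regression with terms for the baseline trend, the immediate level change at the interruption, and the post-interruption slope change. The sketch below assumes NumPy is available and uses noiseless synthetic screening rates, so it ignores the autocorrelation adjustments a real ITS analysis would need:

```python
import numpy as np

def segmented_regression(y, change_point):
    """Fit the segmented-regression ITS model
        y_t = b0 + b1*t + b2*post_t + b3*(t - change_point)*post_t
    where b2 is the immediate level change and b3 the slope change."""
    t = np.arange(len(y))
    post = (t >= change_point).astype(float)
    X = np.column_stack([
        np.ones_like(t, dtype=float),       # intercept
        t,                                  # baseline slope
        post,                               # level change at interruption
        (t - change_point) * post,          # slope change after interruption
    ])
    beta, *_ = np.linalg.lstsq(X, np.asarray(y, dtype=float), rcond=None)
    return beta

# Synthetic monthly screening rates: baseline trend, then a 5-point jump
# and an extra 0.8/month slope after the EBI starts at month 12.
t = np.arange(24)
y = 40 + 0.5 * t + np.where(t >= 12, 5 + 0.8 * (t - 12), 0)
b = segmented_regression(y, change_point=12)
```

On this noiseless series the fit recovers the level change (5) and slope change (0.8) exactly, which is the core logic the Cochrane EPOC guidance builds on.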

17 Effectiveness-Implementation Hybrid Designs Curran et al. (2012). Effectiveness-implementation hybrid designs. Med Care.

18 Hybrid Designs: 1, 2, 3 Curran et al. (2012). Effectiveness-implementation hybrid designs: Combining elements of clinical effectiveness and implementation research to enhance public health impact. Med Care.

19 Mixed Methods Designs: Embedded –Collect qualitative and quantitative data to obtain a broader, more comprehensive understanding of context –Conduct one study within the other type of design –Useful for understanding context and processes –Qualitative + Quantitative –Concurrent, embed, unequal

20 Mixed Methods Designs: Explanatory –Qualitative data helps explain or build on initial quantitative results –Use qualitative data to explain atypical or confusing quantitative results –Use (quantitative) participant characteristics to guide purposeful sampling for qualitative interviews –Quantitative → Qualitative –Sequential, connect, unequal

21 Mixed Methods Designs: Exploratory –Quantitative data helps explain or build on initial qualitative results –Exploration is needed due to lack of available data, limited understanding of context, and/or few available instruments –Qualitative → Quantitative –Sequential, connect, unequal

22 Range of D&I Questions Basic: How do specific stakeholders interpret information about implementation? Applied: How does intervention X best get implemented in setting Y? Measurement: How do I validly measure implementation outcomes (or processes/strategies or contexts)? Design: How do I account for variation at multiple levels?

23 Take Home Points What is the best design? It depends on your research question(s)! Each design has strengths and weaknesses. Valid measures exist, but not for all constructs. Funded studies vary in design. Maximize the rigor, relevance, and feasibility of your study design.

24 Practical Group Exercise

25 Instructions Each table has 4 decks of cards. Pick one of each (randomly): Intervention (ITV), Context, Key Question. If applicable, pick an implementation strategy (your choice).

26 Instructions (cont’d) Think about a study design that fits the Intervention (ITV), Context, and Key Question Discuss potential study designs and pros/cons of each viable study design Which study design is best suited based on the intervention, context and key question? Why? After 15 minutes, discuss as a group

27 Quick Reports of Group Exercise What design is best suited to answer your research question? Why? What factors influenced your decision-making process?

28 The Other Shoe Take a Card from the Wild Card Deck… What Changes Would You Make? (10 Minutes)

29 Quick Reports of Group Exercise (Part 2) What did you have to account for? Problems? Solutions?

30 Questions? Comments? Thank you!

31 Contact Information David A. Chambers, D.Phil Deputy Director, Implementation Science DCCPS, NCI dchamber@mail.nih.gov 240-276-5090 Wynne E. Norton, PhD Program Officer, Implementation Science DCCPS, NCI wynne.norton@nih.gov 240-276-6875 http://cancercontrol.cancer.gov/IS/

