Dissemination & Implementation Research: Study Designs
David Chambers, DPhil & Wynne Norton, PhD
Division of Cancer Control and Population Sciences, NCI
CPCRN Spring Meeting, May 24, 2016
National Cancer Institute, U.S. Department of Health and Human Services, National Institutes of Health

Outline
Research Question to Study Design
Range of Study Designs
Questions, Comments
Practical Group Exercise

Framing Your Study Design: Key Questions
What is my primary question? Where am I looking to answer it? How could it BEST be answered? How could it FEASIBLY be answered? What do I have control over? What data are currently available? What data do I need to gather?

Key Questions
What is my primary question?
–Can you describe your primary question in 1-3 sentences?
–Evaluation plan (design) and measurement must flow from clear question(s).
–Consider significance: Why is this question important, and how does it fill an important research gap?
Where am I looking to answer the question?
–Who will use the outcomes of your study (identify stakeholders early)?
–How does this drive your selection of setting and population (consider representativeness)?

Key Questions
How could it BEST be answered? How could it FEASIBLY be answered? What do I have control over? What data are currently available? What data do I need to collect?
–Rigorous design that accounts for context, complexity, practical considerations, and external validity
–Data sources: What is needed to answer the primary aim? (informed by your conceptual model!)
–Who will use the outcomes of your study? How does this inform your selection of study design and measurement?

Example #1
What is the impact of a natural experiment to implement an evidence-based intervention (EBI) to improve cancer screening within an HMO's primary care clinics?
What is my primary research question?
–Does the EBI get implemented, how does implementation vary, and what happens as a result (IS + effectiveness)?
Where am I looking to answer it?
–Multiple primary care clinics (external validity?)
How could it OPTIMALLY be answered?
–Randomized comparison group
How could it FEASIBLY be answered?
–Stepped-wedge, non-randomized "matched" comparison sites, other?

Example #1 (cont'd)
What is the impact of a natural experiment to implement an evidence-based intervention (EBI) within an HMO's primary care clinics?
What do I have control over?
–HMO is willing to do a phase-in roll-out
What data are currently available?
–EHR, claims, pharmacy data
What data do I need to gather?
–How the EBI was delivered (implementation strategies/processes); patient outcomes; provider outcomes; organizational processes/outcomes

Example #2
What is the comparative effectiveness of two strategies to disseminate evidence-based guidelines for diet and exercise to schools?
What is my primary question?
–Is one strategy better than the other?
Where am I looking to answer it?
–Schools
How could it OPTIMALLY be answered?
–Matched-pair cluster randomized trial (matching on personnel, student diversity, size, SES)
How could it FEASIBLY be answered?
–Same

Example #2 (cont'd)
What is the comparative effectiveness of two strategies to disseminate evidence-based guidelines for diet and exercise to schools?
What do I have control over?
–Dissemination strategy, timeframe, data collection
What data are currently available?
–Unsure: explore availability of curriculum outlines (health, science, or PE class), cafeteria purchasing data (student or school level), student fitness measures from PE
What data do I need to gather?
–Teacher behavior, student outcomes, organizational variables, other?
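The optimal design for Example #2 is a matched-pair cluster randomized trial. The slides do not include code, but as a back-of-envelope illustration of why clustering matters for sample size, here is a minimal Python sketch of the standard design-effect formula DEFF = 1 + (m - 1) x ICC; all numbers are hypothetical placeholders, not figures from the talk:

```python
import math

def design_effect(cluster_size: float, icc: float) -> float:
    """Variance inflation from randomizing schools instead of students."""
    return 1 + (cluster_size - 1) * icc

def clusters_needed(n_individual: float, cluster_size: float, icc: float) -> int:
    """Schools per arm, given the individually randomized sample size."""
    n_clustered = n_individual * design_effect(cluster_size, icc)
    return math.ceil(n_clustered / cluster_size)

# Hypothetical numbers: 200 students per arm would suffice under individual
# randomization; schools contribute ~50 students each; ICC = 0.05.
print(design_effect(50, 0.05))         # 3.45
print(clusters_needed(200, 50, 0.05))  # 14 schools per arm
```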

D&I Characteristics and Implications for Study Design
Glasgow, R. E. & Chambers, D. (2012). Clin Transl Sci, 5.

Range of Study Designs in IS
Observational: Neither manipulation nor random assignment
–Cohort, cross-sectional
Experimental: Randomization and manipulation
–Randomized controlled trials (RCTs), pragmatic RCTs (pRCTs), cluster RCTs, stepped-wedge cluster RCTs
Quasi-Experimental: Manipulation but no randomization
–Interrupted time series (ITS), regression discontinuity design, non-equivalent control group design
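Regression discontinuity, named above, gets less airtime than the other designs, so a minimal simulated sketch may help: treatment is assigned by a cutoff on a running variable, and the effect is estimated as the jump in outcomes at that cutoff. The data, cutoff, and bandwidth below are invented for illustration, not drawn from the talk:

```python
import numpy as np

rng = np.random.default_rng(2)
score = rng.uniform(-1, 1, 500)        # running variable, cutoff at 0
treated = (score >= 0).astype(float)   # e.g., higher-risk clinics get the EBI
outcome = 10 + 2 * score + 3 * treated + rng.normal(0, 1, 500)

# Local linear fits on each side of the cutoff, within a bandwidth h;
# the effect estimate is the gap between the two fits at the cutoff.
h = 0.5
left = (score > -h) & (score < 0)
right = (score >= 0) & (score < h)
effect = (np.polyval(np.polyfit(score[right], outcome[right], 1), 0)
          - np.polyval(np.polyfit(score[left], outcome[left], 1), 0))
print(round(effect, 2))  # close to the simulated jump of 3
```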

Range of Study Designs in IS
Effectiveness-Implementation Hybrid Designs: Dual a priori focus on assessing effectiveness and implementation
–Type 1, Type 2, Type 3
Mixed Methods: Collection and integration of qualitative and quantitative data
–Embedded, explanatory, exploratory
Simulation/Modeling
–System dynamics, network analysis, agent-based modeling
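The slide only names the simulation approaches; as a toy illustration of what agent-based modeling can mean in an IS context, the sketch below lets EBI adoption spread among clinics through random peer contact. Every parameter is hypothetical and chosen only to show the mechanics:

```python
import random

def simulate_adoption(n_clinics=50, n_contacts=3, p_adopt=0.2,
                      n_seeds=2, n_rounds=10, seed=1):
    """Cumulative adopter counts per round under random peer mixing."""
    random.seed(seed)
    adopted = set(random.sample(range(n_clinics), n_seeds))
    history = [len(adopted)]
    for _ in range(n_rounds):
        new = set()
        for clinic in set(range(n_clinics)) - adopted:
            # Each non-adopter contacts a few random peers per round; each
            # adopting peer independently persuades with probability p_adopt.
            peers = random.sample(range(n_clinics), n_contacts)
            if any(p in adopted and random.random() < p_adopt for p in peers):
                new.add(clinic)
        adopted |= new
        history.append(len(adopted))
    return history

print(simulate_adoption())  # typically an S-shaped diffusion curve
```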

Experimental: pRCTs
Loudon et al. (2015). The PRECIS-2 tool: Designing trials that are fit for purpose. BMJ.
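PRECIS-2 rates nine trial-design domains from 1 (very explanatory) to 5 (very pragmatic). As a small sketch of how a team might record and summarize its self-ratings, the scores below are invented for illustration; only the domain names come from Loudon et al. (2015):

```python
# Hypothetical PRECIS-2 self-ratings for a planned trial
# (1 = very explanatory, 5 = very pragmatic).
precis2 = {
    "eligibility": 4, "recruitment": 5, "setting": 4,
    "organisation": 3, "flexibility (delivery)": 4,
    "flexibility (adherence)": 5, "follow-up": 4,
    "primary outcome": 5, "primary analysis": 4,
}
print(f"mean pragmatism: {sum(precis2.values()) / len(precis2):.1f}")  # 4.2
```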

Experimental: Stepped-Wedge Cluster RCT
Brown & Lilford (2006). The stepped wedge trial design: A systematic review. BMC Med Res Methodol.
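As a minimal sketch of the design's logic, the code below builds a cluster-by-period exposure matrix in which every cluster starts in control and crosses over to the intervention in a randomized wave; the cluster and step counts are hypothetical:

```python
import random

def stepped_wedge(n_clusters: int, n_steps: int):
    """Return a cluster-by-period 0/1 exposure matrix for a stepped wedge.

    Clusters are randomized to one of n_steps crossover waves; every
    cluster starts in control (0) and ends in intervention (1).
    """
    waves = [c % n_steps for c in range(n_clusters)]  # balanced waves
    random.shuffle(waves)             # randomize who crosses over when
    n_periods = n_steps + 1           # baseline period plus one per step
    return [[1 if period > waves[c] else 0 for period in range(n_periods)]
            for c in range(n_clusters)]

for row in stepped_wedge(n_clusters=6, n_steps=3):
    print(row)  # e.g. [0, 1, 1, 1] crossed over at step 1;
                #      [0, 0, 0, 1] crossed over at step 3
```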

Quasi-Experimental Designs: ITS
Flodgren & Odgaard-Jensen for Effective Practice and Organisation of Care (2013). Interrupted time series analyses. Cochrane.
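A standard way to analyze an ITS is segmented regression: model the baseline level and trend, then estimate the level and slope changes at the interruption. A minimal sketch on simulated data (not data from the talk), assuming statsmodels is available:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n_periods, interruption = 24, 12          # EBI rolled out at month 12
t = np.arange(n_periods)
post = (t >= interruption).astype(float)  # level-change indicator
t_post = post * (t - interruption)        # slope-change term

# Simulated monthly screening rates: baseline trend plus a post-EBI jump.
y = 40 + 0.3 * t + 5 * post + 0.5 * t_post + rng.normal(0, 1, n_periods)

X = sm.add_constant(np.column_stack([t, post, t_post]))
fit = sm.OLS(y, X).fit()
print(fit.params)  # [intercept, baseline slope, level change, slope change]
```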

Effectiveness-Implementation Hybrid Designs
Curran et al. (2012). Effectiveness-implementation hybrid designs. Med Care.

Hybrid Designs: 1, 2, 3
Curran et al. (2012). Effectiveness-implementation hybrid designs: Combining elements of clinical effectiveness and implementation research to enhance public health impact. Med Care.

Mixed Methods Designs: Embedded
–Collect qualitative and quantitative data to obtain a broader and more comprehensive understanding of context
–Conduct one study within the other type of design
–Useful for understanding context and processes
–Qualitative + Quantitative
–Concurrent, embed, unequal

Mixed Methods Designs: Explanatory
–Qualitative data helps explain or build on initial quantitative results
–Use qualitative data to explain atypical or confusing quantitative results
–Use (quantitative) participant characteristics to guide purposeful sampling for qualitative interviews
–Quantitative → Qualitative
–Sequential, connect, unequal

Mixed Methods Designs: Exploratory
–Quantitative data helps explain or build on initial qualitative results
–Exploration is needed due to a lack of available data, limited understanding of context, and/or few available instruments
–Qualitative → Quantitative
–Sequential, connect, unequal

Range of D&I Questions
Basic: How do specific stakeholders interpret information about implementation?
Applied: How does intervention X best get implemented in setting Y?
Measurement: How do I validly measure implementation outcomes (or processes/strategies or contexts)?
Design: How do I account for variation at multiple levels?

Take Home Points
What is the best design? It depends on your research question(s)!
Each design has strengths and weaknesses.
Valid measures exist, but not for all constructs.
Funded studies vary in design.
Maximize the rigor, relevance, and feasibility of your study design.

Practical Group Exercise

Instructions
Each table has 4 decks of cards. Pick one of each (randomly):
–Intervention (ITV)
–Context
–Key Question
If applicable, pick an implementation strategy (your choice).

Instructions (cont'd)
Think about a study design that fits the Intervention (ITV), Context, and Key Question. Discuss potential study designs and the pros and cons of each viable design. Which study design is best suited to the intervention, context, and key question? Why? After 15 minutes, discuss as a group.

Quick Reports of Group Exercise
What design is best suited to answer your research question? Why? What factors influenced your decision-making process?

The Other Shoe
Take a card from the Wild Card deck… What changes would you make? (10 minutes)

Quick Reports of Group Exercise (Part 2)
What did you have to account for? Problems? Solutions?

Questions? Comments? Thank you!

Contact Information
David A. Chambers, DPhil, Deputy Director, Implementation Science, DCCPS, NCI
Wynne E. Norton, PhD, Program Officer, Implementation Science, DCCPS, NCI