H676 Week 5 - Plan for Today


H676 Week 5 - Plan for Today
- Review your project and coding to date
- Complex Data (TEXT chapters 22-26)
  - Multiple independent subgroups
  - Multiple outcomes or time-points
  - More than one comparison or treatment group
- Statistical Power - Jaehun (TEXT chapter 29 and Valentine et al., 2010)
- Complex Interventions (multiple readings)
  - Map aspects of complexity onto sources of evidence

Complex Data (Borenstein et al., 2009)
- Multiple independent subgroups within studies
  - E.g., by age, gender, ethnicity, etc.
- Multiple outcomes or time-points within studies
  - E.g., knowledge, attitudes, behavior, health indicators
  - E.g., immediate, 3 months, 12 months
- More than one comparison or treatment group
  - Multiple interventions, by dosage or type
  - Multiple controls: e.g., treatment as usual, placebo, no treatment
- All require separate rows when coding
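Borenstein et al. (2009) treat the first two cases differently: independent subgroups can be pooled with ordinary inverse-variance weights, while correlated outcomes from the same participants need covariance terms in the variance of their composite. A minimal sketch in Python; the function names are my own, and the within-outcome correlation r must be supplied or assumed:

```python
import math

def combine_independent_subgroups(effects, variances):
    """Fixed-effect weighted mean for independent subgroups,
    which can be treated like separate studies."""
    weights = [1.0 / v for v in variances]
    mean = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    return mean, 1.0 / sum(weights)

def combine_correlated_outcomes(effects, variances, r):
    """Composite (mean) of m correlated outcomes within one study.
    The variance of the mean includes covariance terms r*sqrt(Vi*Vj),
    so it does NOT shrink as fast as it would for independent data."""
    m = len(effects)
    mean = sum(effects) / m
    cov_sum = sum(variances) + sum(
        r * math.sqrt(variances[i] * variances[j])
        for i in range(m) for j in range(m) if i != j
    )
    return mean, cov_sum / (m * m)
```

Note the sanity check built into the formula: with r = 1 the composite variance equals the (common) per-outcome variance, and with r = 0 it behaves like independent data.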

Potential Moderators
- Variation between studies in:
  - Year of study or intervention delivery; published or not
  - Research design
  - Types of intervention or control conditions
  - Sample characteristics (e.g., demographics)
  - Assessed outcomes and measures of each outcome
- Multiple outcomes, measures, or times
- All variations are coded in separate columns
  - E.g., one column to indicate the level of each variable for each row

Review of Coding 1: First columns
- First columns include:
  - Study label
  - Statistics needed to calculate ES
- For intervention studies, use separate blocks of columns for control and intervention groups at both pretest and follow-up
- Within each study, use one line per moderator: e.g., condition, subgroup, outcome, time of measure, etc.
- See next slide

Review of Coding 2: Subsequent columns
- Subsequent columns include:
  - Items that describe the study: authors, year
  - Items that describe the research: research design, conditions, sample sizes by condition
  - Items that describe the sample: age, ethnicity, health status, etc.
  - For intervention studies, items that describe the intervention and its implementation/delivery:
    - Length (hours, days, or months), length and number of sessions
    - Measures of implementation integrity
- Any of these factors (except authors) could be considered as moderators
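Once the statistics columns are coded, each row can be converted to an effect size. A sketch of computing Hedges' g (the small-sample-corrected standardized mean difference) from one coded row, using the standard formulas; the column names (n_t, mean_t, sd_t, ...) are illustrative, not a prescribed coding scheme:

```python
import math

def hedges_g(row):
    """Hedges' g and its variance from one coded row (dict).
    Keys n_t/mean_t/sd_t (treatment) and n_c/mean_c/sd_c (control)
    are illustrative column names."""
    n_t, n_c = row["n_t"], row["n_c"]
    df = n_t + n_c - 2
    # Pooled within-group standard deviation
    s_pooled = math.sqrt(((n_t - 1) * row["sd_t"] ** 2 +
                          (n_c - 1) * row["sd_c"] ** 2) / df)
    d = (row["mean_t"] - row["mean_c"]) / s_pooled
    j = 1 - 3 / (4 * df - 1)                # small-sample correction factor
    var_d = (n_t + n_c) / (n_t * n_c) + d ** 2 / (2 * (n_t + n_c))
    return j * d, j * j * var_d
```

With one such function, every coded row (one per condition, subgroup, outcome, and time point) yields the effect size and variance needed for the analysis columns.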

Your turn
- Up to 5 minutes for each of you
- Tell us what your project is about
- Tell us how far along you are (or are not)
- Tell or show us your coding set-up
- Do you have any problems that you need help with?

Statistical Power (Jaehun)
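The power of the z-test on a fixed-effect summary can be computed directly from the assumed true effect and the within-study variances, following the general approach in Valentine et al. (2010). A minimal sketch; the function name and the two-sided critical value 1.96 (alpha = .05) are my choices:

```python
import math

def normal_cdf(x):
    """Standard normal CDF via the error function (stdlib only)."""
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

def fixed_effect_power(delta, variances, z_crit=1.96):
    """Power of the two-sided z-test on the fixed-effect summary.
    delta: assumed true effect; variances: within-study variances
    of the k effect sizes to be combined."""
    se = math.sqrt(1.0 / sum(1.0 / v for v in variances))
    lam = delta / se                      # noncentrality parameter
    return 1 - normal_cdf(z_crit - lam) + normal_cdf(-z_crit - lam)
```

When delta = 0 the function returns the alpha level (about .05), a useful sanity check; as delta or the number of studies grows, power approaches 1.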

Complex Interventions & Evaluations
Complexity in causes + contexts + interventions + implementation & outcomes → complex interventions & complex research factors

Complex Interventions
- Multiple components
- Multiple groups or organizational levels
- Flexibility or tailoring permitted
- Self-organization, evolution over time

Complex Causal Pathways
- Nonlinear relationships, phase changes
- Multiple mediators
- Multiple moderators
- Feedback loops
- Synergy between components
- Multiple outcomes

Complex research questions (Petticrew et al., 2013)
- Start with a straightforward question
- Then consider multiple outcomes
  - Whether from simple or complex interventions
  - Some of these link in causal chains, suggesting mediation
- Then consider intervention complexity
  - Develop a logic model of how the class of interventions being reviewed might work
  - Code only those intervention components that vary across studies; they become possible moderators
- Then consider aspects of context
  - These also become possible moderators
- At all levels, code only those aspects that vary across studies

Research Questions about Complex Interventions (Squires et al., 2013)
- Research questions should consider PICO(C/S): Participants, Intervention(s), Comparison(s), Outcomes, and Context or Study design
- Complex interventions can contain a mix of effective and ineffective (or even harmful) components
- Interactions might be:
  - Synergistic (effect is greater than the sum of the components), or
  - Dysynergistic (effect is less than the sum of the components)
- Interdependencies might be:
  - Contemporaneous (the effect of one component depends on the presence of another), or
  - Temporal (the effect of one component depends on a prior result)
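The synergistic/dysynergistic distinction reduces to comparing the combined effect against the sum of the component effects. A toy classifier with invented effect values, just to make the definitions concrete:

```python
def classify_interaction(effect_a, effect_b, effect_ab, tol=1e-9):
    """Classify a two-component interaction by comparing the effect of
    the combined intervention (effect_ab) with the sum of the separate
    component effects. Purely illustrative of the definitions."""
    additive = effect_a + effect_b
    if effect_ab > additive + tol:
        return "synergistic"       # combined > sum of parts
    if effect_ab < additive - tol:
        return "dysynergistic"     # combined < sum of parts
    return "additive"
```

In a real review this comparison requires studies (or factorial arms) that estimate the components separately and in combination, which is exactly the kind of coding complexity the slides describe.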

Scope of Review: Lumping vs Splitting
- Splitters minimize the complexity of a review by limiting studies to close replications
- Lumpers include interventions with different combinations of complexity
- A lumped review could end up being "split" into two subgroup analyses as a result of identified heterogeneity
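When a lumped review gets split, the usual fixed-effect check is Q_between: total heterogeneity minus the heterogeneity within each subgroup, referred to a chi-square distribution with (number of groups - 1) degrees of freedom. A minimal sketch (function names are mine):

```python
def fe_mean(effects, variances):
    """Fixed-effect (inverse-variance weighted) mean."""
    w = [1.0 / v for v in variances]
    return sum(wi * e for wi, e in zip(w, effects)) / sum(w)

def q_statistic(effects, variances):
    """Homogeneity statistic Q for one set of effect sizes."""
    mean = fe_mean(effects, variances)
    return sum((e - mean) ** 2 / v for e, v in zip(effects, variances))

def q_between(groups):
    """Q_between = Q_total - sum of within-group Q values.
    groups: list of (effects, variances) pairs, one per subgroup.
    Returns (Q_between, degrees of freedom)."""
    all_e = [e for es, vs in groups for e in es]
    all_v = [v for es, vs in groups for v in vs]
    q_within = sum(q_statistic(es, vs) for es, vs in groups)
    return q_statistic(all_e, all_v) - q_within, len(groups) - 1
```

If the subgroup means are identical, Q_between is zero; a large Q_between relative to its degrees of freedom is the heterogeneity signal that justifies reporting the subgroups separately.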

Specification of Complex Interventions
- Pragmatic features are easier to define than theoretical differences
- Clear definition of prototypical components
  - Those deemed necessary to define the intervention
- Define discretionary components/delivery
  - Might or might not be present in each instance (code it)
  - A scoping review can help identify these variations
- Logic models help reviewers/readers visualize the intervention
  - Can include implementation, process, & economic data

Intervention components that can lead to heterogeneity of results

Methodological Characteristics that contribute to heterogeneity (Pigott & Shepperd, 2013)

Need for linguistic/conceptual precision (Noyes et al., 2013)
Complexity in:
- Intervention
- Implementation
- Context
- Participant responses
- Interactions between any two or more of these