1
Choosing the Appropriate Design to Optimize Behavioral and Social Interventions. C. Hendricks Brown, University of Miami Miller School of Medicine. Director, Prevention Science & Methodology Group (PSMG); Director, Center for Prevention Implementation Methodology (NIDA, OBSSR); Director, Social Systems Informatics, Center for Computational Science
2
Center for Prevention Implementation Methodology (Ce-PIM) on Drug Abuse and Sexual Risk Behavior NIDA, OBSSR
3
Prevention Science and Methodology Group (PSMG). Collaborative Data Synthesis for Adolescent Depression Trials, NIMH R01-MH040859-23. NIH Summer Institute on Social and Behavioral Intervention Research 3
4
What do we mean by "Design to Optimize Behavioral and Social Interventions"? NIH Summer Institute on Social and Behavioral Intervention Research 4
5
What do we mean by Choosing -- who? Design -- randomized or not? Optimize – ways to have greater impact? Behavioral and Social Interventions – what type of interventions? NIH Summer Institute on Social and Behavioral Intervention Research 5
6
DESIGN and design Starting Place for a Design – Community, Research, and Funder Stakeholders NIH Summer Institute on Social and Behavioral Intervention Research 6
7
Schematic of an Intervention Research Design: Determine Research Question(s); Research Team Expertise; Institutional & Community Partnerships; Intervention Conditions; Population; Measures and Follow-up/Assessment Procedures; Sampling, Sample Size, and Enrolling; Assignment to Intervention Condition; Ethics/Human Subjects/Values; Financial/Logistics. NIH Summer Institute on Social and Behavioral Intervention Research 7
8
Design comes through an Integration of Research, Community, and Methodology. Research: Develop Programs, Tech Assistance, Conduct Evaluation. Community: Mandate, Community Resources, Deliver Programs, Quality Improvement. Methodology: Measuring, Modeling, Testing. NIH Summer Institute on Social and Behavioral Intervention Research 8
9
Where are we going in Research? NIH Summer Institute on Social and Behavioral Intervention Research 9
10
Just like windshield wipers, Good Designs help us see….. NIH Summer Institute on Social and Behavioral Intervention Research 10
11
What do we mean by Optimal Intervention? 1. Find the most efficacious intervention based on overall impact: Adaptive Trial Designs; Comparative Effectiveness Trials. 2. Deliver an intervention that addresses the specific needs of a target population: Designs to Test Moderation. 3. Deliver an optimal intervention for each person: Personalized Intervention Trials; Preference Trials. 4. Implement the intervention to have the broadest impact at a population level: Enrollment Trials; Implementation Trials; Roll-Out Dynamic Wait-Listed Trials. NIH Summer Institute on Social and Behavioral Intervention Research 11
12
Find the most efficacious intervention based on overall impact: Adaptive Trial Designs; Comparative Effectiveness Trials. Adaptations can affect (1) specific elements of the study design of an ongoing trial (Adaptive Design), (2) the next trial or stages within a trial (Adaptive Sequencing of Trials), or (3) the intervention or its delivery (Adaptive Intervention). NIH Summer Institute on Social and Behavioral Intervention Research 12
13
Adaptive Design, Sequencing, and Intervention (Brown, Ten Have, Jo et al., 2009, Annu Rev Public Health). Adaptive Design: planned modification of characteristics of the trial itself based on information from the data already accumulated. Optimal Dose Trial (Dragalin & Fedorov, 2004): First Trial: doses 0, 10*, 20; Second Trial: doses 5, 10, 15. Internet- and m-health-based trials. NIH Summer Institute on Social and Behavioral Intervention Research 13
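The two-stage optimal dose idea above (a coarse first-trial grid of 0/10/20, then a refined second-trial grid of 5/10/15 around the winner) can be sketched in a small simulation. Everything here is hypothetical: the quadratic dose-response curve, the per-arm sample size of 50, and the unit-variance noise are illustrative assumptions, not values from Dragalin & Fedorov.

```python
import random

def simulate_arm_mean(dose, n, true_effect, rng):
    # Mean outcome for n participants at one dose; noise ~ N(0, 1) (assumed)
    return sum(true_effect(dose) + rng.gauss(0, 1) for _ in range(n)) / n

def adaptive_dose_trial(rng, n_per_arm=50):
    # Hypothetical dose-response curve peaking near dose 12.5
    true_effect = lambda d: 0.1 * d - 0.004 * d ** 2
    # First trial: coarse grid of doses
    stage1 = {d: simulate_arm_mean(d, n_per_arm, true_effect, rng) for d in (0, 10, 20)}
    best = max(stage1, key=stage1.get)
    # Second trial: refine around the first trial's winner
    grid2 = sorted({max(0, best - 5), best, best + 5})
    stage2 = {d: simulate_arm_mean(d, n_per_arm, true_effect, rng) for d in grid2}
    return max(stage2, key=stage2.get)

print(adaptive_dose_trial(random.Random(0)))
```

With a peak near 12.5, the second trial usually lands on a dose in the 5-15 range, illustrating how the accumulating data steer the design.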
14
Optimal Dose Presentation to Center for Personalized Prevention Research 14
17
When to use Optimal Dosage Trials Potential for enrolling many people (Munoz Internet Interventions for Smoking Cessation) Quick response (intermediate outcome) to interventions Easily manipulate dosage Mobile and Internet behavioral interventions fit these well. NIH Summer Institute on Social and Behavioral Intervention Research 17
18
Comparative Effectiveness Trials Test Two or More Treatments Head to Head. Potential advantage in treatment (vs. prevention) trials, where the requirement is to do something. Example: Earlise Ward (U Wisc Madison), Behavioral Intervention for Depressed African American Adults: testing the standard Coping with Depression Course vs. Oh Happy Day, a culturally adapted version. NIH Summer Institute on Social and Behavioral Intervention Research 18
19
Issues with delivery in group settings: Attempted to recruit 10 women per group. Unethical (and inefficient) to delay treatment too long. Formed a group of 4 women; 2 couldn't work with that schedule. Need a design protocol that is sufficiently flexible to deal with anticipated problems without destroying the trial. NIH Summer Institute on Social and Behavioral Intervention Research 19
20
Comparative Effectiveness: It is possible to compare two interventions that are never directly tested against one another. Trial 1 (A vs Control) estimates A – C; Trial 2 (B vs Control) estimates B – C; their difference estimates A – B. Network Meta-Analysis: Hoaglin, Hawkins, et al. (2011). Report of the ISPOR Task Force on Indirect Comparisons: Good Research Practices Part II. Value in Health 14, 429-437. NIH Summer Institute on Social and Behavioral Intervention Research 20
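The A – C and B – C logic above is the standard (Bucher-style) indirect comparison: because the two trials are independent, the variances of the two direct estimates simply add. A minimal sketch; the effect sizes and standard errors below are made-up numbers, not results from any trial.

```python
import math

def indirect_comparison(effect_ac, se_ac, effect_bc, se_bc):
    """Indirect estimate of A vs B through a common control C.

    effect_ac: A vs C estimate from Trial 1; effect_bc: B vs C from Trial 2.
    The trials are independent, so the variances of the estimates add.
    """
    diff = effect_ac - effect_bc
    se = math.sqrt(se_ac ** 2 + se_bc ** 2)
    return diff, se

# Illustrative (invented) inputs: A lowers the outcome by 0.40, B by 0.25
est, se = indirect_comparison(-0.40, 0.15, -0.25, 0.12)
print(est, se)
```

Note that the indirect A – B estimate is less precise than either direct estimate, which is one reason network meta-analysis pools many such comparisons.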
21
Example: Indirect Comparison of Good Behavior Game (G) and Mastery Learning (M) through Controls (C). Schools 1 through 6 and Schools 7 through 12; conditions: Ctl, G, M. Kellam, Brown et al.; Brown, Wang et al., Drug & Alc Dep 2008. NIH Summer Institute on Social and Behavioral Intervention Research 21
22
Findings NIH Summer Institute on Social and Behavioral Intervention Research 22
23
2. Optimal Designs: Deliver intervention that addresses the specific needs of a target population How does benefit vary by baseline level of risk; should there be different interventions by level of risk? Do impulsive youth respond differently to interventions to prevent drug abuse than do non-impulsive youth? Do minority, depressed women do better on CBT or antidepressants? Should antidepressants be given for those with more moderate levels of baseline depression? NIH Summer Institute on Social and Behavioral Intervention Research 23
24
Example: Moderation of Intervention Effect on Proximal Target by Baseline. Randomize to Brief Teen Intervention vs. Extended Teen Intervention; proximal target: Impulsive Decision Making, leading to Drug Use; moderator: baseline impulsivity. Presentation to Center for Personalized Prevention Research 24
25
Statistical Power to Detect Moderation Effects for Subgroups: Essentially you need at least 4 times the sample size needed for examining main effects. Need either a very large study or to combine different studies in an integrative data analysis (Brown et al., Prev Sci, 2011). NIH Summer Institute on Social and Behavioral Intervention Research 25
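The "at least 4 times" rule can be seen in a standard two-arm sample-size formula: an interaction contrast between two equal subgroups carries four times the variance of the main-effect contrast, which is equivalent to powering for half the effect size. A sketch with an assumed standardized effect of 0.3, two-sided alpha = 0.05, and power = 0.80:

```python
import math

def required_n_per_arm(effect, z_alpha=1.96, z_power=0.84):
    # Two-arm comparison of means with unit variance:
    # n per arm = 2 * ((z_alpha + z_power) / effect)^2
    return math.ceil(2 * ((z_alpha + z_power) / effect) ** 2)

d = 0.3                                   # assumed standardized main effect
n_main = required_n_per_arm(d)
# A same-size interaction across two equal subgroups behaves like an
# effect of d/2, i.e. roughly 4x the sample size (up to rounding).
n_interaction = required_n_per_arm(d / 2)
print(n_main, n_interaction)
```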
26
Growth Mixture Models: WECare. Miranda et al., JAMA 2003; Siddique et al., under review. Trajectories by treatment arm (Medication vs. CBT). NIH Summer Institute on Social and Behavioral Intervention Research 26
27
Statistical Power can be calculated for Longer term studies: growth models Variation in impact: growth mixture models Muthen, Brown et al., 2002 Biostatistics NIH Summer Institute on Social and Behavioral Intervention Research 27
28
How Does Baseline Level of Depression Affect Improvement in the Slope of Depressive Symptoms for Antidepressants? Helen Denne Schulte School of Nursing Visiting Lecture 28
29
3. Deliver an optimal intervention for each person. Personalized intervention: Collins, Murphy, et al., AJPM 2007. What are generally better intervention components? Multiphase Optimization Strategy Trial (MOST). Six two-level factors (A/a, B/b, C/c, D/d, E/e, F/f) give 64 different combinations; reduce the number of combinations tested by assuming higher-order interactions are small. Fractional factorial design: ABCD, abcd, aBcD, … 16 combinations. NIH Summer Institute on Social and Behavioral Intervention Research 29
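The reduction from 64 to 16 combinations can be made concrete by generating a 2^(6-2) fractional factorial. The generators below (E = ABC, F = BCD) are one common choice, shown only to illustrate how a fractional design trades away higher-order interactions; a MOST application would pick generators to match which interactions it assumes are small.

```python
from itertools import product

def fractional_factorial_2_6_2():
    """16-run 2^(6-2) design for factors A..F, levels coded -1/+1.

    A full factorial over A, B, C, D (16 runs) determines E and F
    through the assumed generators E = ABC and F = BCD.
    """
    runs = []
    for a, b, c, d in product((-1, 1), repeat=4):
        runs.append((a, b, c, d, a * b * c, b * c * d))
    return runs

design = fractional_factorial_2_6_2()
print(len(design))  # 16 runs instead of 2**6 = 64
```

Each factor still appears at each level in exactly half the runs, so all six main effects remain estimable.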
30
Nonresponders to a first intervention get a follow-up intervention Sequential Multiple Assignment Randomized Trial (SMART), Collins et al., 2007 NIH Summer Institute on Social and Behavioral Intervention Research 30
31
SMART Trial for Conduct Problem Prevention. Randomize to a Youth-Focused or Parent-Focused CPP; monitor response. Responders continue; Non-Responders are re-randomized (e.g., to Parent-Focused, Youth-Focused, or combined Parent/Youth-Focused CPP). Presentation to Center for Personalized Prevention Research 31
33
Another Personalized Intervention: Preference Preference Trial NIH Summer Institute on Social and Behavioral Intervention Research 33
34
Would allowing subjects to choose their preferred treatment... Increase the uptake of evidence-based interventions? Increase the adherence to one of these interventions? Improve outcomes? NIH Summer Institute on Social and Behavioral Intervention Research 34
35
Equipoise: Another approach related to Preference or Choice. Equipoise -- ideally there should be no preference in a trial for one intervention over another. Clinician equipoise; subject equipoise. One interesting equipoise design: offer a range of different intervention options to a subject, but randomize only among those the subject states she is willing to take (e.g., Nothing, Watchful Waiting, SSRI, TCA, CBT). NIH Summer Institute on Social and Behavioral Intervention Research 35
36
Preference Trial. Randomize to Standard Assignment vs. Assign by Preference (Choice). Intervention options: Usual Service, Home PMTO, Clinic PMTO, In-Person Group PMTO, On-Line Group PMTO. Presentation to Center for Personalized Prevention Research 36
38
Traditional Randomized Trial: Randomization AFTER Consent. Consent to be Randomized to the RCT? No: no further information. Yes: Randomized to CBT or Antidepressant. NIH Summer Institute on Social and Behavioral Intervention Research 38
39
Doubly Randomized Trial: Randomization BEFORE and AFTER Consent. First Randomized to the RCT vs. the Invitation (Preference) Arm. RCT arm: Consent to be Randomized to the RCT? No: no further information; Yes: Randomized to CBT or Antidepressant (AD). Preference arm: Consent to the Preference Arm of the Trial? No: no further information; Yes: choose CBT, AD, or both. NIH Summer Institute on Social and Behavioral Intervention Research 39
40
Adaptive Enrollment Designs: Who gets into a trial Testing Motivational Interviewing Strategies Presentation to Center for Personalized Prevention Research 40
41
Example 3: Encouragement/Preference Trial. Randomize to Motivational Interviewing strategy MI-1 vs. MI-2; participants then choose among Home PMTO, Clinic PMTO, In-Person Group PMTO, and On-Line Group PMTO. Presentation to Center for Personalized Prevention Research 41
42
Complier (Participant) Average Causal Effect (CACE) Modeling for Encouragement Designs. Compare outcomes for Participants vs. Non-Participants under Low vs. High encouragement. Presentation to Center for Personalized Prevention Research 42
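For encouragement designs, the simplest CACE estimate is the instrumental-variable ratio: the intent-to-treat effect divided by the difference in participation rates between arms. A minimal sketch with invented numbers; real CACE modeling (e.g., via mixture models) handles covariates and uncertainty as well.

```python
def cace_estimate(itt_effect, participation_tx, participation_ctl=0.0):
    """CACE = ITT effect / difference in participation rates.

    Valid under the usual assumptions: randomization, exclusion
    restriction (encouragement affects outcomes only through
    participation), and monotonicity (no defiers).
    """
    uptake_diff = participation_tx - participation_ctl
    if uptake_diff <= 0:
        raise ValueError("encouragement must increase participation")
    return itt_effect / uptake_diff

# Invented example: ITT effect of 0.12 when 60% of the encouraged arm participates
print(cace_estimate(0.12, 0.60))
```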
43
Designs to Screen a Large Number of Baseline Characteristics Interacting with Intervention (e.g., G x E). So far, reports are limited to single-locus alleles (Brody). How do we begin searching for multiple gene interactions? NIH Summer Institute on Social and Behavioral Intervention Research 43
44
Propose 2-Phase Trials to Screen and Confirm Interactions Phase I Phase II Presentation to Center for Personalized Prevention Research 44
45
Propose 2-Phase Trials To Look at Interactions. Phase I: test for interactions among a finite subset of K covariates (genotypes), say with power of 0.9 but a Type I error of, say, 0.2. From this analysis, select the top L covariates. Phase II: compute a new sample size and run a new study, but limit interactions to those that passed through the first phase. Presentation to Center for Personalized Prevention Research 45
46
An Example Looking at 20 G x E Interactions. Phase I: set α = 0.20, power = 0.90 (z = 1.28). Screen in for the 2nd stage all significant interactions; drop all non-significant ones. Start up a new study: set α = 0.05, power = 0.90 (z = 1.96). Select all that are significant; drop the others. 20 interactions: 5 are non-null, 15 null. How well does this procedure: Find ANY non-null interaction? Find ALL non-null interactions? Find at least HALF the non-null interactions? Presentation to Center for Personalized Prevention Research 46
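Under the simplifying assumption that each non-null interaction is detected independently with power 0.90 in each phase, the chance that a given non-null interaction survives both phases is 0.9 x 0.9 = 0.81, and the three questions above reduce to binomial probabilities:

```python
from math import comb

def prob_at_least_k(n, k, p):
    # P(at least k successes out of n), success probability p each
    return sum(comb(n, j) * p ** j * (1 - p) ** (n - j) for j in range(k, n + 1))

p = 0.9 * 0.9            # a non-null interaction must pass both phases
p_any  = prob_at_least_k(5, 1, p)   # find ANY of the 5 non-null interactions
p_all  = prob_at_least_k(5, 5, p)   # find ALL 5
p_half = prob_at_least_k(5, 3, p)   # find at least HALF (3 of 5)
print(round(p_any, 4), round(p_all, 4), round(p_half, 4))
```

This gives roughly 0.9998 for ANY, 0.35 for ALL, and 0.95 for at least half, consistent with the next slide's conclusion that the procedure is good at picking up at least half. The sketch ignores the 15 null interactions, which matter here only through the Phase II sample-size calculation.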
47
Good at picking up at least half Presentation to Center for Personalized Prevention Research 47
48
4. Optimal Intervention: Deliver the intervention to have the broadest impact at a population level. The best intervention won't produce population impact if people won't take it, if it isn't delivered with fidelity, or if it isn't sustained. Dissemination and Implementation Research: Enrollment Trials; Encouragement Trials; Implementation Trials; Roll-Out Dynamic Wait-Listed Trials. NIH Summer Institute on Social and Behavioral Intervention Research 48
49
Glasgow, AJPM 2007 Example: Dissemination of eHealth Interventions To what extent did participants log onto the website each week? decrease their fast food consumption? Assessment Automated measures of website engagement participant self-monitoring. NIH Summer Institute on Social and Behavioral Intervention Research 49
50
Designs for Dissemination Who gets invited and who comes? — What percent of those invited participate? — What are the characteristics of participants? — What are barriers to patient participation in this context? NIH Summer Institute on Social and Behavioral Intervention Research 50
51
RE-AIM (Glasgow et al., 1999): What Population-Level Impact Can You Anticipate? Reach; Effectiveness; Adoption; Implementation (consistency of program delivery); Maintenance (sustainability). NIH Summer Institute on Social and Behavioral Intervention Research 51
52
Useful Nonrandomized Designs for Dissemination RE-AIM components 1. Broadcast of an Intervention Standardize Invitation and See Who Comes Kellam SG, Branch J, Brown CH, Russell G. Why teenagers come for treatment: A ten-year prospective study of Woodlawn. Journal of the American Academy of Child Psychiatry 20:477-95, 1981. 2. Narrowcast of an Intervention Social Marketing to a Targeted Audience 3. “Timecast” of an Intervention Multiple Baseline design = Interrupted Time Series o Repeated measures over time of a community outcome o Introduce the intervention to a community midway through o Check whether community outcome differs before and after introduction NIH Summer Institute on Social and Behavioral Intervention Research 52
53
Hawkins NG et al., AJPM 2007 NIH Summer Institute on Social and Behavioral Intervention Research 53
54
Biglan et al., AJCP 2006 NIH Summer Institute on Social and Behavioral Intervention Research 54
55
Other Communities NIH Summer Institute on Social and Behavioral Intervention Research 55
56
Inferential Challenges with this Design: What if an Exogenous Factor Happens at one of these Times of Transition? What if you Select the Most Promising Communities to Work with First? What if there are only a Small Number of Communities? It becomes Harder to Conclude that Dissemination Caused the Change. NIH Summer Institute on Social and Behavioral Intervention Research 56
57
Turning a Multiple Baseline Design into a True Randomized Experiment: Roll- Out Design ROLL-OUT DESIGN Divide Available Communities into Comparable Batches Start Measuring Outcomes for All Communities Randomly Assign Each Comparable Batch to WHEN the Dissemination Begins At the end, ALL Communities Are Exposed Analysis Uses All Communities and All Times Communities Still Serve as Own Controls Communities Compared by Exposure Status Across Time NIH Summer Institute on Social and Behavioral Intervention Research 57
58
Roll-Out Randomized Trials for Dissemination Research (Brown et al., 2006, 2008). Populations 1 through 5: Time of Transition in Dissemination Randomly Determined (R); Equivalent Subsets that are Ordered Randomly. NIH Summer Institute on Social and Behavioral Intervention Research 58
59
Timing of Dissemination (0 = not yet disseminating, x = disseminating)
1: 0 0 x x x x x
2: 0 0 0 x x x x
3: 0 0 0 0 x x x
4: 0 0 0 0 0 x x
5: 0 0 0 0 0 0 x
NIH Summer Institute on Social and Behavioral Intervention Research 59
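The 0/x matrix above can be generated directly: randomly order the batches, then have each batch cross from control to dissemination one period later than the previous one. A sketch; the batch labels and the two all-control baseline periods are illustrative choices matching the matrix, not fixed features of roll-out designs.

```python
import random

def roll_out_schedule(batches, n_periods, rng):
    """Randomized roll-out schedule: '0' = control, 'x' = disseminating.

    Batches are randomly ordered; batch k crosses over one period after
    batch k-1, and every batch is eventually exposed.
    """
    order = batches[:]
    rng.shuffle(order)                       # randomize WHEN each batch starts
    schedule = {}
    for k, batch in enumerate(order):
        start = k + 2                        # two shared baseline periods, then staggered starts
        schedule[batch] = ''.join('x' if t >= start else '0' for t in range(n_periods))
    return schedule

for batch, row in sorted(roll_out_schedule(list('ABCDE'), 7, random.Random(1)).items()):
    print(batch, row)
```

Because every batch serves as its own control before crossover and all batches end up exposed, the analysis can compare exposure status across both batches and time.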
60
Implementation Questions NIH Summer Institute on Social and Behavioral Intervention Research 60
61
Common Implementation Research Questions How to test whether intervention effects are sustained as supports are lifted? How to increase the adoption of a program, by communities as well as individuals? Does adaptation change outcomes? What support structures lead to successful implementation, high fidelity, and sustainability? NIH Summer Institute on Social and Behavioral Intervention Research 61
62
What is Unique about Implementation Trials? 2. Intervention Arm Assignments. Efficacy/Effectiveness Trials randomize to an Active Condition vs. Control. Implementation Trials randomize to the Active intervention delivered under Implementation strategy I vs. Implementation strategy II. NIH Summer Institute on Social and Behavioral Intervention Research 62
63
What is Unique about Implementation Trials 3. Open to Unanticipated Events that Cannot be Controlled but Must be Handled to Maintain the Design’s Integrity NIH Summer Institute on Social and Behavioral Intervention Research 63
64
Types of Protocol Deviations (Deviation: Consequence to Causal Inferences)
No Need to Impose Protocol Rules: None
Protocols were Followed: Typically None
Protocols were Not Followed: Possibly Severe
No Protocols Available: Possibly Severe
NIH Summer Institute on Social and Behavioral Intervention Research 64
65
CAL-40 Trial: NIMH R01 Implementation Trial. Chamberlain, Brown, Reid, Saldana, Marenich, Sosna, Padgett, Bouman, Wang. Chamberlain et al. (2008), Adm Policy MH; Chamberlain et al. (2009), in Roberts-DeGennaro & Fogel (Eds.). NIH Summer Institute on Social and Behavioral Intervention Research 65
66
Multidimensional Treatment Foster Care MTFC Alternative to incarceration or placing youth in residential or group care 24 hour support for foster parents backed up by a clinical team – Behavioral parent training and support – Family therapy for bio parents – Skill training for youth – Supportive therapy for youth – School-based behavioral interventions – Psychiatric consultation and med mgt NIH Summer Institute on Social and Behavioral Intervention Research 66
67
MTFC Efficacy and Evidence Base. Chamberlain & Reid, 1998; Leve, Chamberlain & Reid, 2005. MTFC is one of 10 evidence-based National Model Programs rated by the Office of Juvenile Justice and Delinquency Prevention (OJJDP) (Elliott, 1998), and one of nine National Exemplary Safe, Disciplined, and Drug-Free Schools Model Programs. NIH Summer Institute on Social and Behavioral Intervention Research 67
68
CAL-40 Trial: MTFC, an "Evidence-Based Program," Offered to all California Counties -- MH, CW, JJ. 10% Early Adoption. Can Implementation be Increased for Non-Early Adopters? Community Development Teams (CDT) versus assistance to counties independently (IND). Randomize Counties to 2 Different Models of Technical Assistance: CDT or IND. Outcomes: Proportion of counties that adopt MTFC and the rate of adoption; Stage of MTFC implementation that counties attain; Fidelity of implementation, including model adherence and practitioner competence; Sustainability of the program over time. Implementation Trial, Not Effectiveness Trial. NIH Summer Institute on Social and Behavioral Intervention Research 68
69
Randomly Assign Counties. Inclusion/Exclusion Criteria: All California Counties Included Except Early Adopters, Counties with too few Placements, and One where a Court Case was Occurring (LA). Counties balanced across 3 Cohorts and 2 Intervention Conditions (CDT and IND). NIH Summer Institute on Social and Behavioral Intervention Research 69
70
Randomization Design NIH Summer Institute on Social and Behavioral Intervention Research 70
72
Issues in the CA-MTFC Design: Acceptance of the Design was Complete. Some Counties Were Not Ready to Take Part; Moved Up Counties from the Next Cohort. Rural Counties Required Some Adaptation. Management of Peer-to-Peer Communications. NIH Summer Institute on Social and Behavioral Intervention Research 72
73
New Issue: Running Randomized Trials During a Recession. Primary Outcome is the Time it Takes for a County to Place its First Family in MTFC Foster Care. NIH Summer Institute on Social and Behavioral Intervention Research 73
74
Types of Protocol Deviations (Deviation | Example | Response | Consequence)
No Need to Impose Protocol Rules | All Counties Were Willing to Participate | -- | --
Protocols were Followed | A County Was Not Ready to Start in its Assigned Cohort | Randomly and Blindly Selected a County in the SAME Condition from the Next Cohort | So Far Little Imbalance, Little Indication of Differential Replacement Bias
Protocols were Not Followed | Interactions of Counties in Different Conditions | Allowed and Measured | Potential Reduction in Statistical Power
No Protocols Available | Stagnancy of County Governments' Budgets | Extend Trial to Another State with Same Inclusion/Exclusion Criteria | Statistical Power
NIH Summer Institute on Social and Behavioral Intervention Research 74
75
What’s different about Implementation Methodology? Study Interactional Processes Program Delivery: Fidelity, adherence, quality, and dosage NIH Summer Institute on Social and Behavioral Intervention Research 75
76
An Example of Interactions: Why Different Methods are Needed: Relationship between Dosage and Outcome Traditional Clinical Dosage Trial – Randomly assign units to a dose – Measure outcome – Summarize Dose Response NIH Summer Institute on Social and Behavioral Intervention Research 76
77
Typical Dose Response Trial NIH Summer Institute on Social and Behavioral Intervention Research 77
78
Real Implementation: Amount of MH Treatment Last 10 Years vs Current Sx NIH Summer Institute on Social and Behavioral Intervention Research 78
79
Amount of MH Treatment Predicting Current Symptoms NIH Summer Institute on Social and Behavioral Intervention Research 79
80
Interaction Between Symptoms and Treatment: Antidepressants in Youth. Gibbons RD, Hur K, Brown CH, Davis JM, Mann JJ (2012). Benefits from Antidepressants: Synthesis of 6-Week Patient-Level Outcomes from Double-Blind Placebo-Controlled Randomized Trials of Fluoxetine and Venlafaxine. Arch Gen Psychiatry. NIH Summer Institute on Social and Behavioral Intervention Research 80
82
What Happens with a Policy Change that Affects How Treatments are Provided? FDA Black Box Warning on Antidepressants and Suicide Risk. Gibbons RD, Brown CH, et al. Amer J Psychiatry, 164: 1044-1049, 2007; Gibbons RD, Brown CH, et al. Amer J Psychiatry, 164: 1356-1363, 2007. NIH Summer Institute on Social and Behavioral Intervention Research 82
83
Challenges in implementation research that can be addressed with agent-based models. Implementation is All About Interactions. Agents can represent individuals, groups of individuals, social networks, or other definable objects. Types of Agents for Implementation: Children, Teachers, and Coaches of the Good Behavior Game (GBG), a group-contingent approach to reducing aggressive/disruptive behavior. Heterogeneity across agents: some children are aggressive, others virtually not; some teachers deliver GBG well, others not; some coaches provide successful training, others not. Nonlinear effects: increasing the amount of training a teacher receives may enhance fidelity up to a point; some teachers may actively resist too much training, with fidelity degrading accordingly. NIH Summer Institute on Social and Behavioral Intervention Research 83
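The heterogeneity and nonlinearity described above are exactly what agent-based models capture. A toy sketch in which teachers are the agents; all parameters are invented (the 30% "resistant" share, the 20-hour fidelity plateau, and the fidelity-to-aggression link are illustrative assumptions, not GBG findings):

```python
import random

def fidelity(training_hours, resistant):
    # Assumed nonlinear response: fidelity rises with training and
    # plateaus at 20 hours; resistant teachers degrade past that point.
    base = min(training_hours / 20.0, 1.0)
    if resistant and training_hours > 20:
        base -= 0.03 * (training_hours - 20)
    return max(0.0, min(1.0, base))

def mean_aggression(n_teachers, training_hours, rng):
    """Toy agent-based sketch: heterogeneous teacher agents."""
    outcomes = []
    for _ in range(n_teachers):
        resistant = rng.random() < 0.3            # heterogeneity across agents
        f = fidelity(training_hours, resistant)
        # classroom aggression falls with delivery fidelity (assumed link)
        outcomes.append(1.0 - 0.5 * f + rng.gauss(0, 0.05))
    return sum(outcomes) / len(outcomes)

rng = random.Random(0)
print(mean_aggression(200, 10, rng), mean_aggression(200, 40, rng))
```

Sweeping training_hours shows the nonlinearity: average aggression keeps falling up to the plateau, then partially rebounds because resistant teachers lose fidelity.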
84
“Agents” Involved in Implementation Implementation Agent or Agency NIH Summer Institute on Social and Behavioral Intervention Research 84 Intervention Agent or Agency Target Research Agent Purveyor/ Developer Intermediary or Broker Funder Snyder J., Reid J., et al. Measurement Systems for Randomized Intervention Trials: The Role of Behavioral Observation in Studying Treatment Mediators and Short-Term Outcomes. J. Prevention Science, 7, 43-56, 2006.
85
Methodologic Approaches in Intervention Research (Time | Phase | Major Design | Major Modeling)
1950s | Efficacy | RCT | ANCOVA
1970s/80s | Examine Trajectories | Repeated Measures | Growth Curves
1970s/80s | Clustering | Group Randomized Trials | Multilevel Modeling
1990s | Variation in Impact | Designs for Person, Place, Time | Growth Mixture Modeling
2000s | Causality | Mediation Designs | Mediation Analysis
2010s | Networks | Network Interventions | Network Analysis
2010s | Implementation | Implementation Trials/Quality Improvement | Agent-Based Models, Machine Learning, AI
NIH Summer Institute on Social and Behavioral Intervention Research 85
86
Your Role in the Next Stage of Research: Relying on What We Now Know NIH Summer Institute on Social and Behavioral Intervention Research 86
87
Your Role in the Next Stage of Implementation Research: Your Ideas in Building This Field NIH Summer Institute on Social and Behavioral Intervention Research 87
88
Related Papers Aarons GA, Hurlburt M, Horwitz SM. Advancing a Conceptual Model of Evidence-Based Practice Implementation in Public Service Sectors. Administration and Policy in Mental Health and Mental Health Services Research. In press. Brown, C. H., P. A. Wyman, et al. (2007). "The role of randomized trials in testing interventions for the prevention of youth suicide." International Review of Psychiatry 19(6): 617-631. Brown, C. H., P. A. Wyman, et al. (2006). "Dynamic wait-listed designs for randomized trials: new designs for prevention of youth suicide." Clinical Trials 3(3): 259-271. Brown, Kellam, Muthen, Wang, Kaupert, Ogihara, Valente, McManus, Pantin, Szapocznik (accepted). Partnerships for Effectiveness and Implementation Research: Experiences of the Prevention Science and Methodology Group Brown CH. Design principles and their application in preventive field trials. In WJ Bukoski and Z Sloboda, Handbook of Drug Abuse Prevention: Theory, Science, and Practice. New York: Plenum Press, pp. 523-540, 2003. Brown, CH, Berndt D, Brinales JM, Zong X, and Bhagwat D. Evaluating the Evidence of Effectiveness for Preventive Interventions: Using a Registry System to Influence Policy through Science. Addictive Behaviors, 25, 955-964, 2000. Brown, CH, Ten Have TR, Jo B, Dagne G, Wyman PA, Muthén BO, Gibbons RD. Adaptive Designs in Public Health. Annual Review Public Health, 30: 17.1-17.25, 2009. Brown CH, Sloboda Z, Faggiano F, Teasdale B, Keller F, Burkhart G (Forthcoming). Methods for Synthesizing Findings on Moderation Effects Across Multiple Randomized Trials. To appear in Prevention Science. NIH Summer Institute on Social and Behavioral Intervention Research 88
89
Related Papers Chamberlain, P., Saldana, L., Brown, C. H., & Leve, L. D. (2010). Implementation of MTFC in California: A Randomized Trial of an Evidence-Based Practice. In M. Roberts-DeGennaro & S. J. Fogel (Eds.), Using Evidence to Inform Practice for Community and Organizational Change (pp. 218-234). Chicago: Lyceum. Chamberlain, P., Marsenich, L., Sosna, T., et al. (accepted for publication). Three collaborative models for scaling up evidence-based programs. Flay, B., Biglan, A., et al. (2005). Standards of Evidence: Criteria for Efficacy, Effectiveness and Dissemination. Prevention Science, 6, 152-175. Landsverk, J., Brown, C., Rolls Reutz, J., Palinkas, L., & Horwitz, S. (2011). Design Elements in Implementation Research: A Structured Review of Child Welfare and Child Mental Health Studies. Administration and Policy in Mental Health and Mental Health Services Research, 1-10. NIH Summer Institute on Social and Behavioral Intervention Research 89
90
Related Papers Wang, Saldana, Brown, & Chamberlain (2010). Factors that influenced county system leaders to implement an evidence-based program: a baseline survey within a randomized controlled trial. Implementation Science. Aarons, Horwitz, Hurlburt, & Landsverk (accepted for publication). Advancing a Conceptual Model of Evidence-Based Practice Implementation in Public Mental Health and Child Welfare Sectors. Landsverk, Brown, Chamberlain, Palinkas, & Horwitz (in preparation). Design and Analysis in Dissemination and Implementation Research. Books: Valente (2010), Social Networks and Health: Models, Methods and Applications; Palinkas & Soydan (2010), Translation and Implementation of Evidence Based Practice in Social Work: A Strategy for Research. NIH Summer Institute on Social and Behavioral Intervention Research 90
91
Where do you go for statistical power for complicated designs? Adding in the factor of baseline characteristics or covariates (ANCOVA): G*Power. Group/place-based randomization: Optimal Design (OD) (Spybrook et al., 2009), a free download: http://sitemaker.umich.edu/group-based/optimal_design_software. Repeated measures/growth models: RMASS, free and web-based, for 3-level mixed-effects regression models: http://www.healthstats.org/rmass/. More complex analyses: Mplus (Muthen & Muthen, 2008) via Monte Carlo simulation. Multilevel designs: OD, RMASS, G*Power. NIH Summer Institute on Social and Behavioral Intervention Research 91