MiBLSi Schools’ Implementation Process and Student Outcomes
Anna L. Harms, Michigan State University
MiBLSi State Conference 2009
Agenda
– Reasons for studying implementation and ways to do it
– Linking research to our schools’ data
– Next steps
– Questions and feedback
The Status of Research
The primary focus has been on developing and identifying practices:
– National Reading Panel Reports
– What Works Clearinghouse
– Florida Center for Reading Research Reviews
– OJJDP Model Programs
– Center for the Study and Prevention of Violence Model Programs
What determines the evidence base for a practice?
An independent randomized controlled trial is the gold standard.
Effect size benchmarks (Cohen, 1988):
– Large: .80
– Moderate: .50
– Minimal/weak: .20
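For context (the slide gives only the benchmarks): Cohen's effect size d is the standardized mean difference between two groups, using a pooled standard deviation. In the usual formulation:

```latex
d = \frac{\bar{X}_1 - \bar{X}_2}{s_p},
\qquad
s_p = \sqrt{\frac{(n_1 - 1)\,s_1^2 + (n_2 - 1)\,s_2^2}{n_1 + n_2 - 2}}
```

An effect size of .80 therefore means the treatment group's mean sits eight-tenths of a pooled standard deviation above the control group's.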
Efficacy vs. Effectiveness (Christensen, Carlson, & Valdez, 2003)
Efficacy:
– Controlled conditions
– Conducted by innovation developers
Effectiveness:
– External to the developers of an innovation
– Replication
– Under different conditions
[Diagram: implementation as the bridge between research and practice.]
Greenberg, Domitrovich, Graczyk, & Zins (2005)
The planned intervention and the planned implementation system yield the program as implemented: actual intervention + actual implementation support.
NIRN/SISEP Framework for Implementation
– Stages of Implementation
– Core Implementation Components
– Multi-level Influences on Successful Implementation
Effective intervention practices + effective implementation strategies = positive outcomes for students (SISEP, 2009)
Getting into the Habit of Collecting, Analyzing, and Acting Upon Data
The problem-solving cycle, with data and documentation at every step (sketched below):
1. Problem identification
2. Problem analysis
3. Plan selection
4. Plan implementation
5. Plan evaluation
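Purely as an illustration of the cycle above (the slide describes a team process, not software), here is a minimal Python sketch of one pass through the five stages, logging data at each step; every measure, threshold, and rule in it is made up:

```python
# Hypothetical sketch of the five-stage, data-driven problem-solving cycle.
def run_cycle(orf_score, goal=40):
    log = []
    # 1. Problem identification: compare observed performance to the goal.
    problem = orf_score < goal
    log.append(("identify", {"score": orf_score, "goal": goal, "problem": problem}))
    if not problem:
        return log
    # 2. Problem analysis: size the gap to inform intervention intensity.
    gap = goal - orf_score
    log.append(("analyze", {"gap": gap}))
    # 3. Plan selection: larger gaps get a more intensive tier (made-up rule).
    tier = 3 if gap > 15 else 2
    log.append(("select", {"tier": tier}))
    # 4. Plan implementation: document that the plan was delivered as intended.
    log.append(("implement", {"fidelity_check": True}))
    # 5. Plan evaluation: re-measure and compare against the goal again.
    new_score = orf_score + 10  # stand-in for a post-intervention measure
    log.append(("evaluate", {"new_score": new_score, "met_goal": new_score >= goal}))
    return log

for step, data in run_cycle(orf_score=28):
    print(step, data)
```

The point of the sketch is the shape of the loop: every stage both consumes data and leaves documentation behind, which is what feeds the next cycle.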
Response to I________
– Intervention?
– Instruction?
– Implementation of evidence-based practices
Reasons for Studying and Monitoring Implementation (Greenberg, Domitrovich, Graczyk, & Zins, 2005)
– Effort evaluation
– Quality improvement
– Documentation
– Internal validity
– Program theory
– Process evaluation
– Diffusion
– Evaluation quality
What tools can we use to measure implementation of school-wide systems?
Tier 1 Implementation Tools
Reading:
– Planning and Evaluation Tool
– Effective Reading Supports Team Implementation Checklist
– Principal’s Reading Walkthrough documents (observational protocol)
Behavior:
– Effective Behavior Supports Team Implementation Checklist
– Effective Behavior Supports Self-Assessment Survey
– School-wide Evaluation Tool (observational protocol)
– Benchmarks of Quality
– School Climate Survey
Tier 2 & 3 Implementation Tools
Reading:
– Intervention Validity Checklists
– IEP Implementation Validity Checks
Behavior:
– Checklist for Individual Student Systems
MiBLSi Mission Statement
“to develop support systems and sustained implementation of a data-driven, problem solving model in schools to help students become better readers with social skills necessary for success”
Our Data
MiBLSi’s existing data: elementary schools (any combination of K-6). The schools column gives the number of elementary schools included in this study.

Cohort   Start date      Schools   Years of data available
1        January 2003    15        4.5
2        February 2005   27        3.5
3        January 2006    50        2.5
4.1      January 2007    65        1.5
4.2      March 2007      27        1.3
4.3      June 2007       11        1
Purpose of the Study
– To systematically examine schools’ process of implementing school-wide positive behavior supports and a school-wide reading model during participation in a statewide RtI project.
– To systematically examine the relation between implementation fidelity of an integrated three-tier model and student outcomes.
Conceptual Framework (Chen, 1998; Greenberg et al., 2005)
Planned intervention:
– School-wide positive behavior supports
– Response to Intervention for reading
Actual implementation:
– Submission of implementation checklists
– Scores on implementation checklists
Student outcomes:
– Office discipline referrals
– Performance on curriculum-based literacy measures
– Performance on statewide standardized test in reading
Measuring Implementation (a scoring sketch follows this list)
– Effective Behavior Support Self-Assessment Survey (EBS-SAS): spring of each school year; total % implementation by building location
– Effective Behavior Support Team Implementation Checklist (EBS-TIC): 4x per school year (quarterly); total % implementation
– Planning and Evaluation Tool for Effective Reading Supports, Revised (PET-R): fall of each school year; total/overall % implementation
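All three tools report a total percent implementation, which is typically points earned over points possible across checklist items. A hypothetical scoring sketch (the 0/1/2 item scale and the item values are assumptions, not taken from the actual instruments):

```python
# Hypothetical sketch: items scored 0 (not in place), 1 (partial),
# 2 (fully in place); total % implementation = earned / possible.
def percent_implementation(item_scores, max_per_item=2):
    possible = len(item_scores) * max_per_item
    return 100.0 * sum(item_scores) / possible

tic_items = [2, 2, 1, 0, 2, 1, 2, 2]  # made-up quarterly checklist scores
print(f"Total implementation: {percent_implementation(tic_items):.0f}%")  # 75%
```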
The Process – How Long – Sustainability – Associated Student Outcomes – Behavior + Reading
Systems Implementation Research
– Expect 3-5 years for full implementation (Fixsen, Naoom, Blase, Friedman, & Wallace, 2004; OSEP Center on Positive Behavioral Interventions and Supports, 2004; Sprague et al., 2001)
– Studies often split up implementation and outcomes (Reading First: U.S. Department of Education, 2006)
– Studies often view implementation at one point in time (McCurdy, Mannella, & Eldridge, 2003; McIntosh, Chard, Boland, & Horner, 2006; Mass-Galloway, Panyan, Smith, & Wessendorf, 2008)
– A need for systematic research remains
The Process – How Long – Sustainability – Associated Student Outcomes – Behavior + Reading
Process and Progress
Just as we measure student progress, we should also measure our progress toward full implementation.
– What is our current level of implementation?
– What is our goal?
– How do we get from here to there?
How do scores vary by year of implementation?
[Charts: implementation checklist scores by year of implementation.]
The Process – How Long – Sustainability – Associated Student Outcomes – Behavior + Reading
How long does it take? 2-5 years.
At each year of implementation, what % of schools attain criterion levels of implementation? (A computation sketch follows the Cohort 3 results below.)
PET-R, Cohort 3 (N=50), time to criterion in years:months bins (0-5 mo., 6-11 mo., 1:6-1:11, 2:6-2:11, 3:6-3:11, 4:6-4:11): 24 schools (48%) and 1 school (2%) attained criterion across these bins; 25 schools (50%) did not attain criterion scores.
EBS-SAS, Cohort 3 (N=50), time to criterion in bins from 0-5 mo. through 5:0-5:5: 13 schools (26%), 2 schools (4%), and 14 schools (28%) attained criterion across these bins; 21 schools (42%) did not attain criterion scores.
EBS-TIC, Cohort 3 (N=50), time to criterion in bins from 0-5 mo. through 5:0-5:5: 6 schools (12%), 1 school (2%), and 30 schools (60%) attained criterion across these bins; 13 schools (26%) did not attain criterion scores.
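A hypothetical sketch of the computation behind these three summaries: given each school's months from project start to its first criterion-level score (None if it never attained criterion), bin and report shares. All data values are made up:

```python
# Hypothetical sketch: bin months-to-criterion into half-year ranges
# and report the share of schools per bin; None = never attained.
from collections import Counter

def bin_label(months):
    if months is None:
        return "did not attain"
    lo = (months // 6) * 6  # floor to the start of a half-year bin
    return f"{lo}-{lo + 5} mo."

months_to_criterion = [7, 8, None, 20, 9, None, 6]  # one entry per school
n = len(months_to_criterion)
for label, k in Counter(bin_label(m) for m in months_to_criterion).items():
    print(f"{label}: {k} schools ({100 * k / n:.0f}%)")
```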
The Process – How Long – Sustainability – Associated Student Outcomes – Behavior + Reading
Sustainability
Think and work:
– Up
– Down
– Out
What percent of schools that attain criterion levels of implementation are able to maintain or improve their score in all subsequent years? (A sketch of this maintenance test follows the Cohort 3 charts below.)
[Chart: PET-R, Cohort 3 (N=50). Of the schools attaining criterion (6-11 mo. and 1:6-1:11 bins), 1 school (2%) maintained or improved its score in all subsequent years.]
[Chart: EBS-SAS, Cohort 3 (N=50). Of the schools attaining criterion in each bin (13, 26%; 2, 4%; 14, 28%), the counts maintaining or improving their score in all subsequent years were 2, 12, and 2.]
[Chart: EBS-TIC, Cohort 3 (N=50). Schools attaining criterion per bin: 6 (12%), 1 (2%), 30 (60%), with the subset of each maintaining or improving their score in all subsequent years.]
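A hypothetical sketch of the maintenance test these charts summarize: a school counts as sustaining if, after first reaching criterion, its yearly score never drops. The criterion value and all scores are made up:

```python
# Hypothetical sketch: did a school maintain or improve its score in
# every year after first attaining criterion?
CRITERION = 80  # assumed criterion (% implementation)

def sustains(yearly_scores, criterion=CRITERION):
    attained = [i for i, s in enumerate(yearly_scores) if s >= criterion]
    if not attained:
        return False  # never attained criterion
    tail = yearly_scores[attained[0]:]
    return all(a <= b for a, b in zip(tail, tail[1:]))  # non-decreasing

schools = {"A": [70, 82, 85, 90], "B": [81, 75, 88], "C": [60, 65, 70]}
print([name for name, scores in schools.items() if sustains(scores)])  # ['A']
```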
Another way of looking at implementation...
What % of implementation data do schools submit for each year of implementation? (A rate-computation sketch follows the three tables below.)
% of Schools Submitting PET-R Data Each Year

Cohort   Yr 1   Yr 2   Yr 3   Yr 4   Yr 5   Yr 6
C1       --     --     93%    80%    73%    60%
C2       --     78%    89%    78%    --
C3       --     90%    94%    --
C4.1     --     97%    --
C4.2     --     96%    --
C4.3     91%    --
% of Schools Submitting EBS-SAS Data Each Year

Cohort   Yr 1   Yr 2   Yr 3   Yr 4   Yr 5   Yr 6
C1       --     --     60%    60%    47%    53%
C2       70%    --     74%    63%    67%    --
C3       84%    --     70%    78%    --
C4.1     95%    --     86%    --
C4.2     89%    --     81%    --
C4.3     --     82%    --
% of Schools Submitting EBS-TIC Data Each Year

Cohort   Yr 1   Yr 2   Yr 3   Yr 4   Yr 5   Yr 6
C1       --     --     47%    53%    73%    53%
C2       74%    --     78%    70%    56%    --
C3       60%    --     80%    58%    --
C4.1     77%    --     80%    --
C4.2     56%    --     48%    --
C4.3     --     45%    --
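A hypothetical sketch of how a cell in these tables is computed: schools that submitted the tool in a given year divided by schools in the cohort. The IDs and counts are made up (though 45/50 and 47/50 happen to reproduce Cohort 3's 90% and 94% PET-R rates):

```python
# Hypothetical sketch: submission rate = submitting schools / cohort size.
cohort_size = 50  # e.g., Cohort 3
submissions = {  # year -> set of school IDs that submitted the tool
    2: {f"school_{i}" for i in range(45)},
    3: {f"school_{i}" for i in range(47)},
}
for year, schools in sorted(submissions.items()):
    print(f"Year {year}: {100 * len(schools) / cohort_size:.0f}%")
# Year 2: 90%
# Year 3: 94%
```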
The Process – How Long – Sustainability – Associated Student Outcomes – Behavior + Reading
Is the % of behavior checklist data submitted each year related to student behavior outcomes for that year?
Is the % of reading checklist data submitted each year related to student reading outcomes for that year?
Are scores on the behavior implementation checklists related to student behavior outcomes for that year?
Are scores on the reading implementation checklist for each year of implementation related to student reading outcomes for that year? (A correlation sketch for these four questions follows.)
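All four questions ask about an association between a yearly implementation measure and a yearly outcome. A minimal sketch of one way to examine such a relation, correlating scores across schools with numpy; every value is made up:

```python
# Hypothetical sketch: correlate schools' implementation scores with a
# same-year student outcome across six schools. Data are made up.
import numpy as np

tic_score = np.array([55, 68, 72, 80, 91, 94])        # EBS-TIC % implementation
odr_rate = np.array([9.1, 7.4, 6.8, 5.0, 3.9, 3.5])   # discipline referrals per 100 students

r = np.corrcoef(tic_score, odr_rate)[0, 1]
print(f"r = {r:.2f}")  # strongly negative: higher fidelity, fewer referrals
```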
The Process – How Long – Sustainability – Associated Student Outcomes – Behavior + Reading
What is the impact on student outcomes when schools meet criteria on none, some, or all of the implementation checklists? (A grouping sketch follows.)
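A hypothetical sketch of that comparison: group schools by whether they met criterion on none, some, or all checklists, then compare mean outcomes. The three-checklist count and all values are made up:

```python
# Hypothetical sketch: mean reading outcome by how many of the three
# implementation checklists met criterion. Data are made up.
from statistics import mean

schools = [(0, 38.0), (1, 44.5), (3, 52.0), (2, 47.0), (3, 55.5), (0, 40.0)]

def group(met):  # met = checklists meeting criterion, out of 3
    return "none" if met == 0 else ("all" if met == 3 else "some")

by_group = {}
for met, orf in schools:
    by_group.setdefault(group(met), []).append(orf)
for label in ("none", "some", "all"):
    print(label, round(mean(by_group[label]), 1))
```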
Limitations
– Self-report implementation measures
– Limited number of schools in earlier cohorts
– We don’t know what specific factors have impacted implementation
Remember...
More data is not necessarily better. Data should have a purpose: it should help us make well-informed decisions that will improve outcomes for students.