MiBLSi Schools’ Implementation Process and Student Outcomes Anna L. Harms Michigan State University MiBLSi State Conference 2009 1.


Agenda
– Reasons for studying implementation and ways to do it
– Linking research to our schools’ data
– Next steps
– Questions and feedback

The Status of Research
Primary focus has been on developing and identifying practices...
– National Reading Panel Reports
– What Works Clearinghouse
– Florida Center for Reading Research Reviews
– OJJDP Model Programs
– Center for the Study and Prevention of Violence Model Programs

What determines the evidence base for a practice?
An independent randomized control trial is the gold standard.
Effect size (Cohen, 1988):
– Large: .80
– Moderate: .50
– Minimal/Weak: .20
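Cohen's thresholds above amount to a simple classification rule. A minimal sketch (the function name and the "negligible" label below are my own, not from the slide):

```python
# Classify a standardized effect size using the Cohen (1988) benchmarks
# cited on this slide: large >= .80, moderate >= .50, minimal/weak >= .20.

def classify_effect_size(d: float) -> str:
    """Return a qualitative label for a standardized effect size d."""
    d = abs(d)  # direction does not affect magnitude
    if d >= 0.80:
        return "large"
    if d >= 0.50:
        return "moderate"
    if d >= 0.20:
        return "minimal/weak"
    return "negligible"

print(classify_effect_size(0.65))  # moderate
```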

Efficacy vs. Effectiveness (Christensen, Carlson, & Valdez, 2003)
Efficacy
– Controlled conditions
– Conducted by innovation developers
Effectiveness
– External to the developers of an innovation
– Replication
– Under different conditions
[Diagram: RESEARCH → IMPLEMENTATION → PRACTICE]

Greenberg, Domitrovich, Graczyk, & Zins (2005)
[Diagram] PLANNED INTERVENTION + PLANNED IMPLEMENTATION SYSTEM → PROGRAM AS IMPLEMENTED = ACTUAL INTERVENTION + ACTUAL IMPLEMENTATION SUPPORT

NIRN/SISEP Framework for Implementation
– Stages of Implementation
– Core Implementation Components
– Multi-level Influences on Successful Implementation

Effective Intervention Practices
+ Effective Implementation Strategies
= Positive Outcomes for Students
(SISEP, 2009)

Getting into the Habit of Collecting, Analyzing, and Acting Upon Data
Problem Identification → Problem Analysis → Plan Selection → Plan Implementation → Plan Evaluation
(supported throughout by DATA & DOCUMENTATION)

Response to I________
– Intervention?
– Instruction?
– Implementation of evidence-based practices

Reasons for Studying and Monitoring Implementation
– Effort evaluation
– Quality improvement
– Documentation
– Internal validity
– Program theory
– Process evaluation
– Diffusion
– Evaluation quality
Greenberg, M. T., Domitrovich, C. E., Graczyk, P. A., & Zins, J. E. (2005)

What tools can we use to measure implementation of school-wide systems?

Tier 1 Implementation Tools
READING:
– Planning and Evaluation Tool
– Effective Reading Supports Team Implementation Checklist
– Observational Protocols
– Principal’s Reading Walkthrough Documents
BEHAVIOR:
– Effective Behavior Supports Team Implementation Checklist
– Effective Behavior Supports Self Assessment Survey
– School-wide Evaluation Tool
– Benchmarks of Quality
– School Climate Survey

Tier 2 & 3 Implementation Tools
READING:
– Intervention Validity Checklists
– IEP Implementation Validity Checks
BEHAVIOR:
– Checklist for Individual Student Systems

MiBLSi Mission Statement
“to develop support systems and sustained implementation of a data-driven, problem solving model in schools to help students become better readers with social skills necessary for success”

Our Data
MiBLSi’s existing data: elementary schools (any combination of K-6)
COHORT | START DATE | SCHOOLS* | YEARS OF DATA AVAILABLE
1 | January | — | —
2 | February | — | —
3 | January | — | —
4.1 | January | — | —
4.2 | March | — | —
4.3 | June | — | —
* Refers to # of elementary schools included in this study. (School counts and years of data were lost in transcription.)

Purpose of the Study
– To systematically examine schools’ process of implementing school-wide positive behavior supports and a school-wide reading model during participation with a statewide RtI project.
– To systematically examine the relation between implementation fidelity of an integrated three-tier model and student outcomes.

Conceptual Framework (Chen, 1998; Greenberg et al., 2005)
PLANNED INTERVENTION: School-wide Positive Behavior Supports; Response to Intervention for Reading
ACTUAL IMPLEMENTATION: Submission of Implementation Checklists; Scores on Implementation Checklists
STUDENT OUTCOMES: Office Discipline Referrals; Performance on Curriculum-Based Literacy Measures; Performance on State-Wide Standardized Test in Reading

Measuring Implementation
Effective Behavior Support Self Assessment Survey (EBS-SAS)
– Spring of each school year
– Total % implementation by building location
Effective Behavior Support Team Implementation Checklist (EBS-TIC)
– 4x per school year (quarterly)
– Total % implementation
Planning and Evaluation Tool for Effective Reading Supports-Revised (PET-R)
– Fall of each school year
– Total/overall % implementation

THE PROCESS • HOW LONG • SUSTAINABILITY • ASSOCIATED STUDENT OUTCOMES • BEHAVIOR + READING

Systems Implementation Research
– Expect 3-5 years for full implementation (Fixsen, Naoom, Blase, Friedman, & Wallace, 2004; OSEP Center on Positive Behavioral Interventions and Supports, 2004; Sprague et al., 2001)
– Studies often split up implementation and outcomes (Reading First: U.S. Department of Education, 2006)
– Studies often view implementation at one point in time (McCurdy, Mannella, & Eldridge, 2003; McIntosh, Chard, Boland, & Horner, 2006; Mass-Galloway, Panyan, Smith, & Wessendorf, 2008)
– A need for systematic research

THE PROCESS • HOW LONG • SUSTAINABILITY • ASSOCIATED STUDENT OUTCOMES • BEHAVIOR + READING

Process and Progress
Just as we measure student progress, we should also measure our progress toward implementation efforts.
– What is our current level of implementation?
– What is our goal?
– How do we get from here to there?

How do scores vary by year of implementation?
[Charts of implementation scores by year of implementation omitted]

THE PROCESS • HOW LONG • SUSTAINABILITY • ASSOCIATED STUDENT OUTCOMES • BEHAVIOR + READING

How long does it take? 2-5 years

At each year of implementation, what % of schools attain criterion levels of implementation?

PET-R: COHORT 3 (N=50)
Time-to-criterion bands: 0-5 mo. | 6-11 mo. | 1:6-1:11 | 2:6-2:11 | 3:6-3:11 | 4:6-4:11
Counts as given: 24 (48%), 1 (2%); 25 schools (50%) did not attain criterion scores

EBS-SAS: COHORT 3 (N=50)
Time-to-criterion bands: 0-5 mo. | 6-11 mo. | 1:0-1:5 | 2:0-2:5 | 3:0-3:5 | 4:0-4:5 | 5:0-5:5
Counts as given: 13 (26%), 2 (4%), 14 (28%); 21 schools (42%) did not attain criterion scores

EBS-TIC: COHORT 3 (N=50)
Time-to-criterion bands: 0-5 mo. | 6-11 mo. | 1:0-1:5 | 2:0-2:5 | 3:0-3:5 | 4:0-4:5 | 5:0-5:5
Counts as given: 6 (12%), 1 (2%), 30 (60%); 13 schools (26%) did not attain criterion scores

THE PROCESS • HOW LONG • SUSTAINABILITY • ASSOCIATED STUDENT OUTCOMES • BEHAVIOR + READING

Sustainability
Think and work:
– Up
– Down
– Out

What percent of schools that attain criterion levels of implementation are able to maintain or improve their score in all subsequent years?

PET-R: COHORT 3 (N=50)
Maintenance data as given: 6-11 mo. | 1:6-1:11; 1 (2%); 1

EBS-SAS: COHORT 3 (N=50)
Bands: 0-5 mo. | 6-11 mo. | 1:0-1:5 | 2:0-2:5 | 3:0-3:5 | 4:0-4:5 | 5:0-5:5
Counts as given: 13 (26%), 2 (4%), 14 (28%)

EBS-TIC: COHORT 3 (N=50)
Bands: 0-5 mo. | 6-11 mo. | 1:0-1:5 | 2:0-2:5 | 3:0-3:5 | 4:0-4:5 | 5:0-5:5
Counts as given: 6 (12%), 1 (2%), 30 (60%); maintained: 15, 0, 1

Another way of looking at implementation...

What % of implementation data do schools submit for each year of implementation?

% of Schools Submitting PET-R Data Each Year
C1: --, 93%, 80%, 73%, 60%
C2: --, 78%, 89%, 78%, --
C3: --, 90%, 94%, --
C4.1: [value lost in transcription], --
C4.2: [value lost in transcription], --
C4.3: 91%, --

% of Schools Submitting EBS-SAS Data Each Year
C1: --, 60%, 47%, 53%
C2: 70%, --, 74%, 63%, 67%, --
C3: 84%, --, 70%, 78%, --
C4.1: 95%, --, 86%, --
C4.2: 89%, --, 81%, --
C4.3: [value lost in transcription], --

% of Schools Submitting EBS-TIC Data Each Year
C1: --, 47%, 53%, 73%, 53%
C2: 74%, --, 78%, 70%, 56%, --
C3: 60%, --, 80%, 58%, --
C4.1: 77%, --, 80%, --
C4.2: 56%, --, 48%, --
C4.3: [value lost in transcription], --
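The submission percentages above are simple per-year proportions over a cohort. A minimal sketch of that computation (the schools and submission records below are invented for illustration):

```python
# Hypothetical sketch: percent of schools in a cohort submitting a given
# tool's data in each year of implementation.

def submission_rates(records: dict[str, list[bool]]) -> list[float]:
    """records maps school -> per-year submitted flags (equal lengths)."""
    n_schools = len(records)
    n_years = len(next(iter(records.values())))
    return [
        100.0 * sum(flags[year] for flags in records.values()) / n_schools
        for year in range(n_years)
    ]

records = {
    "School A": [True, True, False],
    "School B": [True, False, False],
    "School C": [True, True, True],
    "School D": [False, True, True],
}
print(submission_rates(records))  # [75.0, 75.0, 50.0]
```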

THE PROCESS • HOW LONG • SUSTAINABILITY • ASSOCIATED STUDENT OUTCOMES • BEHAVIOR + READING

Is the % of behavior checklist data submitted each year related to student behavior outcomes for that year?

Is the % of reading checklist data submitted each year related to student reading outcomes for that year?

Are scores on the behavior implementation checklists related to student behavior outcomes for that year?

Are scores on the reading implementation checklist for each year of implementation related to student reading outcomes for that year?
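The questions above ask whether fidelity measures covary with outcomes; one common way to express that is a correlation coefficient. A minimal sketch using a hand-rolled Pearson r (all fidelity scores and outcome values below are invented, not study data):

```python
# Hypothetical sketch: correlate schools' checklist fidelity scores with a
# student outcome, e.g. office discipline referrals (ODRs) per 100 students.

def pearson_r(xs: list[float], ys: list[float]) -> float:
    """Pearson correlation coefficient of two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

fidelity = [55.0, 70.0, 80.0, 90.0, 95.0]   # checklist total %
odrs     = [120.0, 95.0, 80.0, 60.0, 50.0]  # ODRs per 100 students
print(round(pearson_r(fidelity, odrs), 2))  # strongly negative
```

A negative r here would be consistent with the hypothesis that higher implementation fidelity accompanies fewer discipline referrals, though correlation alone does not establish causation.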

THE PROCESS • HOW LONG • SUSTAINABILITY • ASSOCIATED STUDENT OUTCOMES • BEHAVIOR + READING

What is the impact on student outcomes when schools meet criteria on none, some, or all of the implementation checklists?

Limitations
– Self-report implementation measures
– Limited number of schools in earlier cohorts
– We don’t know what specific factors have impacted implementation

Remember...
More data is not necessarily better. Data should have a purpose:
– It should help us make well-informed decisions that will improve outcomes for students.