
Evaluation in Michigan’s Model Steve Goodman National PBIS Leadership Forum October, 2010




1 Evaluation in Michigan’s Model
Steve Goodman (sgoodman@oaisd.org)
National PBIS Leadership Forum, October 2010
http://miblsi.cenmi.org

2 MiBLSi Evaluation Team (the people who make this presentation possible)
Anna Harms, Evaluation Coordinator
Ed Huth, Data Analyst
Jennifer Rollenhagen, PBIS Assessments Coordinator
Terri Metcalf, Reading Assessments Coordinator
Nikki Matthews, Data Entry, PBIS Surveys Support
Nancy Centers, DIBELS, AIMSweb Support
Donna Golden, Data Entry
Steve Goodman, MiBLSi Co-Director

3 Mission Statement To develop support systems and sustained implementation of a data-driven, problem-solving model in schools to help students become better readers with the social skills necessary for success.

4 Cumulative Totals of Supported MiBLSi Schools

5 Collecting information to evaluate implementation effects and using this information for continuous improvement
Building Staff: Student success/Intervention effectiveness
Building Leadership Team: Fidelity of implementation (across grades); Systems integrity (school); Student success (school-wide)
LEA District Leadership Team: Fidelity of implementation (across schools); Systems integrity (district-LEA); Student success (district-wide)
ISD Leadership Team: Fidelity of implementation (across districts); Systems integrity (district-ISD); Student success
MiBLSi Project: Fidelity of implementation (state); Systems integrity (project); Student success (project-wide)

6 Several Purposes of MiBLSi Assessments
Audit – for “taking stock” of current strengths/weaknesses and action planning
Formative evaluation – for improving the program while it is being implemented
Summative evaluation – for improving future iterations

7 Assessments
Elementary Schools:
Major Discipline Referrals
PBIS Self-Assessment Survey
PBIS Team Implementation Checklist
Benchmarks of Quality (BOQ)
Schoolwide Evaluation Tool (SET)
Benchmarks for Advanced Tiers (BAT)
Dynamic Indicators of Basic Early Literacy Skills (DIBELS)
Planning and Evaluation Tool (PET) for Effective Schoolwide Reading Programs
Effective Reading Support Team Implementation Checklist
Special Education Data Collection Form
Schoolwide Reading Analysis Support Page
Middle/Junior High Schools:
Major Discipline Referrals
PBIS Self-Assessment Survey
PBIS Team Implementation Checklist
Benchmarks of Quality (BOQ)
Schoolwide Evaluation Tool (SET)
ORF/MAZE through AIMSweb
School-Wide Evaluation and Planning Tool for Middle School Literacy (SWEPT)
Middle School Reading Team Implementation Checklist
Special Education Data Collection Form


11 Building Level

12 Assist Teams in Using Data for Decision-Making
First Year: Winter systems review; Spring data review
Second Year: Fall, Winter, and Spring data reviews
Third Year: Fall, Winter, and Spring data reviews

13 Assessment Booklet
Description of assessments
Data collection schedule
Data summary
Data forms and assessment forms

14 Team Evaluation of Outcome, Process and Systems Data

15 Assessment Schedule (for Cohort 7, from the MiBLSi website)

16 Video examples for completing and submitting PBIS assessments

17 Improving the Accuracy and Consistency of Recording Office Discipline Referrals

18 Developing Fluency with Discipline Referral Categories
Example Exercise 2: Match each example situation below to the correct problem behavior on the discipline categories answer sheet. Write the letter in the column for Exercise 2.

19 District Level

20 Focus on Implementing with Fidelity using Benchmarks of Quality (BoQ)/ODR, ’06-’07 and ’07-’08 (Increase 8%; Decrease 14.6%)

21 District Implementation Tracking Form

22 Leadership-Implementation Support Team Self-Assessment

23 Project Level

24 FileMaker Pro Database

25 One major activity of MiBLSi involves professional development. Over 422 training days are currently scheduled for the 2010-2011 school year.


27 On-Line Evaluation
Trainer evaluation of trainer workdays
Participant evaluation of training sessions

28 On-Line Evaluation Form

29 Trainer Work Day Questions
1. The training goals were clearly defined and reviewed frequently, with checks for understanding.
2. The trainers were knowledgeable about the training content and were able to respond to participants' questions and share experiences to support understanding.
3. The trainers presented the content in a way that promoted active engagement and opportunities for processing, working with, and/or learning the content.
4. The materials were accessible in a timely manner (posted two weeks prior to the trainer work day).
5. The trainer notes and activities of the day were a valuable use of my time in preparing for this upcoming training.
6. Potential challenges that participants may experience were highlighted, with some ideas for addressing those challenges.
7. The big ideas of the day's training were emphasized, and areas to cut or condense were described in enough detail that I am confident about how to adjust for different groups.

30 MiBLSi Project Data

31 Behavior and Reading Interaction

32 Proficiency on 4th Grade High-Stakes Reading Test and Percent of Major Discipline Referrals from Classrooms: 132 Elementary Schools

33 Average MEAP Reading Scores and Fidelity of PBIS Implementation Based on Benchmarks of Quality (*29 elementary schools from multiple districts)

34 MiBLSi Project Data Implementation Fidelity

35 Comparison of Schoolwide Evaluation Tool (SET) Scores after training and after MiBLSi Implementation

36 MiBLSi School-wide Evaluation Tool (SET) Average Scores for Elementary and Middle Schools for 2009-10

37 MiBLSi Project Data Student Outcome

38 Percent of Students Meeting DIBELS Spring Benchmark for Cohorts 1-5 (combined grades)

39 Percent of Students at DIBELS Intensive Level across year by Cohort

40 Elementary Schools with complete data sets: Average Major Discipline Referrals per 100 Students per Day
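The "major discipline referrals per 100 students per day" figure on this slide is a standard PBIS-style rate that normalizes referral counts by enrollment and by the number of school days. The deck does not show its exact formula, so the function name and sample numbers below are hypothetical; this is a minimal sketch of the common computation:

```python
def odr_per_100_per_day(total_referrals, enrollment, school_days):
    """Average major office discipline referrals (ODR)
    per 100 students per school day."""
    if enrollment <= 0 or school_days <= 0:
        raise ValueError("enrollment and school_days must be positive")
    # Normalize by units of 100 students, then by days of school
    return total_referrals / (enrollment / 100) / school_days

# Hypothetical school: 450 referrals, 500 students, 180 school days
rate = odr_per_100_per_day(450, 500, 180)
print(round(rate, 2))  # 0.5
```

Normalizing this way lets schools of different sizes, and data windows of different lengths, be compared on the same chart.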

41 Spring Reading Curriculum-Based Measurement “Established Level” for Cohorts 4-6 Middle Schools

42 Special Education Referral and Eligibility Rates for Cohorts 1-4 Schools (comparison of 2007-08 and 2008-09; *n = 84 schools)
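The slide above compares referral and eligibility rates across two school years but does not define its denominators. A common convention, used here as an assumption, is that the referral rate is referrals for evaluation divided by enrollment, and the eligibility rate is students found eligible divided by students referred. The function name and example numbers are hypothetical:

```python
def referral_and_eligibility_rates(enrollment, referred, eligible):
    """Assumed definitions:
    referral rate    = students referred for evaluation / enrollment
    eligibility rate = students found eligible / students referred"""
    referral_rate = referred / enrollment
    eligibility_rate = eligible / referred if referred else 0.0
    return referral_rate, eligibility_rate

# Hypothetical school: 400 students, 20 referred, 12 found eligible
r, e = referral_and_eligibility_rates(400, 20, 12)
print(f"{r:.1%}, {e:.0%}")  # 5.0%, 60%
```

Tracking both rates together matters: a falling referral rate with a rising eligibility rate suggests that earlier tiers of support are filtering out students who did not need special education.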

43 Middle Schools with Complete Data Sets: Average Major Discipline Referrals per 100 Students per Day

44 Lessons Learned
Teams need to be taught how to analyze and use data.
Emphasize directing resources to need and removing competing activities.
As we grow, it is even more important to systematically gather accurate data and then act on that data for continuous improvement.

45 “Even if you’re on the right track, you’ll get run over if you just sit there” - Will Rogers




