1
DEVELOPING AND IMPLEMENTING STATE-LEVEL EVALUATION SYSTEMS
Bob Algozzine, Heather Reynolds, and Steve Goodman
National PBIS Leadership Forum
Hyatt Regency O'Hare, Rosemont, Illinois
October 27, 2011
2
Objectives
Describe core features of an effective evaluation system
- Evidence to document the program, initiative, or intervention
- Evidence to improve and support continuation
- Evidence to direct policies and practices
Share ongoing and exemplary state-level evaluations
Provide an opportunity for question-and-answer collaboration
3
Program Evaluation Simplified
Design/Plan [Redesign/Re-Plan]
Implement Intentionally and Document Fidelity
Assess Continuously and Document Intended and Unintended Outcomes
4
Core Features of an Effective Evaluation System
An effective evaluation has a clearly defined purpose that tells a story that helps to…
- Document the program, initiative, or intervention: context, input, fidelity, and impact evidence
- Improve and support continuation: stages of innovation and continuous improvement evidence
- Direct policies and practices: efficient and effective reporting and dissemination of evidence
5
Document Program, Initiative, or Intervention
A simple plan? Organize evidence around what you need to know and the questions you can answer.
- Why (i.e., under what circumstances, conditions, or events) was the program implemented? [Statement of the problem and data on which to build the evaluation…]
- What program was implemented? [Program description including key features…]
- What other programs were considered? Why was this program selected over other programs?
- How was the program implemented? [Pilot sites, administrative dictum, widespread panic, quiet riot, volunteers…]
- Was the program implemented with fidelity sufficient to produce change?
- What short-, intermediate-, and long-term changes resulted from implementing the program? Improvements in school and classroom ecology? Improvements in academic and social behavior?
- Did implementation improve the capacity of the state/district to continue the program?
An important reminder: What you need to know and the questions you can answer will depend on where you are in the implementation process (Exploration, Installation, Implementation, Continuation, Innovation) and on the type of evidence (Context, Input, Fidelity, Impact); a sketch of one way to keep these aligned follows.
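As a purely illustrative aid (not part of the original slides), the sketch below shows one way a state evaluation team might record these questions as structured data so that each question stays tied to an evidence type, an implementation stage, and its data sources. All names, fields, and example entries here are hypothetical.

```python
# Hypothetical sketch: organize the slide's evaluation questions as data,
# tying each question to an evidence type and an implementation stage.
from dataclasses import dataclass

@dataclass
class EvaluationQuestion:
    question: str        # what you need to know
    evidence: str        # Context, Input, Fidelity, or Impact
    stage: str           # Exploration, Installation, Implementation, Continuation, Innovation
    data_sources: list   # where the answer will come from

plan = [
    EvaluationQuestion(
        question="Why was the program implemented?",
        evidence="Context",
        stage="Exploration",
        data_sources=["needs assessment", "baseline discipline data"],
    ),
    EvaluationQuestion(
        question="Was the program implemented with fidelity sufficient to produce change?",
        evidence="Fidelity",
        stage="Implementation",
        data_sources=["SET", "BoQ", "TIC"],
    ),
]

# Group questions by the evidence they require, e.g., to outline an annual report.
by_evidence = {}
for q in plan:
    by_evidence.setdefault(q.evidence, []).append(q.question)
print(by_evidence)
```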
6
Documenting Program Context and Input
What to collect and report?
- Information about need and intervention
- Information about national, state, and local education agency leadership personnel and program providers
- Information about program participants
- Information about the program: focus, critical features, and content; type and amount of support; perceptions and other indicators of appropriateness; expectations for change
Evidence: Context, Input
7
Documenting Program Fidelity
What to collect and report? Fidelity measures by intervention level (self-assessment, progress monitoring, and research measures):
- Universal: Self-Assessment Survey (SAS); Benchmarks of Quality (BoQ); Team Implementation Checklist (TIC); School-wide Evaluation Tool (SET)
- Secondary and Tertiary: Benchmarks of Advanced Tiers (BAT); Individual Student School-wide Evaluation Tool (I-SSET)
- Overall: Implementation Phases Inventory (IPI); Phases of Implementation (POI)
Forms available at www.pbisassessment.org
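As an illustration only (not from the slides), the brief sketch below shows how scores from instruments like these might be screened against criterion thresholds when summarizing fidelity across schools. The threshold values, field names, and example schools are assumptions for illustration; substitute the criteria defined in your own state evaluation plan.

```python
# Hypothetical sketch: screen schools' fidelity scores against criterion
# thresholds. The thresholds below are assumed for illustration only;
# replace them with the criteria your state evaluation plan actually uses.
FIDELITY_CRITERIA = {
    "SET": 80.0,   # assumed: overall implementation percentage
    "BoQ": 70.0,   # assumed: total score percentage
    "TIC": 80.0,   # assumed: percent of items fully in place
}

def meets_fidelity(scores: dict) -> bool:
    """Return True if every reported measure meets its assumed criterion."""
    return all(
        scores[measure] >= criterion
        for measure, criterion in FIDELITY_CRITERIA.items()
        if measure in scores
    )

# Invented example data.
schools = {
    "School A": {"SET": 92.0, "BoQ": 74.0},
    "School B": {"SET": 71.0, "TIC": 65.0},
}

for name, scores in schools.items():
    status = "meets" if meets_fidelity(scores) else "does not yet meet"
    print(f"{name} {status} the assumed fidelity criteria: {scores}")
```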
8
Documenting Program Impact
What to collect and report? Impact
Social Behavior Benefits
- Fidelity indicators
- School and classroom climate
- Attitudes
- Attendance
- Office Discipline Referrals (ODRs)
- Individual student points/behavior records
- Proportion of time in typical educational contexts
- Referrals to special education
Academic Behavior Benefits
- Fidelity indicators
- Instructional climate
- Attitudes
- Universal screening and progress monitoring (e.g., vocabulary, oral reading fluency)
- Standardized test scores
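To make one of these impact indicators concrete, here is a small illustrative sketch (not part of the original presentation) that computes office discipline referrals per 100 students per school day, a rate often used so that ODR counts can be compared across schools of different sizes. The example figures are invented.

```python
# Hypothetical sketch: express ODRs as a rate per 100 students per school day
# so referral counts can be compared across schools of different enrollments.
def odr_rate(total_odrs: int, enrollment: int, school_days: int) -> float:
    """Office discipline referrals per 100 students per school day."""
    return (total_odrs / school_days) / enrollment * 100

# Invented example data for three schools over a 90-day reporting period.
schools = [
    ("Elementary A", 120, 450, 90),
    ("Middle B", 640, 800, 90),
    ("High C", 910, 1300, 90),
]

for name, odrs, enrollment, days in schools:
    rate = odr_rate(odrs, enrollment, days)
    print(f"{name}: {rate:.2f} ODRs per 100 students per day")
```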
9
Core Features of an Effective Evaluation System
An effective evaluation has a clearly defined purpose that tells a story
- Evidence to Document the Program, Initiative, or Intervention: context, input, fidelity, and impact
- Evidence to Improve and Support Continuation: stages of innovation/continuous improvement cycles
- Evidence to Direct Policies and Practices: efficient and effective annual reports
10
Evidence to Improve and Support Continuation
What to collect and report?
Stages of Implementation (a 2–4 year process; Fixsen, Naoom, Blase, Friedman, & Wallace, 2005):
- Exploration
- Installation
- Initial Implementation
- Full Implementation
- Innovation
- Sustainability
Continuous Improvement Process: Design/Plan [Redesign/Re-Plan]; Implement Intentionally and Document Fidelity; Assess Continuously and Document Intended and Unintended Outcomes
11
Core Features of an Effective Evaluation System
An effective evaluation has a clearly defined purpose that tells a story
- Evidence to Document the Program, Initiative, or Intervention: context, input, fidelity, and impact
- Evidence to Improve and Support Continuation: stages of innovation/continuous improvement cycles
- Evidence to Direct Policies and Practices: efficient and effective annual reports; external support; www.pbisassessment.org; www.pbseval.org
12
Evidence to Direct, Support, and Revise Policy Decisions
Evaluation Blueprint
The OSEP Technical Assistance Center on Positive Behavioral Interventions and Supports has developed a document for individuals who are implementing School-wide Positive Behavior Intervention and Support (SWPBIS) in districts, regions, or states. The purpose of the "blueprint" is to provide a formal structure for evaluating whether implementation efforts are (a) occurring as planned, (b) resulting in change in schools, and (c) producing improvement in student outcomes.
13
Evidence to Direct, Support, and Revise Policy Decisions
North Carolina Annual Performance Report: Annual reports highlight the development and continued growth of PBIS in North Carolina, as well as indicators of fidelity of implementation and the impact PBIS is having on participating schools across the state. In addition, the reports include information about plans for sustainability through training, coaching, and partnerships with other initiatives, in particular Responsiveness to Instruction (RtI).
Michigan's Integrated Behavior and Learning Support Initiative (MiBLSi) News: http://miblsi.cenmi.org/News.aspx
Illinois Evaluation Reports: http://pbisillinois.org/
Florida's Positive Behavior Support Project: Childs, K. E., Kincaid, D., & George, H. P. (2010). A model for statewide evaluation of a universal positive behavior support initiative. Journal of Positive Behavior Interventions, 12, 198–210.
14
Evidence from Exemplary State-Level Evaluations
North Carolina: North Carolina has been implementing a statewide Positive Behavior Intervention and Support (PBIS) Initiative for 10 years. Heather Reynolds is the State PBIS Consultant.
Michigan: Michigan's Integrated Behavior and Learning Support Initiative (MiBLSi) works with schools to develop a multi-tiered system of support for both reading and behavior; PBIS is a key part of the Initiative's process for creating and sustaining safe and effective schools. Steve Goodman is Director of MiBLSi and PBIS Coordinator.
15
Presentation
Questions and Answers
Bibliography and Selected Resources
Evaluation Action Plan
16
Bibliography and Selected Resources
Abma, T. A., & Stake, R. E. (2001). Stake's responsive evaluation: Core ideas and evolution. In J. C. Greene & T. A. Abma (Eds.), New directions for evaluation: No. 92. Responsive evaluation (pp. 7-21). San Francisco: Jossey-Bass.
Algozzine, B., Horner, R. H., Sugai, G., Barrett, S., Dickey, S. R., Eber, L., Kincaid, D., et al. (2011). Evaluation blueprint for school-wide positive behavior support. Eugene, OR: National Technical Assistance Center on Positive Behavior Interventions and Support. Retrieved from www.pbis.org
Fixsen, D. L., Naoom, S. F., Blase, K. A., Friedman, R. M., & Wallace, F. (2005). Implementation research: A synthesis of the literature (FMHI Publication #231). Tampa, FL: Florida Mental Health Institute, The National Implementation Research Network.
Ruhe, V., & Zumbo, B. D. (2009). Evaluation in distance education and e-learning. New York: Guilford.
Scriven, M., & Coryn, C. L. S. (2008). The logic of research evaluation. In C. L. S. Coryn & M. Scriven (Eds.), Reforming the evaluation of research (New Directions for Evaluation, No. 118, pp. 89-106). San Francisco, CA: Jossey-Bass.
Stufflebeam, D. L. (2001). Evaluation models. New directions for evaluation: No. 89 (pp. 7-98). San Francisco: Jossey-Bass.
Stufflebeam, D. L., & Shinkfield, A. J. (2007). Evaluation theory, models, and applications. San Francisco: Jossey-Bass/Pfeiffer.
The Evaluation Center. (2011). Evaluation checklists. Kalamazoo, MI: Western Michigan University. Retrieved from http://www.wmich.edu/evalctr/checklists/
The Joint Committee on Standards for Educational Evaluation. (1994). The program evaluation standards. Thousand Oaks, CA: Sage Publications.
17
Evaluation Action Plan