DEVELOPING AND IMPLEMENTING STATE-LEVEL EVALUATION SYSTEMS. Bob Algozzine, Heather Reynolds, and Steve Goodman. National PBIS Leadership Forum, Hyatt Regency O’Hare, Rosemont, Illinois, October 27, 2011.

Presentation transcript:

DEVELOPING AND IMPLEMENTING STATE-LEVEL EVALUATION SYSTEMS
Bob Algozzine, Heather Reynolds, and Steve Goodman
National PBIS Leadership Forum
Hyatt Regency O’Hare, Rosemont, Illinois
October 27, 2011

Objectives
- Describe core features of an effective evaluation system
  - Evidence to document the program, initiative, or intervention
  - Evidence to improve and support continuation
  - Evidence to direct policies and practices
- Share ongoing and exemplary state-level evaluations
- Provide an opportunity for question-and-answer collaboration

Program Evaluation Simplified
1. Design/Plan [Redesign/Re-Plan]
2. Implement Intentionally and Document Fidelity
3. Assess Continuously and Document Intended and Unintended Outcomes

Core Features of an Effective Evaluation System
An effective evaluation has a clearly defined purpose that tells a story that helps to…
- document the program, initiative, or intervention
  - context, input, fidelity, and impact evidence
- improve and support continuation
  - stages of innovation and continuous improvement evidence
- direct policies and practices
  - efficient and effective reporting and dissemination of evidence

Document the Program, Initiative, or Intervention
A simple plan? Organize evidence around what you need to know and the questions you can answer.
- Why (i.e., what circumstances, conditions, or events) was the program implemented? [Statement of the problem and data on which to build the evaluation…]
- What program was implemented? [Program description including key features…]
- What other programs were considered?
- Why was the program selected over other programs?
- How was the program implemented? [Pilot sites, administrative dictum, widespread panic, quiet riot, volunteers…]
- Was the program implemented with fidelity sufficient to produce change?
- What short-, intermediate-, and long-term changes resulted from implementing the program?
  - Improvements in school and classroom ecology?
  - Improvements in academic and social behavior?
- Did implementation improve the capacity of the state/district to continue the program?
An important reminder: what you need to know and the questions you can answer will depend on where you are in the implementation process.
[Figure: implementation stages (Exploration, Installation, Implementation, Continuation, Innovation) aligned with evidence types (Context, Input, Fidelity, Impact)]
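One way to act on that reminder is to treat the guiding questions as data, so a leadership team can pull only the questions that match its current stage. The sketch below is not from the presentation: the question wording is condensed from the list above, and the stage-to-evidence mapping is an illustrative assumption.

```python
# Hypothetical sketch: organize evaluation questions by evidence type and
# implementation stage. The stage-to-evidence mapping is illustrative only.

EVALUATION_QUESTIONS = {
    "context": [
        "Why was the program implemented?",
        "What other programs were considered, and why was this one selected?",
    ],
    "input": [
        "What program was implemented, and what are its key features?",
        "How was the program implemented (pilot sites, volunteers, mandate)?",
    ],
    "fidelity": [
        "Was the program implemented with fidelity sufficient to produce change?",
    ],
    "impact": [
        "What short-, intermediate-, and long-term changes resulted?",
        "Did implementation improve state/district capacity to continue?",
    ],
}

# Assumed emphasis of each implementation stage (illustration, not a standard).
STAGE_EVIDENCE = {
    "exploration": ["context"],
    "installation": ["context", "input"],
    "implementation": ["input", "fidelity"],
    "continuation": ["fidelity", "impact"],
    "innovation": ["impact"],
}

def questions_for_stage(stage: str) -> list[str]:
    """Return the evaluation questions most relevant to a given stage."""
    return [q for evidence in STAGE_EVIDENCE[stage]
              for q in EVALUATION_QUESTIONS[evidence]]

if __name__ == "__main__":
    for q in questions_for_stage("implementation"):
        print("-", q)
```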

Documenting Program Context and Input
What to collect and report? [Evidence types: Context, Input]
- Information about need and intervention
- Information about national, state, and local education agency leadership personnel and program providers
- Information about program participants
- Information about the program
  - Focus, critical features, and content
  - Type and amount of support
  - Perceptions and other indicators of appropriateness
  - Expectations for change

Documenting Program Fidelity
What to collect and report? [Evidence type: Fidelity]

Intervention Level | Self-Assessment Measures | Progress Monitoring Measures | Research Measures
Universal | Self-Assessment Survey (SAS) | Benchmarks of Quality (BoQ), Team Implementation Checklist (TIC) | School-wide Evaluation Tool (SET)
Secondary and Tertiary | Benchmarks of Advanced Tiers (BAT) | — | Individual Student School-wide Evaluation Tool (I-SSET)
Overall | Implementation Phases Inventory (IPI) | Phases of Implementation (POI) | —

Forms on www.pbisassessment.org
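Teams often summarize these fidelity scores with simple criterion checks. The sketch below is illustrative only: the 80% (SET, TIC) and 70-point (BoQ) criteria are commonly cited benchmarks in the SWPBIS literature, not values given on this slide, and the BAT threshold is purely a placeholder.

```python
# Illustrative sketch (not from the presentation): compare a school's fidelity
# scores to commonly cited criteria and flag which tools are "at fidelity".

FIDELITY_CRITERIA = {
    "SET": 80.0,   # School-wide Evaluation Tool: often an 80% criterion
    "BoQ": 70.0,   # Benchmarks of Quality: often a 70-point criterion
    "TIC": 80.0,   # Team Implementation Checklist: often 80% of items in place
    "BAT": 80.0,   # Benchmarks of Advanced Tiers: placeholder value for illustration
}

def at_fidelity(scores: dict[str, float]) -> dict[str, bool]:
    """Return True/False per tool, for tools with a defined criterion."""
    return {tool: score >= FIDELITY_CRITERIA[tool]
            for tool, score in scores.items() if tool in FIDELITY_CRITERIA}

# Example: one school's most recent scores
print(at_fidelity({"SET": 86.0, "BoQ": 64.0}))  # {'SET': True, 'BoQ': False}
```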

Documenting Program Impact
What to collect and report? [Evidence type: Impact]
- Social Behavior Benefits
  - Fidelity Indicators
  - School and Classroom Climate
  - Attitudes
  - Attendance
  - Office Discipline Referrals (ODRs)
  - Individual Student Points/Behavior Records
  - Proportion of Time in Typical Educational Contexts
  - Referrals to Special Education
- Academic Behavior Benefits
  - Fidelity Indicators
  - Instructional Climate
  - Attitudes
  - Universal Screening and Progress Monitoring (e.g., vocabulary, oral reading fluency)
  - Standardized Test Scores
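Raw ODR counts are hard to compare across schools of different sizes and reporting periods, so behavior data systems commonly express them as a rate, for example ODRs per 100 students per school day. A minimal sketch of that calculation follows, assuming total referrals, enrollment, and the number of school days are known; the metric is a common convention, not a formula stated on this slide.

```python
# Minimal sketch: convert raw office discipline referral (ODR) counts into a
# rate per 100 students per school day so schools of different sizes compare.

def odr_rate_per_100_per_day(total_odrs: int, enrollment: int, school_days: int) -> float:
    """Office discipline referrals per 100 students per school day."""
    return (total_odrs / enrollment / school_days) * 100

# Example: 450 ODRs in a school of 600 students over 180 school days
rate = odr_rate_per_100_per_day(450, 600, 180)
print(f"{rate:.3f} ODRs per 100 students per day")  # about 0.417
```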

Core Features of an Effective Evaluation System
An effective evaluation has a clearly defined purpose that tells a story.
- Evidence to Document the Program, Initiative, or Intervention
  - context, input, fidelity, and impact
- Evidence to Improve and Support Continuation
  - stages of innovation/continuous improvement cycles
- Evidence to Direct Policies and Practices
  - efficient and effective annual reports

Evidence to Improve and Support Continuation
What to collect and report?
Stages of Implementation (2–4 years; Fixsen, Naoom, Blase, Friedman, & Wallace, 2005):
- Exploration
- Installation
- Initial Implementation
- Full Implementation
- Innovation
- Sustainability
Continuous Improvement Process: Design/Plan [Redesign/Re-Plan] → Implement Intentionally and Document Fidelity → Assess Continuously and Document Intended and Unintended Outcomes

Core Features of an Effective Evaluation System
An effective evaluation has a clearly defined purpose that tells a story.
- Evidence to Document the Program, Initiative, or Intervention
  - context, input, fidelity, and impact
- Evidence to Improve and Support Continuation
  - stages of innovation/continuous improvement cycles
- Evidence to Direct Policies and Practices
  - efficient and effective annual reports
  - external support: www.pbisassessment.org, www.pbseval.org

Evidence to Direct, Support, and Revise Policy Decisions
Evaluation Blueprint
The OSEP Technical Assistance Center on Positive Behavioral Interventions and Supports has developed a document for individuals who are implementing School-wide Positive Behavior Intervention and Support (SWPBIS) in districts, regions, or states. The purpose of the “blueprint” is to provide a formal structure for evaluating whether implementation efforts are (a) occurring as planned, (b) resulting in change in schools, and (c) producing improvement in student outcomes.

Evidence to Direct, Support, and Revise Policy Decisions
North Carolina Annual Performance Report
Annual reports highlight the development and continued growth of PBIS in North Carolina, as well as indicators of fidelity of implementation and the impact PBIS is having on participating schools across the state. In addition, the reports include information about plans for sustainability through training, coaching, and partnerships with other initiatives, in particular Responsiveness to Instruction (RtI).
Other state evaluation reports:
- Michigan’s Integrated Behavior and Learning Support Initiative (MiBLSi) News
- Illinois Evaluation Reports
- Florida’s Positive Behavior Support Project
- Childs, K. E., Kincaid, D., & George, H. P. (2010). A model for statewide evaluation of a universal positive behavior support initiative. Journal of Positive Behavior Interventions, 12, 198–210.

Evidence from Exemplary State-Level Evaluations
North Carolina
North Carolina has been implementing a statewide Positive Behavior Intervention and Support (PBIS) Initiative for 10 years. Heather Reynolds is the State PBIS Consultant.
Michigan
Michigan’s Integrated Behavior and Learning Support Initiative (MiBLSi) works with schools to develop a multi-tiered system of support for both reading and behavior; PBIS is a key part of the Initiative’s process for creating and sustaining safe and effective schools. Steve Goodman is the Director of MiBLSi and PBIS Coordinator.

Presentation
Questions and Answers
Bibliography and Selected Resources
Evaluation Action Plan

Abma, T. A., & Stake, R. E. (2001). Stake’s responsive evaluation: Core ideas and evolution. In J. C. Greene & T. A. Abma (Eds.), New directions for evaluation: No. 92. Responsive evaluation (pp. 7-21). San Francisco: Jossey-Bass. Algozzine, B., Horner, R. H., Sugai, G., Barrett, S., Dickey, S. R., Eber, L., Kincaid, D., et al. (2011). Evaluation blueprint for school-wide positive behavior support. Eugene, OR: National Technical Assistance Center on Positive Behavior Interventions and Support. Retrieved from Fixsen, D. L., Naoom, S. F., Blase, K. A., Friedman, R. M., & Wallace, F. (2005). Implementation research: A synthesis of the literature. Tampa, FL: Florida Mental Health Institute, The National Implementation Research Network (FMHI Publication #231).FMHI Publication #231 Ruhe, V., & Zumbo, B. D. (2009). Evaluation in distance education and e-learning. New York: Guilford. Scriven, M., & Coryn, C. L. S. (2008). The logic of research evaluation. In C. L. S. Coryn & M. Scriven (Eds.), Reforming the evaluation of research. New Directions for Evaluation, No. 118, pp ). San Francisco, CA: Jossey-Bass. Stufflebeam, D. L. (2001). Evaluation models. In D. L. Stufflebeam, New directions for evaluation: No. 89. Responsive evaluation (pp. 7-98). San Francisco: Jossey-Bass. Stufflebeam, D. L., & Shinkfield, A. J. (2007). Evaluation theory, models, and applications. San Francisco: Jossey-Bass/Pfeiffer. The Evaluation Center. (2011). Evaluation checklists. Kalamazoo, MI: Western Michigan University. Retrieved from The Joint Committee on Standards for Educational Evaluation (1994). The program evaluation standards. Thousand Oaks, CA: Sage Publications, Inc. Bibliography and Selected Resources

Evaluation Action Plan