NH RESPONDS Evaluation Component
Pat Mueller & David Merves
October 6, 2008


Why We Collect Data/Evaluate?
- Somebody said you had to
- Inform instruction
- School improvement
- Local, state, and federal accountability
- Public information
- Choose/set policy
- Marketing
- Because that's what all the cool kids are doing…

State Personnel Development Improvement Grants (SIG/SPDGs)
SIG/SPDGs are measured against:
- OSEP program performance measures
- NH RESPONDS performance measures

SPDG Program Performance Measures
- % of personnel receiving professional development (PD) on scientific- or evidence-based instructional practices.
- % of projects that have implemented PD/training activities that are aligned with improvement strategies in the State Performance Plan (SPP).
- % of PD/training activities provided that are based on scientific- or evidence-based instructional/behavioral practices.

SPDG Program Performance Measures
- % of PD/training activities that are sustained through ongoing and comprehensive practices (e.g., mentoring, coaching).
- % of SPDG projects that successfully replicate the use of scientifically based or evidence-based instructional/behavioral practices in schools.
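Percentage measures like these can be computed directly from a PD activity log. A minimal sketch in Python; the field names (`evidence_based`, `spp_aligned`) and the sample log entries are invented for illustration, not part of OSEP's reporting format:

```python
# Hypothetical sketch: computing SPDG-style performance percentages
# from a professional-development activity log. Field names and sample
# data are invented for illustration.

def percent_meeting(activities, predicate):
    """Percentage of logged activities for which predicate holds."""
    if not activities:
        return 0.0
    hits = sum(1 for a in activities if predicate(a))
    return 100.0 * hits / len(activities)

pd_log = [
    {"name": "PBIS coaching",      "evidence_based": True,  "spp_aligned": True},
    {"name": "Literacy workshop",  "evidence_based": True,  "spp_aligned": False},
    {"name": "Awareness session",  "evidence_based": False, "spp_aligned": True},
]

print(percent_meeting(pd_log, lambda a: a["evidence_based"]))
print(percent_meeting(pd_log, lambda a: a["spp_aligned"]))
```

The same helper can be reused for each measure by swapping the predicate, which keeps the reporting logic in one place.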

Steps for Conducting an Evaluation
1. Select the criteria and standards
2. Develop logic model
3. Prepare an evaluation plan
4. Collect data
5. Analyze data & understand results
6. Communicate the findings

1. Define the Criteria to Be Evaluated
Terminology:
- Goals → Long-Term Outcomes or Impact
- Objectives → Short-Term & Intermediate Outcomes
- Activities → Outputs

Two Types of Evaluation Standards
Process/Formative: assesses ongoing project activities.
- Begins at program implementation and continues throughout the program.
- Is the program being delivered as planned?
- Is the program progressing toward its goals and objectives?

Two Types of Evaluation Standards
Outcome/Summative: assesses the program's success and whether the program or initiative had an impact.
- Compares actual results to projected goals/objectives.
- Typically used for decision-making purposes.
- Important to look for unanticipated outcomes.

2. Logic Models
- A conceptual model that links an initiative's goals and objectives with expected outputs and/or outcomes.
- Numerous types of logic models exist.
- There are many other methods of illustrating the conceptual framework of an initiative.
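One lightweight way to make those links explicit is to record the model as structured data, so each objective is tied to its outputs and outcomes and the links can be checked mechanically. A sketch under invented structure and field names (this is not a prescribed logic-model format):

```python
# Hypothetical sketch: a logic model as structured data, linking an
# objective to its outputs, short-term outcomes, and long-term outcomes.
# Structure, field names, and example content are invented for illustration.

logic_model = {
    "goal": "Improve behavioral and literacy outcomes statewide",
    "objectives": [
        {
            "objective": "Train school teams in PBIS",
            "outputs": ["# of teams trained", "PD sessions delivered"],
            "short_term_outcomes": ["Teams implement universal supports"],
            "long_term_outcomes": ["Reduced office discipline referrals"],
        },
    ],
}

# Quick consistency check: every objective names at least one output
# and one long-term outcome.
for obj in logic_model["objectives"]:
    assert obj["outputs"] and obj["long_term_outcomes"]
```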

3. Writing an Evaluation Plan
Components to include:
- Program goal/objectives
- Evaluation questions
- Performance indicators
- Data collection procedures
- Data analysis method
- Person responsible
- Timeline
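Evaluation plans with these components are often laid out as a table, one row per objective. A minimal sketch that emits such a table as CSV; the column names mirror the components above, and the example row's content is invented for illustration:

```python
# Hypothetical sketch: an evaluation-plan row carrying the components
# listed above, written out as CSV. Example content is invented.
import csv
import io

FIELDS = [
    "objective", "evaluation_question", "performance_indicator",
    "data_collection", "analysis_method", "person_responsible", "timeline",
]

rows = [{
    "objective": "Sustain PBIS implementation with fidelity",
    "evaluation_question": "Are schools implementing PBIS as planned?",
    "performance_indicator": "80% of Benchmarks of Quality items in place",
    "data_collection": "Annual Benchmarks of Quality administration",
    "analysis_method": "Descriptive summary by school",
    "person_responsible": "External evaluator",
    "timeline": "Each spring",
}]

buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=FIELDS)
writer.writeheader()
writer.writerows(rows)
print(buf.getvalue())
```

Keeping the plan in a machine-readable form makes it easy to track which objectives have indicators, owners, and timelines attached.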

4. Collecting Data
Process/formative data:
- Amount and type of PD provided
- Satisfaction and utility of PD provided
- Products developed
- These data tend to be gathered by those providing PD.
Outcome/summative data:
- Reduced office discipline referrals
- Reduced suspensions/expulsions
- Improved reading scores
- These data tend to be collected from the LEA or SEA.

NH RESPONDS Data Collection Tools
- PD Activity Log completed by TA/PD providers
- Minutes:
  - Leadership Team meetings
  - Workgroup meetings
  - School & District Improvement Team minutes/products
- Surveys/interviews/focus groups:
  - Annual Participating Personnel Survey (March/April)
  - Workshop surveys

NH RESPONDS Data Collection Tools
- Fidelity instruments (for PBIS & Literacy):
  - Benchmarks of Quality
  - School-wide Evaluation Tool (SET)
- Existing data:
  - Office discipline referrals
  - Suspension/expulsion data
  - Reading scores

5. Analyzing Data & Understanding Results
SAU/school level:
- Analysis of student performance to improve instruction (e.g., reading scores)
- Analysis of school-level data to improve safety and/or climate (e.g., SET results)
Project/grant level:
- Analysis of formative data for program improvement purposes
- Aggregate analysis of all outcome data to describe the impact of NH RESPONDS
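As a concrete illustration of aggregate outcome analysis, a minimal sketch summarizing year-over-year change in office discipline referrals (ODRs). School names and all numbers are invented for illustration:

```python
# Hypothetical sketch: summarizing change in office discipline referrals
# between a baseline year and a follow-up year. All data are invented.

baseline = {"School A": 420, "School B": 310}
followup = {"School A": 355, "School B": 330}

for school in baseline:
    change = followup[school] - baseline[school]
    pct = 100.0 * change / baseline[school]
    direction = "decrease" if change < 0 else "increase"
    print(f"{school}: {abs(change)} ODR {direction} ({pct:+.1f}%)")
```

The same pattern extends to suspensions/expulsions and reading scores; the point is simply to compare each outcome measure against its baseline before aggregating across sites.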

6. Communicate the Findings (Reporting)
- Provide ongoing feedback to project management (what's working/what's not).
- Provide ongoing data to the NH Bureau related to completion of objectives and success of project efforts.
- Provide an annual report to the U.S. Department of Education related to completion of objectives and success of project efforts.

(802) Evergreen Educational Consulting