Program Measure Review – Considerations for the APR
Jennifer Coffey, PhD, OSEP Program Lead
"Continuous improvement is better than delayed perfection." (Mark Twain)

Roll Call

Mute: *6 | Unmute: #6

SPDG National Meeting Follow-up
Resources and materials found at:
• Website: annual-spdg-national-meeting
• Dropbox folder: Meeting_Nov2013
Archived presentation recordings:
• Allison Metz's Use of Data presentation
• Jennifer's Program Measure presentation

The External Evaluation
• Pilot year – next year will be baseline
• The Data Quality Initiative (Westat)
• 2 reviewers evaluated APRs and the procedures/guidance provided to projects
• OMB review
• Overall we are doing well
• Meaningful measures
• Some concern about Program Measure 2
• Need to hear from you how we can help

Directors' Webinars Schedule:
• Apr 3 – Organization Driver: Use of Data, Program Measure Exemplars
• May 1 – Family Engagement
• Jun 1 – Organizational Driver: Facilitated Administration & Systems (Dean Fixsen)
• Jul 21 – Project Directors' Conference: Program Area Meeting (DC)
• Sep 4 – Leadership Driver
• Oct 21-23 – SPDG National Meeting (DC)

Rubric B

Considerations for your APR writing
• Match numbers (e.g., targets) across the different sections of your APR
• Give the names of all fidelity measures (status chart & description)
• Describe each as a fidelity measure (e.g., "---- measure assesses the presence or absence of the core components of ---- intervention")
• Describe the 20% reliability check by an outside observer in the "Explanation of Progress" section (after the status chart)

Further considerations…
• Choose (working with your PO) one fidelity measure to follow for each initiative
• Create the target for that fidelity measure
• Follow each initiative separately with its own data

Program Measure 2 Exemplar
• North Carolina's APR

Things not to worry about
• For Program Measures 3 and 4, having an exact dollar/participant target
• The target percentage is critical, however

Guidance for each measure

Summary of the numbers: Program Measure 1

                          Met target: Yes    Met target: No
Year 2 (13 initiatives)          4                  9
Year 3 (13 initiatives)          6                  7
Year 4 (7 initiatives)           5                  2
Total (33 initiatives)          15                 18
%                               45%                55%
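For reference, the percentage row here (and in the Measure 2 table below) is simply each column total divided by the overall number of initiatives. A minimal sketch of that arithmetic in Python, using the Measure 1 counts above (variable names are illustrative, not part of the reporting template):

```python
# Illustrative only: reproduce the summary row for Program Measure 1.
met_by_year = {"Year 2": (4, 9), "Year 3": (6, 7), "Year 4": (5, 2)}  # (yes, no)

total_yes = sum(yes for yes, _ in met_by_year.values())   # 15
total_no = sum(no for _, no in met_by_year.values())      # 18
total = total_yes + total_no                               # 33 initiatives

pct_yes = round(100 * total_yes / total)  # 45
pct_no = round(100 * total_no / total)    # 55
print(f"Met target: {pct_yes}% yes, {pct_no}% no")
```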

Measure 2

                          Met target: Yes    Met target: No
Year 2 (2 initiatives)           1                  1
Year 3 (5 initiatives)           4                  1
Year 4 (5 initiatives)           1                  4
Total (12 initiatives)           6                  6
%                               50%                50%

Measure 3

Project costs:
                          Cost for TA    Cost for all PD    % for TA
Year 2 (8 initiatives)    $2,057,004       $2,791,357         74%
Year 3 (10 initiatives)   $3,010,015       $4,078,198         74%
Year 4 (7 initiatives)    $1,511,883       $1,808,396         84%
Total (25 initiatives)    $6,578,902       $8,677,951         76%

Met target:
                          Yes    No
Year 2 (8 initiatives)     6      2
Year 3 (10 initiatives)   10      0
Year 4 (7 initiatives)     6      1
Total (25 initiatives)    22      3
%                         88%    12%
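The "% for TA" column is the cost for technical assistance divided by the cost for all professional development in that year. A minimal sketch of that calculation, using the Year 4 figures above (variable names are illustrative):

```python
# Illustrative only: "% for TA" = cost for TA / cost for all PD.
cost_ta = 1_511_883       # Year 4 cost for TA
cost_all_pd = 1_808_396   # Year 4 cost for all PD

pct_for_ta = round(100 * cost_ta / cost_all_pd)  # 84
print(f"{pct_for_ta}% of Year 4 PD dollars supported follow-up TA")
```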

Inter-rater reliability: Measure 1
When the two raters differed, they used one of several methods to determine a final rating:
1. They identified characteristics of the description that were similar to characteristics of descriptions they had rated previously, and gave it the same rating as those descriptions.
2. They each identified description elements that influenced the rating (e.g., information that was lacking from the description, critical components that were included in the description) and came to agreement on the most appropriate rating.
3. They identified description elements that were essential to the PD component and came to agreement on how to rate descriptions that were missing one or more of the critical elements.
4. They reviewed supporting documentation cited in the rubric, discussed key aspects of the PD component that should be included in the grantee's description, and came to agreement on the most appropriate rating.
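The list above describes how disagreements were reconciled, not how agreement was quantified. As a purely illustrative aside (not the DQI's documented method), a simple percent-agreement check between two raters' rubric scores could look like this:

```python
# Illustrative only: simple percent agreement between two raters.
rater_a = [3, 2, 4, 1, 3, 2]  # hypothetical rubric ratings
rater_b = [3, 2, 3, 1, 3, 2]

matches = sum(a == b for a, b in zip(rater_a, rater_b))
agreement = 100 * matches / len(rater_a)
print(f"Percent agreement: {agreement:.0f}%")  # 83% for these hypothetical ratings
```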

Next steps  The external evaluators will modify Rubric A (Program Measure 1 template/guidance)  DQI recommends specifying where to find information relevant to each component (e.g., insert a footnote with a link to the NIRN website for information specific to expectations for trainers for domain A(2) and insert a different NIRN link for information specific to adult learning principles for domain B(2)).  DQI also recommends refining descriptions of the domains and adding information about the components, particularly when a substantial number of descriptions received ratings of inadequate or barely adequate, to improve the quality of descriptions grantees provide on professional development components. 17

Next steps  Learn from SPDGs that earned good ratings for their program measures  April Webinar (Measures 1 & 2)  Evaluator Q & A session  Feedback from you via or call 18