Program Measure Review – Considerations for the APR
Jennifer Coffey, PhD, OSEP Program Lead
"Continuous improvement is better than delayed perfection." – Mark Twain
Roll Call
Mute: *6 | Unmute: #6
SPDG National Meeting Follow-up
Resources and materials found at:
- Website: http://signetwork.org/content_pages/239-3rd-annual-spdg-national-meeting
- Dropbox folder: https://www.dropbox.com/home/SPDG%20National%20Meeting_Nov2013
Archived presentation recordings:
- Allison Metz's Use of Data presentation
- Jennifer's Program Measure presentation
The External Evaluation
- Pilot year – next year will be baseline
- The Data Quality Initiative (Westat): 2 reviewers evaluated APRs and the procedures/guidance provided to projects
- OMB review: overall we are doing well, with meaningful measures; some concern about Program Measure 2
- We need to hear from you about how we can help
Directors' Webinars Schedule:
- Apr 3: Organization Driver: Use of Data; Program Measure Exemplars
- May 1: Family Engagement
- Jun 1: Organizational Driver: Facilitated Administration & Systems (Dean Fixsen)
- Jul 21: Project Directors' Conference: Program Area Meeting (DC)
- Sep 4: Leadership Driver
- Oct 21-23: SPDG National Meeting (DC)
Rubric B
Considerations for your APR writing
- Match the numbers (e.g., targets) across the different sections of your APR
- Give the names of all fidelity measures (status chart & description)
- Describe each as a fidelity measure (e.g., "The ---- measure assesses the presence or absence of the core components of the ---- intervention")
- Describe the 20% reliability check by an outside observer in the "Explanation of Progress" section (after the status chart)
Further considerations…
- Choose (working with your PO) 1 fidelity measure to follow for each initiative
- Create the target for that fidelity measure
- Follow each initiative separately, with its own data
Program Measure 2 Exemplar
North Carolina's APR
Things not to worry about
For Program Measures 3 and 4: having an exact dollar/participant target. The target percentage, however, is critical.
Guidance for each measure
Summary of the numbers: Program Measure 1

                          Met Target
                          Yes    No
Year 2 (13 initiatives)     4     9
Year 3 (13 initiatives)     6     7
Year 4 (7 initiatives)      5     2
Total (33 initiatives)     15    18
%                          45%   55%
Measure 2

                          Met Target
                          Yes    No
Year 2 (2 initiatives)      1     1
Year 3 (5 initiatives)      4     1
Year 4 (5 initiatives)      1     4
Total (12 initiatives)      6     6
%                          50%   50%
Measure 3: Project Costs

                          Cost for TA   Cost for all PD   % for TA   Met Target
                                                                     Yes    No
Year 2 (8 initiatives)    $2,057,004    $2,791,357        74%          6     2
Year 3 (10 initiatives)   $3,010,015    $4,078,198        74%         10     0
Year 4 (7 initiatives)    $1,511,883    $1,808,396        84%          6     1
Total (25 initiatives)    $6,578,902    $8,677,951        76%         22     3
%                                                                    88%   12%
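The totals and percentages in the three tables above are straightforward aggregates of the per-year counts. As an illustration only (not part of the original slides), a minimal Python sketch that reproduces the Measure 3 roll-up from the per-year figures reported in the table:

```python
# Minimal sketch (illustrative only): reproduce the Measure 3 aggregates
# from the per-year figures in the table above.

rows = {
    "Year 2 (8 initiatives)":  {"ta": 2_057_004, "all_pd": 2_791_357, "yes": 6,  "no": 2},
    "Year 3 (10 initiatives)": {"ta": 3_010_015, "all_pd": 4_078_198, "yes": 10, "no": 0},
    "Year 4 (7 initiatives)":  {"ta": 1_511_883, "all_pd": 1_808_396, "yes": 6,  "no": 1},
}

ta_total = sum(r["ta"] for r in rows.values())       # $6,578,902
pd_total = sum(r["all_pd"] for r in rows.values())   # $8,677,951
yes_total = sum(r["yes"] for r in rows.values())     # 22
no_total = sum(r["no"] for r in rows.values())       # 3
n = yes_total + no_total                             # 25 initiatives

print(f"Cost for TA:     ${ta_total:,}")
print(f"Cost for all PD: ${pd_total:,}")
print(f"% for TA:        {ta_total / pd_total:.0%}")                 # 76%
print(f"Met target:      {yes_total / n:.0%} yes, {no_total / n:.0%} no")  # 88% / 12%
```

The same pattern – sum the Yes/No counts and divide by the total number of initiatives – yields the 45%/55% split for Measure 1 and the 50%/50% split for Measure 2.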
Inter-rater reliability: Measure 1
When the 2 raters differed, they used one of several methods to determine a final rating:
1. They identified characteristics of the description that were similar to characteristics of descriptions they had rated previously, and gave it the same rating as those previously rated descriptions.
2. They each identified the description elements that influenced their rating (e.g., information that was lacking from the description, critical components that were included in it) and came to agreement on the most appropriate rating.
3. They identified description elements that were essential to the PD component and came to agreement on how to rate descriptions that were missing one or more of the critical elements.
4. They reviewed the supporting documentation cited in the rubric, discussed key aspects of the PD component that should be included in the grantee's description, and came to agreement on the most appropriate rating.
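The slide describes how the two raters reconciled disagreements, but not how agreement was quantified. For context only, a minimal sketch of simple percent agreement between two raters before reconciliation; the function, scale labels, and data below are hypothetical and are not the DQI's actual procedure:

```python
# Hypothetical sketch (not the DQI's actual method): simple percent
# agreement between two raters' ratings before reconciliation.

def percent_agreement(rater_a: list[str], rater_b: list[str]) -> float:
    """Fraction of items on which the two raters gave the same rating."""
    assert len(rater_a) == len(rater_b), "raters must score the same items"
    matches = sum(a == b for a, b in zip(rater_a, rater_b))
    return matches / len(rater_a)

# Hypothetical example: five descriptions rated on an adequacy scale.
a = ["adequate", "barely adequate", "inadequate", "adequate", "good"]
b = ["adequate", "inadequate",      "inadequate", "adequate", "good"]
print(f"{percent_agreement(a, b):.0%} agreement")  # 80%
```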
Next steps
The external evaluators will modify Rubric A (the Program Measure 1 template/guidance):
- DQI recommends specifying where to find information relevant to each component (e.g., insert a footnote with a link to the NIRN website for information specific to expectations for trainers for domain A(2), and insert a different NIRN link for information specific to adult learning principles for domain B(2)).
- DQI also recommends refining the descriptions of the domains and adding information about the components, particularly where a substantial number of descriptions received ratings of inadequate or barely adequate, to improve the quality of the descriptions grantees provide of their professional development components.
Next steps
- Learn from SPDGs that earned good ratings for their program measures
- April webinar (Measures 1 & 2)
- Evaluator Q & A session
- Feedback from you via email or phone (jennifer.coffey@ed.gov; 202-245-6673)