SPDG Webinar on the Grant Performance Report for Continuation Funding
323A State Personnel Development Grants (SPDG)
Jennifer Coffey, Office of Special Education Programs, US Department of Education, Washington, DC
January 2013
Today's Agenda – Grant Performance Report for Continuation Funding
- Overview of performance reporting
- Developing performance measures
- Completing Section A of the ED 524B
- Completing Sections B and C of the ED 524B
- Program measures examples
Annual Grant Performance Report (APR)
An annual report of your activities and performance in meeting the approved objectives of the project and your responsible use of federal funds. Required for all active grants, including those in a no-cost extension (NCE). OSEP reviews the report to determine whether substantial progress has been made in order to receive continued funding or an NCE.

Requesting a No-Cost Extension (NCE)
See the Grants Management page (http://www.signetwork.org/content_pages/139). At the time the no-cost extension is requested, about 30 days before the end of the grant, the Project Officer will need a continuation report emailed to them. The Project Officer will also need to know: (1) the amount the grantee has remaining in their budget, (2) the activities the grantee wants to continue to conduct that align with approved objectives, (3) how much of the budget will be used for each activity, and (4) why the grantee was not able to spend the entire budget within 5 years. The NCE Reporting Form, available on the same page, is the form OSEP provides to states to capture information about the no-cost extension.
Overview
- Recognize strong project objectives that can be associated with high-quality performance measures
- Develop relevant, measurable, outcome-oriented performance measures related to your objectives that maximize the potential for meaningful data reporting and positive outcomes
- Complete the ED Grant Performance Report (a.k.a. APR) using form ED 524B

Why Is This Important?
High-quality objectives and measures:
- Make it easier for you to measure your progress for the purpose of grant management
- Allow you to report progress easily and quantitatively
- Establish targets (both short-term/annual and long-term)
- Allow OSEP staff to gather evidence of program effectiveness
Taken from the Center for Evaluation and Education Policy (CEEP) presentation at www.tadnet.org
Goal – Objectives – Measures
Taken from the Center for Evaluation and Education Policy (CEEP) presentation at www.tadnet.org

Project Objectives
What are you trying to accomplish? Objectives should answer this question. Preferred format for objectives: begin the objective with a verb and define a desired outcome or condition.

High-Quality Project Objectives
- Relevance: How relevant is the project objective to the overall goal of the program and/or the goal of your project?
- Applicability: How applicable is the project objective to the specific activities that are being conducted through your particular project?
- Focus: How focused is the project objective?
- Measurability: Are there concepts in the project objective that lend themselves to measurement? If so, is measurement feasible?
Taken from the Center for Evaluation and Education Policy (CEEP) presentation at www.tadnet.org
Project Objectives – Examples
- Establish a licensure program which will recruit, enroll, support, and assist paraprofessionals currently employed in an urban school district to meet state certification requirements in special education
- Implement a high-quality professional development program to help LEAs implement a multi-tiered system for behavior and academics
- Provide training that enables personnel to work with and involve parents in their child's education, including parents of low-income and limited-English-proficient children with disabilities
Performance Measures
How are you measuring your progress in meeting your objectives? Performance measures should answer this question.

A performance measure is a measurable indicator used to determine how well objectives are being met:
- How will progress be assessed?
- How much progress will constitute success?
- How will it be known if an objective or part of an objective has been achieved?
Taken from the Center for Evaluation and Education Policy (CEEP) presentation at www.tadnet.org

Two Types of Performance Measures
- Program: All grantees funded under the SPDG must report on the PDP program performance measures established by OSEP.
- Project: Each grantee reports on the approved project performance measures established to meet their project objectives.
Program Performance Measures
Program measures are established by OSEP for the SPDGs and apply to all grants funded under the SPDG. Results on these measures are reported to Congress under the Government Performance and Results Act of 1993. Please see the Program Measures Web page for more information and recorded Webinars: http://www.signetwork.org/content_pages/205

Project Performance Measures
Measures that the grantee establishes to meet their project objectives. Project performance measures can address both the process of working toward an objective and the outcome related to meeting the objective. Ensure a mix of both process and outcome measures, but most will be outcome measures.
Taken from the Center for Evaluation and Education Policy (CEEP) presentation at www.tadnet.org

High-Quality Performance Measures
High-quality performance measures show:
- What will change
- How much change you expect
- Who will achieve the change
- When the change will take place
Taken from the Center for Evaluation and Education Policy (CEEP) presentation at www.tadnet.org
Project Performance Measure Examples
- Process measure: SPDG staff (who) will hold 4 (how much) trainings with IHE faculty on how to integrate the transition curriculum into their syllabi (what) during the first and third years of the grant (when).
- Outcome measure: By the end of the third year of the grant (when), 80% of SPDG professional development participants (who) will demonstrate 100% reliability (how much) when using the self-assessment rubric established to evaluate implementation of the ---- program (what).
- Outcome measure: At the end of their third year of training (when), 90% (how much) of partner schools (who) will demonstrate a 15% improvement in the math scores of 4th grade students (what).
Common Problems
Activities are NOT performance measures. If the best response is "Yes, we did that," it is likely an activity (not a performance measure). Examples of activities:
- Establish a stakeholder group
- Hold an advisory board meeting
- Evaluate the project

Performance measures need to be measurable. Examples with measurement problems (activities rather than outcomes):
- Will maintain collaborative partnerships with parent organizations
- Increase the sustainability of the personnel development program
Taken from the Center for Evaluation and Education Policy (CEEP) presentation at www.tadnet.org
Need additional information on writing performance measures?
All grantees are strongly encouraged to seek training on writing performance measures. For further information on developing performance measures and logic models, see http://www.tadnet.org/model_and_performance

Summary
- Projects should have a few clear objectives that explain what the project is doing to support the overall goal(s)
- Each objective should have a few specific performance measures to demonstrate how progress toward meeting the objective will be measured
- Both program and project performance measures are included in the ED 524B
Completing the ED 524B
The ED 524B is a required reporting form with specific instructions. The form is used by all ED grants and has been approved by the Office of Management and Budget (OMB). Project Directors must follow the directions listed in the Dear Colleague letter and the ED 524B Instructions provided by OSEP. Word or PDF versions of the forms are available at http://www2.ed.gov/fund/grant/apply/appforms/appforms.html
Annual Performance Reports – Reporting Period
For first-year grants, the reporting period runs from the beginning of the project year to February 29, 2012. For grants in years 2–4, it runs from the end of the previous reporting period to February 29, 2012.

Budget Expenditures
Report the expenditures during the reporting period. These must be data or information from the business or grants office.

Signature and Performance Measure Status
The signatory must have authority to sign on behalf of the institution, since the grant is from the Department to the institution and not to an individual. The Authorized Representative signs, not the Project Director. Performance Measure Status: this will be checked "No," since OSEP is asking for data for the reporting period covering all years of the grant, not for this budget period. The date entered here will be the due date for your Final Performance Report, which is 90 days after the end of the grant.
Executive Summary Sheet
Provide highlights of the project's activities and the extent to which the expected outcomes and performance measures were achieved during the reporting period. Do NOT include the project abstract.

Project Status Chart
If you are a 2009, 2010, or 2011 grantee, you will begin with Program Measure 1 as your 1st project objective. Program Measure 2 will be your 2nd objective, and so on. After these program measures you will then list your project's objectives. Please see the program measures presentation for more information.

Here you identify whether the performance measure is a PROGRAM measure ("PRGM") or a PROJECT measure ("PROJ"). Note: a program measure refers to one of OSEP's 4 performance measures for the SPDGs; project measures are unique to your grant.

Quantitative Data
Depending on your measure, enter either a raw number, or a ratio and percentage. Enter the target number identified in the performance measure and then the actual data for this year. If complete data are not available for the measure, enter "999" (if no baseline) or "NA" in the "Raw Number" or "%" column, as appropriate, and provide an explanation at the bottom of the page under "Explanation of Progress."
Information to Include in the Explanation of Progress Section
- Describe the data provided (e.g., what data collection methods were used, when were the data collected, how was a sample drawn, are there missing/incomplete data, what was the response rate, was a reliability measure taken). Your Project Officer should be able to understand and interpret the number in the chart from your description in this section.
- What changes in the data occurred since the last APR (i.e., trend)?
- What activities were undertaken to achieve the targets?
- If targets were not met, what are possible reasons?
- How will activities that failed to meet targets be improved?

Additionally, a template is provided for the program measure descriptions in the Program Measure Example Continuation Report (http://www.signetwork.org/content_pages/205).
Qualitative Data
If a measure requires the collection of qualitative data, report the performance measure and its type (PRGM or PROJ) and then enter "N/A" under the Raw Number and Percentage columns. In the "Explanation of Progress" section of the page, referencing the performance measure by number, report applicable qualitative data along with other information about how these data were collected, targets, and activities; refer to the previous slide for additional content requirements.

Final Page of the Report
- Section B: Refer to the instructions for Section B in the ED 524B Instructions
- Section C: Include additional information (recruitment material, syllabi, evaluation instruments, journal articles)
Section B – Budget Information
This section is never blank! A table can be helpful.
- Provide actual expenditures for this reporting period (through 2/29/2012)
- Estimate anticipated expenditures for the rest of this budget period and the balance remaining, if any
- Explain why you did not expend funds at the expected rate
- Indicate how you plan to use the unexpended funds (carryover) in the next budget period
- Describe any significant changes to your budget resulting from modifications of project activities
- Describe any changes to your budget that affect your ability to achieve your approved project activities and/or project objectives
- Describe any anticipated changes in your budget for the next budget period that require prior approval from the Department
Any questions? Talk to your Project Officer.

Section C – Additional Information
Provide a list of current partners on your grant and indicate if:
- Any partners changed during the reporting period. If there were changes, please describe both the changes and any resulting impact on your ability to achieve approved project objectives and/or project activities.
- Any partners are anticipated to change during the next budget period. If so, please describe both the changes and any impact the change might have on your ability to achieve approved project objectives and/or project activities.
Describe any changes that you wish to make in the grant's activities for the next budget period that are consistent with the scope and objectives of your approved application.
If requesting changes to the approved Project Director and/or other key personnel, please include the person's name, title, and contact information; indicate his/her proposed start date and percentage of time working on the grant; and attach a resume or curriculum vitae to the annual performance report being submitted. Do not report on any key personnel changes that were already made during the current or previous budget period(s). Note: Departmental approval must be requested and received prior to making key personnel changes. Provide any other information about your project, including unanticipated outcomes or benefits.
Submitting the ED 524B
Submit the ED 524B at http://www.g5.gov/. Instructions for using G5 are in the continuation packet. The signed ED 524B Cover Sheet must be scanned and emailed in PDF format to your Project Officer. Special cases require regular email submission of the 524B and signed cover sheet in PDF format to your Project Officer rather than submission through G5:
- Final Performance Reports, or APRs for grants in their last performance period or in a no-cost extension
- Grants that were front-loaded (forward-funded) last year, which sometimes cannot be uploaded
The Revised SPDG Program Measures: An Overview
Program Measures Web page: http://www.signetwork.org/content_pages/205

Program Measure 1: Projects use evidence-based professional development practices to support the attainment of identified competencies.

Program Measure 2: Participants in SPDG professional development demonstrate improvement in implementation of SPDG-supported practices over time.

Program Measure 3: Projects use SPDG professional development funds to provide follow-up activities designed to sustain the use of SPDG-supported practices. (Efficiency Measure)

Program Measure 4: Highly qualified special education teachers that have participated in SPDG-supported special education teacher retention activities remain as special education teachers two years after their initial participation in these activities.
Program Measure 1: Projects use evidence-based professional development practices to support the attainment of identified competencies.

Sources on evidence-based professional development:
- Fixsen and colleagues
- Trivette and Dunst
- Guskey
- Learning Forward (formerly the National Staff Development Council)

Fixsen, D. L., Naoom, S. F., Blase, K. A., Friedman, R. M., & Wallace, F. (2005). Implementation Research: A Synthesis of the Literature. Tampa, FL: University of South Florida, Louis de la Parte Florida Mental Health Institute, The National Implementation Research Network (FMHI Publication #231). Download all or part of the monograph for free at: http://www.fpg.unc.edu/~nirn/resources/detail.cfm?resourceID=31
Two Types of Evidence-Based Practices
- Evidence-based intervention practices: insert your SPDG initiative here (identified competencies)
- Evidence-based implementation practices: professional development, competency drivers, and organization drivers

How?

Selection
- Job or role descriptions should be explicit about expectations and accountability for all positions (e.g., teachers, coaches, staff, administrators)
- Readiness measures to select at a school-building or school-district level
- Interactive interview process
(Blase, VanDyke, & Fixsen, 2010)
Training
Training must be:
› Timely
› Theory grounded (adult learning)
› Skill-based
Information from Training feeds back to Selection and feeds forward to Coaching. (Blase, VanDyke, & Fixsen, 2010)

Using Research Findings to Inform Practical Approaches to Evidence-Based Practices
Carl J. Dunst, Ph.D., and Carol M. Trivette, Ph.D., Orelena Hawks Puckett Institute, Asheville and Morganton, North Carolina. Recording and resources: http://www.signetwork.org/event_calendar/events/396. Presentation prepared for a Webinar with the Knowledge Transfer Group, U.S. Department of Health and Human Services, Children's Bureau Division of Research and Innovation, September 22, 2009.
"Adult learning refers to a collection of theories, methods, and approaches for describing the characteristics of and conditions under which the process of learning is optimized."

Phases of adult learning:
Planning
- Introduce: Engage the learner in a preview of the material, knowledge, or practice that is the focus of instruction or training
- Illustrate: Demonstrate or illustrate the use or applicability of the material, knowledge, or practice for the learner
Application
- Practice: Engage the learner in the use of the material, knowledge, or practice
- Evaluate: Engage the learner in a process of evaluating the consequence or outcome of the application of the material, knowledge, or practice
Deep Understanding
- Reflection: Engage the learner in self-assessment of his or her acquisition of knowledge and skills as a basis for identifying "next steps" in the learning process
- Mastery: Engage the learner in a process of assessing his or her experience in the context of some conceptual or practical model or framework, or some external set of standards or criteria
Donovan, M. et al. (Eds.) (1999). How people learn. Washington, DC: National Academy Press.
The smaller the number of persons participating in a training (<20), the larger the effect sizes for the study outcomes. The more hours of training over an extended number of sessions, the better the study outcomes.

Effect Sizes for Introducing Information to Learners
Practice                                 | Studies | Effect sizes | Mean effect size (d) | 95% CI
Pre-class exercises                      | 9       | 9            | 1.02                 | .63–1.41
Out-of-class activities/self-instruction | 12      | 20           | .76                  | .44–1.09
Classroom/workshop lectures              | 26      | 108          | .68                  | .47–.89
Dramatic readings                        | 18      | 40           | .35                  | .13–.57
Imagery                                  | 7       | 18           | .34                  | .08–.59
Dramatic readings/imagery                | 4       | 11           | .15                  | −.33–.62

Effect Sizes for Self-Assessment of Learner Mastery
Practice                   | Studies | Effect sizes | Mean effect size (d) | 95% CI
Standards-based assessment | 13      | 44           | .76                  | .42–1.10
Self-assessment            | 16      | 29           | .67                  | .39–.95
To be most effective, training needs to actively involve the learners in judging the consequences of their learning experiences (evaluate, reflection, and mastery):
› Learner participation in learning new knowledge or practice
› Learner engagement in judging his or her experience in learning and using new material

Coaching
- Design a coaching service delivery plan
- Develop accountability structures for coaching – coach the coach!
- Identify ongoing professional development for coaches
(Blase, VanDyke, & Fixsen, 2010)

Performance assessment:
- Must be a transparent process
- Uses multiple data sources
- Fidelity of implementation should be assessed at the local, regional, and state levels
- Tied to positive recognition
Information from this driver feeds back to Selection, Training, and Coaching and feeds forward to the Organization Drivers.
Assess fidelity of implementation at all levels and respond accordingly. Identify outcome measures that are:
› Intermediate and longer-term
› Socially valid
› Technically adequate: reliable and valid
› Relevant, feasible to gather, useful for decision making, widely shared, and reported frequently

A building/district leadership and implementation team is formed:
› The team uses feedback and data to improve the implementation drivers
› Policies and procedures are developed and revised to support the new ways of work
› The team solicits and analyzes feedback from staff and stakeholders

Leadership analyzes feedback from staff and makes changes to alleviate barriers and facilitate implementation, revising policies and procedures to support the new way of work.
SPDG Professional Development Rubric
Five domains, each with components:
- Selection
- Training
- Coaching
- Performance assessment/data-based decision making
- Facilitative administration/systems intervention
Components are drawn from the National Implementation Research Network, Learning Forward (NSDC), Guskey, and Trivette. Each component of the domains will be rated from 1 to 4.

Component Themes
- Assigning responsibility for major professional development functions (e.g., measuring fidelity and outcomes; monitoring coaching quality)
- Expectations stated for all roles and responsibilities (e.g., PD participants, trainers, coaches, school and district administrators)
- Data for each stage of PD (e.g., selection, training, implementation, coaching, outcomes)
SPDG Initiatives and Evidence-Based Professional Development
Evidence-based PD should be applied to those initiatives that lead to implementation of the practice or program on which training is provided.

Grantee Benchmarks
- 1st year of funding: baseline
- 2nd year: 50% of components will have a score of 3 or 4
- 3rd year: 70% of components will have a score of 3 or 4
- 4th year: 80% of components will have a score of 3 or 4
- 5th year: 80% of components will have a score of 3 or 4 (maintenance year)
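As a rough illustration of the benchmark arithmetic above (this is a sketch, not an official OSEP tool; the rubric scores below are invented): count the components rated 3 or 4 and compare that share with the target for the grant year.

```python
# Hypothetical sketch of the SPDG rubric benchmark check.
# Grant year -> required share of components scoring 3 or 4.
BENCHMARKS = {2: 0.50, 3: 0.70, 4: 0.80, 5: 0.80}

def share_meeting(scores):
    """Share of rubric components rated 3 or 4 (each component is scored 1-4)."""
    return sum(1 for s in scores if s >= 3) / len(scores)

def meets_benchmark(scores, year):
    """True if the share of 3s and 4s reaches the benchmark for this grant year."""
    return share_meeting(scores) >= BENCHMARKS[year]

# Invented scores for a 10-component rubric: 7 of 10 at 3 or 4 -> 70%.
scores = [4, 3, 2, 3, 4, 1, 3, 4, 2, 3]
print(share_meeting(scores))       # 0.7
print(meets_benchmark(scores, 3))  # True  (3rd-year target is 70%)
print(meets_benchmark(scores, 4))  # False (4th-year target is 80%)
```

The same share also feeds the year-over-year trend a project would describe in its Explanation of Progress.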
Program Measure 2: Participants in SPDG professional development demonstrate improvement in implementation of SPDG-supported practices over time.

Fidelity of implementation is traditionally defined as "the extent to which the user's current practice matches the ideal" (Loucks, 1983).

Each initiative should have a fidelity measure that notes the presence or absence of the core features of the innovation/program/system that the initiative is focused on.
Use implementation measures that have already been created, for example:
› The new RTI implementation measure from the National RTI Center
› Literacy implementation: Planning and Evaluation Tool – Revised (PET-R)
› PBIS: Schoolwide Evaluation Tool (SET)
› Others

To develop fidelity criteria, researchers often reported starting with a curriculum profile or analysis that outlined the critical components of the intervention along with an indication of the range of variations for acceptable use. The researcher or developer then outlined acceptable ranges of variation (Songer & Gotwals, 2005). A component checklist was then developed to record fidelity to these components (Hall & Loucks, 1977).
What is "it"?
Operationalize (verb): to define a concept or variable so that it can be measured or expressed quantitatively. The "it" must be operationalized whether it is:
» An evidence-based practice or program
» A best-practice initiative or new framework
» A systems change initiative
Practice profiles help operationalize practice, program, and systems features.

Searching for "It"
Research findings, materials, manuals, and journal articles do not necessarily provide clarity around core intervention elements. Current and new evidence-based practices, frameworks, and programs will have a range of operational specificity. Developing clarity around the "it" is critical.
Defining "It" Through the Development and Use of Practice Profiles
- Guiding principles identified
- Critical components articulated
- For each critical component:
  - Identified gold standard
  - Identified acceptable variations in practice
  - Identified ineffective practices and undesirable practices
Hall and Hord, 2010, Implementing Change: Patterns, Principles, and Potholes (3rd edition); adapted from work of the Iowa Area Education Agency

Resources for Building Practice Profiles
- National centers
- Experts in your state
- National purveyors
- Manuals and materials
- Implementing districts and schools
- Other states
- Consensus building in your state

Example: Problem-Solving Practice Profiles in an RtI Framework
Resource: Professional Practices in Problem Solving: Benchmarks and Innovation Configurations, Iowa Area Education Agency Directors of Special Education, 1994
Practice Profiles
Each critical component is a heading, and each level of implementation specifies the activities necessary to operationalize that critical component:
Critical Component           | Ideal Implementation                 | Acceptable Variation | Unacceptable Variation | Drastic Mutation
Critical Component 1: (desc) | (description of implementer behavior at each level)
Hall and Hord, 2010, Implementing Change: Patterns, Principles, and Potholes (3rd edition); adapted from work of the Iowa Area Education Agency

Professional Problem Solving: 9 Critical Components
- Parent involvement
- Problem statement
- Systematic data collection
- Problem analysis
- Goal development
- Intervention plan development
- Intervention plan implementation
- Progress monitoring
- Decision making
Professional Practices in Problem Solving: Benchmarks and Innovation Configurations, Iowa Area Education Agency Directors of Special Education, 1994

Professional Problem Solving: Parent Involvement as a Critical Component
Professional Practices in Problem Solving: Benchmarks and Innovation Configurations, Iowa Area Education Agency Directors of Special Education, 1994
Professional Problem Solving: Parent Involvement – Critical Components

Michigan's Practice Profile: Building Leadership Team Example
The project will set its own benchmarks for professional development participants 1 year into training/assistance, 2 years in, 3 years in, and 4 years in. For example: a 1-year benchmark of 40% of core features in place, and a 4-year benchmark of 80% of features in place. The project will then determine what percentage of participants it expects to reach each benchmark (e.g., 80% of participants).
- Participants could be individual teachers (if working with just a few teachers or another type of professional per school or district) or could be a school (if working on a school-wide basis, such as RTI or PBIS)

Self-assessment is acceptable, but projects will need to sample from the group to validate the self-assessment.
- For example, if 15 schools were being measured, someone from the project would observe at least 3 (20%) of the schools and compare their assessment with the self-assessment
A baseline is not necessary.
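The validation step above is a simple sampling exercise. As a sketch only (the school names are hypothetical; the 20% figure comes from the example on the slide), a project might draw the observation sample like this:

```python
import random

# Hypothetical list of the 15 schools whose self-assessments are being validated.
schools = [f"School {chr(ord('A') + i)}" for i in range(15)]

def validation_sample(schools, fraction=0.20, seed=0):
    """Pick at least 20% of schools (minimum 1) for an independent observation
    that can be compared against each school's self-assessment."""
    k = max(1, round(len(schools) * fraction))
    return random.Random(seed).sample(schools, k)

sample = validation_sample(schools)
print(len(sample))  # 3 -> 20% of 15 schools
```

A fixed seed keeps the draw reproducible for reporting; a real project would document whichever sampling procedure its evaluator chose.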
Program Measure 3: Projects use SPDG professional development funds to provide follow-up activities designed to sustain the use of SPDG-supported practices. (Efficiency Measure)

Professional development funds = a minimum of 90% of the overall budget being used for activities from subsection "a" of the notice/statute
› Only following the initiatives from Program Measures 1 and 2
Follow-up activities = the professional development assistance provided following training. A list of follow-up activities that are correlated with sustainability will be provided:
- Coaching/mentoring*
- Implementation fidelity measurement and other types of observation*
- Mini-workshops*
- Determining needs through data and providing guidance or tools to meet those needs*
- Maintaining data systems*
- Peer sharing*
- Model demonstration site activities
- Creating and disseminating enduring documents (procedural manuals)*
- Communities of practice
- TA networks (support from internal state/local TA&D systems)
- Regional PD partnerships*
* = Evidence-based

Research has demonstrated that "train and hope" does not work; instead, ongoing support is needed for those who attend training. Despite this evidence, most professional development is one-time only, which is inefficient and largely a waste of money.

The purpose of this measure is to demonstrate that the SPDG projects are using their money efficiently by providing the appropriate ongoing TA services that may lead to sustained use of the SPDG-supported practices.
For each initiative, the grantee should report the cost of activities designed to sustain learning of scientific or evidence-based instructional practices, divided by the total cost of all professional development activities carried out for the initiative:

    Cost of ongoing TA ÷ Cost of all PD activities for the initiative

Grantees only need to report on the initiatives they report on for Measures 1 and 2. Projects will set their own targets.
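The efficiency measure is just the ratio shown above. A minimal sketch (all dollar figures are invented for illustration):

```python
def efficiency_measure(ongoing_ta_cost, total_pd_cost):
    """Program Measure 3: share of an initiative's professional development
    spending that went to ongoing TA (follow-up) activities."""
    if total_pd_cost <= 0:
        raise ValueError("total PD cost must be positive")
    return ongoing_ta_cost / total_pd_cost

# Hypothetical initiative: $45,000 of follow-up TA out of $100,000 of total PD.
print(f"{efficiency_measure(45_000, 100_000):.0%}")  # 45%
```

Computed per initiative, the ratio can then be compared against the target the project set for itself.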
Consider what is happening each year of your project:
› Are you providing training for an entire year before you begin providing coaching?
› In the final year of your project, are you no longer providing training and only providing follow-up support?

Your initiative would help build local coaching capacity, and projects would match/modify their training with (a) coaching, (b) performance feedback, and (c) student outcomes.

Program Measure 4: Highly qualified special education teachers that have participated in SPDG-supported special education teacher retention activities remain as special education teachers two years after their initial participation in these activities.
Divide the number of teachers who remain in a teaching position by all teachers who received SPDG assistance:

    # of personnel retained for at least two years following participation in a SPDG teacher retention activity ÷ # of personnel participating in a SPDG activity designed to retain highly qualified special education teachers

This measure is only for projects that have teacher retention as an objective.
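Program Measure 4 reduces to the division shown above; a toy sketch with invented counts:

```python
def retention_rate(retained, participated):
    """Program Measure 4: share of participating highly qualified special
    education teachers still in a special education teaching position two
    years after their initial participation."""
    if participated <= 0:
        raise ValueError("need at least one participant")
    return retained / participated

# Hypothetical: 38 of 50 participating teachers were retained two years later.
print(f"{retention_rate(38, 50):.0%}")  # 76%
```

The denominator is everyone who participated in a retention activity, not just those the project could track down, which is why the tracking agreement on the next slide matters.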
This measure covers inservice personnel only. Initial participation is defined as beginning at the time someone receives funding or services from the SPDG grant.

If the SPDG state does not have a tracking system for highly qualified special education teachers, it will need to put an agreement in place with each individual receiving funds or services:
› This agreement will require information from that individual for the life of the grant

Contact your OSEP Project Officer with any questions!