SPDG New Grantee Orientation


1 SPDG New Grantee Orientation
September 15, 2016 and September 22, 2016
Jennifer Coffey & John Lind

2 Today’s Agenda
- Introductions
- Resources
- Program Measures

3 Congratulations! Welcome to the SIGnetwork
Introductions:
- State & your position in the grant
- Previous grant? New to the SPDG?
- Describe your initiatives and the populations you are serving
- What would success look like for these initiatives and for your grant generally?
- What would you like to learn from other SPDG grantees?

4 Opportunities to learn from other SPDGs
- SPDG National Meeting (Oct 11 & 12)
- Directors’ Webinars
- Communities: Coaching, Evaluators, SSIP, Family Engagement
- Mentoring
- Sending questions to the SIGnetwork listserv
- OSEP Project Directors’ Conference (July 2018)

5 The SIGnetwork Website
- Topical resources (e.g., coaching tools)
- SPDG contacts
- SPDG project abstracts
- Recorded Webinars
- Program Measure exemplars and how-to
- Grant management support
- Events calendar

6 Annual Performance Report
- Provides two Webinars: (1) Program Measures how-to; (2) Annual Performance Report how-to
- Gives you the continuation reporting package
- Program Measures how-to: provides step-by-step directions, examples, exemplars, and Webinars; includes the document you must fill out for Program Measure 1 (EB-PD Worksheet)

7 Recommendation Review the Webinars
Then create a draft APR and send it to your project officer to review with you. This will help you focus your data collection in this first year. Your evaluator may be able to help.

8 Speaking of evaluation…
The Center to Improve Project Performance (CIPP) evaluation resource: Working with a 3rd-party evaluator
Evaluation resource page:

9 Other Resources www.OSEPIDEASThatWork.org
- Federal Resources for Stakeholders: Looking for guidance about specific OSEP priorities? Access this set of resources to learn more about OSEP recommendations relevant to all special education stakeholders.
- Resources for Grantees: This section is designed to help OSEP Grantees accomplish their goals, including guidance on technical assistance (TA), collaboration tools, and resources and funding opportunities for doctoral students.
- Find a Center or Grant

10 Regional Education Laboratories (RELs)
- Webinars
- Ability to do research for SEAs
- Guidance documents
- Tools

11 Uniform Guidance Webinar FAQs

12 New G5 Hotline Self-Help Portal
On Aug. 3, 2015, the G5 Hotline debuted a Self-Help Portal for all external users of G5, the U.S. Department of Education's Grants Management System. The Self-Help Portal, accessed through the EDCAPS.FORCE.COM web address from any location at any time, allows G5 users to:
- Submit and track their own tickets
- Access knowledge articles 24/7 to find answers to issues and questions about the G5 application

13 The SPDG Program/GPRA Measures: An Overview
Program Measures Web page:

14 Capturing Performance
Performance Measurement 1: Projects use evidence-based professional development practices to support the attainment of identified competencies.
Performance Measurement 2: Participants in SPDG professional development demonstrate improvement in implementation of SPDG-supported practices over time.

15 Performance Measurement 3: Projects use SPDG professional development funds to provide follow-up activities designed to sustain the use of SPDG-supported practices. (Efficiency Measure)
Performance Measurement 4: Highly qualified special education teachers that have participated in SPDG supported special education teacher retention activities remain as special education teachers two years after their initial participation in these activities.

16 Performance Measure #1 Projects use evidence-based professional development practices to support the attainment of identified competencies.

17 Best Practices in Selection
- Job or role descriptions should be explicit about expectations and accountability for all positions (e.g., teachers, coaches, staff, administrators)
- Readiness measures to inform selection at the school-building or school-district level
- Interactive interview process
(Blase, VanDyke, & Fixsen, 2010)

18 Best Practices in Training
Training must be:
- Timely
- Theory grounded (adult learning)
- Skill-based
Information from Training feeds back to Selection and feeds forward to Coaching.
(Blase, VanDyke, & Fixsen, 2010)

19 Using Research Findings to Inform Practical Approaches to Evidence-Based Practices
Carl J. Dunst, Ph.D., & Carol M. Trivette, Ph.D., Orelena Hawks Puckett Institute, Asheville and Morganton, North Carolina. Presentation prepared for a Webinar with the Knowledge Transfer Group, U.S. Department of Health and Human Services, Children’s Bureau Division of Research and Innovation, September 22, 2009. Recording and resources:

20 “Adult learning refers to a collection of theories, methods, and approaches for describing the characteristics of and conditions under which the process of learning is optimized.”

21 Six Characteristics Identified in How People Learn* Were Used to Code and Evaluate the Adult Learning Methods
Planning
- Introduce: Engage the learner in a preview of the material, knowledge, or practice that is the focus of instruction or training
- Illustrate: Demonstrate or illustrate the use or applicability of the material, knowledge, or practice for the learner
Application
- Practice: Engage the learner in the use of the material, knowledge, or practice
- Evaluate: Engage the learner in a process of evaluating the consequence or outcome of the application of the material, knowledge, or practice
Deep Understanding
- Reflection: Engage the learner in self-assessment of his or her acquisition of knowledge and skills as a basis for identifying “next steps” in the learning process
- Mastery: Engage the learner in a process of assessing his or her experience in the context of some conceptual or practical model or framework, or some external set of standards or criteria
* Donovan, M. et al. (Eds.) (1999). How people learn. Washington, DC: National Academy Press.

22 Additional Translational Synthesis Findings
The smaller the number of persons participating in a training (<20), the larger the effect sizes for the study outcomes. The more hours of training over an extended number of sessions, the better the study outcomes.

23 Effect Sizes for Introducing Information to Learners
Practice                                  | Studies | Effect Sizes | Mean Effect Size (d) | 95% Confidence Interval
Pre-class exercises                       | 9       |              | 1.02                 |
Out-of-class activities/self-instruction  | 12      | 20           | .76                  |
Classroom/workshop lectures               | 26      | 108          | .68                  |
Dramatic readings                         | 18      | 40           | .35                  |
Imagery                                   | 7       |              | .34                  |
Dramatic readings/imagery                 | 4       | 11           | .15                  |

24 Effect Sizes for Self-Assessment of Learner Mastery
Practice                   | Studies | Effect Sizes | Mean Effect Size (d) | 95% Confidence Interval
Standards-based assessment | 13      | 44           | .76                  |
Self-assessment            | 16      | 29           | .67                  |

25 Summary of Training Findings
- To be most effective, training needs to actively involve the learners in judging the consequences of their learning experiences (evaluate, reflection, & mastery)
- Learners need to participate in learning the new knowledge or practice
- Learners need to be engaged in judging their own experience in learning and using the new material

26 Best Practices in Coaching
- Design a Coaching Service Delivery Plan
- Develop accountability structures for Coaching: Coach the Coach!
- Identify on-going professional development for coaches
(Blase, VanDyke, & Fixsen, 2010)

27 Best Practices in Performance Assessment
- Must be a transparent process
- Use of multiple data sources
- Fidelity of implementation should be assessed at the local, regional, and state levels
- Tied to positive recognition
- Information from this driver feeds back to Selection, Training, and Coaching and feeds forward to the Organization Drivers

28 Best Practices in Decision Support Data Systems
- Assess fidelity of implementation at all levels and respond accordingly
- Identify outcome measures that are intermediate and longer-term, socially valid, and technically adequate (reliable and valid)
- Gather relevant data that are feasible to collect, useful for decision making, widely shared, and reported frequently

29 Best Practices in Facilitative Administration
- A Building/District Leadership and Implementation Team is formed
- The Team uses feedback and data to improve Implementation Drivers
- Policies and procedures are developed and revised to support the new ways of work
- The Team solicits and analyzes feedback from staff and stakeholders

30 Best Practices in Systems Interventions
- Leadership analyzes feedback from staff and makes changes to alleviate barriers and facilitate implementation
- Policies and procedures are revised to support the new ways of work

31 Evaluating Professional Development

32 Guskey’s Five Critical Levels of Professional Development Evaluation
Reprinted from: Guskey, T. R. (2000). Evaluating professional development (pp ). Thousand Oaks, CA: Corwin Press.

Level 1: Participants’ reactions
- Questions addressed: Did they like it? Was their time well spent? Did the material make sense? Will it be useful? Was the leader knowledgeable & helpful? Were the refreshments fresh & tasty? Was the room the right temperature? Were the chairs comfortable?
- How information is gathered: Questionnaires administered at the end of the session; focus groups; interviews; personal learning logs
- What is measured or addressed: Initial satisfaction with the experience
- How information will be used: To improve program design and delivery

Level 2: Participants’ learning
- Questions addressed: Did participants acquire the intended knowledge and skills?
- How information is gathered: Paper-and-pencil instruments; simulations & demonstrations; participant reflections (oral and/or written); participant portfolios; case study analyses
- What is measured or addressed: New knowledge and skills of participants
- How information will be used: To improve program content, format, and organization

Level 3: Organization support and change
- Questions addressed: What was the impact on the organization? Did it affect organizational climate and procedures? Was implementation advocated, facilitated, and supported? Was the support public and overt? Were problems addressed quickly and efficiently? Were sufficient resources available?
- How information is gathered: District and school records; minutes from follow-up meetings; questionnaires; structured interviews with participants and school or district administrators
- What is measured or addressed: The organization’s advocacy, support, accommodation, facilitation, and recognition
- How information will be used: To document and improve organizational support; to inform future change efforts

33 Guskey’s Five Critical Levels of Professional Development Evaluation (cont.)
Reprinted from: Guskey, T. R. (2000). Evaluating professional development (pp ). Thousand Oaks, CA: Corwin Press.

Level 4: Participants’ use of new knowledge and skills
- Questions addressed: Did participants effectively apply the new knowledge and skills?
- How information is gathered: Questionnaires; structured interviews with participants and their supervisors; participant reflection (oral and/or written); participant portfolios; direct observations; video- or audiotapes
- What is measured or addressed: Degree and quality of implementation
- How information will be used: To document and improve the implementation of program content

Level 5: Student learning outcomes
- Questions addressed: What was the impact on students? Did it affect student performance or achievement? Did it influence students’ physical or emotional well-being? Are students more confident as learners? Is student attendance improving? Are dropouts decreasing?
- How information is gathered: Student records; school records; structured interviews with students, parents, teachers, and/or administrators
- What is measured or addressed: Student learning outcomes: cognitive (performance & achievement); affective (attitudes and dispositions); psychomotor (skills & behavior)
- How information will be used: To focus and improve all aspects of program design, implementation, and follow-up; to demonstrate the overall impact of professional development

34 SPDG Professional Development Rubric
5 Domains, each with components:
- Selection
- Training
- Coaching
- Performance Assessment/Data-based decision making
- Facilitative administration/Systems intervention
Components are drawn from the National Implementation Research Network, Learning Forward (NSDC), Guskey, and Trivette. Each component of the domains will be rated from 1 to 4.

35

36 Component Themes
- Assigning responsibility for major professional development functions (e.g., measuring fidelity and outcomes; monitoring coaching quality)
- Expectations stated for all roles and responsibilities (e.g., PD participants, trainers, coaches, school & district administrators)
- Data for each stage of PD (e.g., selection, training, implementation, coaching, outcomes)

37 SPDG Initiatives and Evidence-based Professional Development
EB-PD should be applied to those initiatives that lead to implementation of the practice or program on which training is provided.

38 Grantee Benchmarks
- 1st year of funding: baseline
- 2nd yr: 50% of components will have a score of 3 or 4
- 3rd yr: 70% of components will have a score of 3 or 4
- 4th yr: 80% of components will have a score of 3 or 4
- 5th yr: 80% of components will have a score of 3 or 4 (maintenance yr)
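To make the benchmark arithmetic concrete, here is a minimal Python sketch; the component names and scores are hypothetical illustrations, not actual rubric content:

```python
# Hypothetical rubric scores (1-4), one entry per rubric component.
rubric_scores = {
    "selection: role descriptions": 3,
    "training: adult learning methods": 4,
    "coaching: service delivery plan": 2,
    "performance assessment: data sources": 3,
    "facilitative administration: leadership team": 4,
}

def percent_scoring_3_or_4(scores):
    """Return the percentage of components scored 3 or 4."""
    meeting = sum(1 for s in scores.values() if s >= 3)
    return 100 * meeting / len(scores)

pct = percent_scoring_3_or_4(rubric_scores)
print(f"{pct:.0f}% of components scored 3 or 4")  # -> 80%, above the 50% year-2 benchmark
```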

39 Implementation fidelity
Fidelity of implementation is traditionally defined as “the extent to which the user’s current practice matches the ideal” (Loucks, 1983).

40 Measure 2 Methodology
- Each initiative should have a fidelity measure that notes the presence or absence of the core features of the innovation/program/system that the initiative is focused on
- Be clear about the name of the measure, and state that it is a fidelity measure
- Choose only one fidelity measure for your program measure; you can use other measures as project measures
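As a worked illustration of the presence/absence approach, here is a minimal Python sketch; the core-feature names are hypothetical and do not come from any particular fidelity instrument:

```python
# Hypothetical presence/absence checklist for one initiative's core features.
core_features = {
    "universal screening in place": True,
    "progress monitoring occurs monthly": True,
    "tiered instruction delivered as designed": False,
    "data team meets on schedule": True,
}

# Fidelity = proportion of core features observed to be present.
fidelity = sum(core_features.values()) / len(core_features)
print(f"Implementation fidelity: {fidelity:.0%}")  # -> 75%
```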

41 When possible… use implementation measures that have already been created. For example:
- New RTI implementation measure from the National RTI Center
- Literacy implementation: Planning and Evaluation Tool – Revised (PET-R)
- PBIS: Schoolwide Evaluation Tool (SET)
- Others…

42 Developing a Fidelity Measure (O’Donnell, 2005)
To develop fidelity criteria, researchers often reported starting with a curriculum profile or analysis that outlined the critical components of the intervention along with an indication of the range of variations for acceptable use. The researcher or developer then outlined acceptable ranges of variation (Songer & Gotwals, 2005). A component checklist was then developed to record fidelity to these components (Hall & Loucks, 1977).

43 Practice Profile: Defining “It” Through the Development and Use of Practice Profiles
- Guiding Principles identified
- Critical Components articulated
- For each critical component: identify the gold standard; identify acceptable variations in practice; identify ineffective practices and undesirable practices
(Hall & Hord, 2010, Implementing Change: Patterns, Principles, and Potholes, 3rd Edition; adapted from work of the Iowa Area Education Agency)
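A practice profile maps naturally onto a simple data structure. The sketch below shows one hypothetical critical component with the three kinds of entries the slide calls for; all content is illustrative:

```python
# One critical component of a hypothetical practice profile.
practice_profile_component = {
    "critical_component": "Performance feedback to teachers",
    "gold_standard": "Coach observes weekly and provides written, data-based feedback",
    "acceptable_variations": [
        "Biweekly observation with verbal feedback",
        "Video-based observation with written feedback",
    ],
    "ineffective_or_undesirable": [
        "Feedback given only at an annual review",
        "Feedback not tied to observation data",
    ],
}
```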

44 Performance Measurement 3:
Projects use SPDG professional development funds to provide follow-up activities designed to sustain the use of SPDG-supported practices.

45 Operational definition of terms
- Professional development funds = a minimum of 90% of the overall budget being used for activities from subsection "a" of the notice/statute; only for the initiatives from Program Measures 1 & 2
- Follow-up activities = the professional development assistance provided following training. A list of follow-up activities that are correlated with sustainability is provided.

46 Ongoing TA features
- Coaching/mentoring
- Implementation fidelity measurement & other types of observation
- Mini-workshops
- Determining needs through data and providing guidance or tools to meet those needs
- Maintaining data systems
- Peer sharing

47 Ongoing TA (cont.)
- Model demonstration site activities
- Creating and disseminating enduring documents (procedural manuals)
- Communities of Practice
- TA Networks (support from internal state/local TA&D systems)
- Regional PD partnerships

48 Methodology
For each initiative, the grantee should report the cost of activities designed to sustain learning of scientific or evidence-based instructional practices, divided by the total cost of all professional development activities carried out for the initiative. The equation and a worked example follow.

49 Equation
Measure 3 = (Cost of ongoing TA) ÷ (Cost of all PD activities for an initiative)
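As a worked example with hypothetical dollar amounts:

```python
# Hypothetical costs for one initiative, in dollars.
cost_ongoing_ta = 30_000  # coaching, follow-up workshops, data-system support
cost_all_pd = 120_000     # all PD activities carried out for the initiative

measure_3 = cost_ongoing_ta / cost_all_pd
print(f"Measure 3 for this initiative: {measure_3:.0%}")  # -> 25%
```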

50 Methodology (cont.)
- You only need to report on the initiatives you are reporting on for Measures 1 & 2
- Use dollar amounts in the equation; otherwise your measure may not be counted in the external review
- Projects will set their own targets

51 Setting targets
Consider what is happening in each year of your project:
- Are you providing training for an entire year before you begin providing coaching?
- In the final year of your project, are you no longer providing training and only providing follow-up support?

52 Optimally
- Your initiative would help build local coaching capacity
- Projects would match/modify their training with (a) coaching, (b) performance feedback, and (c) student outcomes

53 Performance Measurement 4:
Highly qualified special education teachers that have participated in SPDG supported special education teacher retention activities remain as special education teachers two years after their initial participation in these activities.

54 Methodology Divide the number of teachers who remain in a teaching position by all teachers who received SPDG assistance.

55 Equation
Measure 4 = (# of personnel retained for at least 2 yrs following participation in a SPDG teacher retention activity) ÷ (# of personnel participating in a SPDG activity designed to retain highly qualified special education teachers)
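A worked example with hypothetical counts:

```python
# Hypothetical counts for Measure 4.
retained_two_years = 18  # still teaching special education 2 years after initial participation
participants = 24        # took part in SPDG teacher retention activities

measure_4 = retained_two_years / participants
print(f"Measure 4 retention rate: {measure_4:.0%}")  # -> 75%
```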

56 Note
- This measure applies only to projects that have teacher retention as an objective
- It covers inservice teachers only
- Initial participation is defined as beginning at the time someone receives funding or services from the SPDG grant

57 Considerations
If the SPDG State does not have a tracking system for highly qualified special education teachers, the project will need to put an agreement in place with each individual receiving funds or services. This agreement will require information from that individual for the life of the grant.

58 Program Measures Webpage
Has relevant presentations, tools, links, etc.: There is a tab for it at the top of the SIGnetwork homepage.

59 Questions for us? To contact us: John Lind: jlind@uoregon.edu
Jennifer Coffey: Your Project Officer…

