
Welcome to Directors’ Program Measure Webinar - Presenter: Jennifer Coffey, Ph.D., OSEP Project Officer, SPDG Program Lead.




1 Welcome to Directors’ Program Measure Webinar - Presenter: Jennifer Coffey, Ph.D., OSEP Project Officer, SPDG Program Lead

2 Click the Person icon to:
- Raise your hand
- Agree/disagree
- Other
Click Full Screen to maximize the presentation screen.

3 Click this icon to:
- Send private messages
- Change text size or chat color
To post chat messages: type your message in the box, then press Enter on your keyboard to send.

4 Webinar Ground Rules
- Mute your phones: press *6 to mute or un-mute; please do not put your phones on hold.
- For webinar technical difficulties, email adesjarl@uoregon.edu
- Q&A process: ask questions either by audio/voice or by typing your question in the chat pod.
- The archive recording, PPT, and materials will be posted to http://signetwork.org/events/108

5 SAVE THE DATES!
SPDG National Meeting: March 6 & 7, 2012, Washington, DC
OSEP Project Directors' Conference: July 23-25, 2012, Washington, DC

6 Upcoming Calls
Date: December 1, 2011, 3:00-4:30pm ET
Program Measure Directors' Webinar Series #3: Ongoing Technical Assistance and Teacher Retention
Jennifer Coffey, Ph.D., OSEP
Date: January 11, 2012, 3:00-4:30pm ET
Adult Learning Principles
Carol Trivette, Ph.D., Orelena Hawks Puckett Institute

7 Building a Leadership Support Network from Identification of Need to the Launch: the Oklahoma Story
November 7, 2011, 12pm EST
The Personnel Improvement Center @ NASDSE and the Oklahoma State Department of Education will share how they are building a support network for new special education directors, which includes leadership institutes, webinars with follow-up blog discussions, and mentoring by veteran directors. Network goals include the development and retention of special education directors to ensure that they are able to attract, support, and retain highly qualified, highly effective special education teachers.
Registration: http://click.icptrack.com/icp/relay.php?r=17436393&msgid=408221&act=W7IT&c=572590&destination=https%3A%2F%2Ftadnet.ilinc.com%2Fregister%2Ffmcbsby

8

9 Professional Learning Communities Updates
 Disband Secondary Education & Transition PLC (includes Adolescent Literacy)
 Potential merge: NCRTI RTI CoP calls & SPDG RtI/Multi-Tiered Models of Intervention PLC

10 Jennifer Coffey, Ph.D., SPDG Program Lead, August 30, 2011

11
 Performance Measurement 1: Projects use evidence-based professional development practices to support the attainment of identified competencies.
 Performance Measurement 2: Participants in SPDG professional development demonstrate improvement in implementation of SPDG-supported practices over time.

12
 Performance Measurement 3: Projects use SPDG professional development funds to provide follow-up activities designed to sustain the use of SPDG-supported practices. (Efficiency Measure)
 Performance Measurement 4: Highly qualified special education teachers who have participated in SPDG-supported special education teacher retention activities remain as special education teachers two years after their initial participation in these activities.

13
 2007 grantees will not be using the new program measures.
 Everyone else will have 1 year for practice:
› Grantees will use the revised measures this year for their APR
› This continuation report will be a pilot
 OSEP will learn from this round of reports and make changes as appropriate

14 Performance Measurement 2: Participants in SPDG professional development demonstrate improvement in implementation of SPDG-supported practices over time.

15
 Fidelity of implementation is traditionally defined as “the extent to which the user’s current practice matches the ideal” (Loucks, 1983).

16
 Dusenbury, Brannigan, Falco, & Hansen (2003)
 Dane & Schneider (1998)
 O’Donnell (2005)
 Blase “Innovation Fluency” presentation: http://signetwork.org/content_pages/154
 Mowbray, Holter, Teague & Bybee (2003)

17
 “All five studies consistently showed statistically significantly higher outcomes when the program was implemented with greater fidelity.
 Relationships ranged from causal (Ysseldyke et al., 2003), to associational (Hall & Loucks, 1977), to predictive (Songer & Gotwals, 2005).
 Distinctions should be made between measuring fidelity to the structural components of a curriculum intervention and fidelity to the processes that guide its design.
 The studies reviewed here suggest that fidelity of implementation is more probable when an intervention manual is in place that clearly defines the critical components of the intervention and articulates a theory.”

18
 Observations may be crucial because teachers are known to be biased in their reports (Hansen and McNeal, 1999).
 Given the frequency with which adaptations are observed in research and practice, program developers need to anticipate how and when teachers will modify programs, and develop guidelines and recommendations to ensure program goals are met (Dusenbury, Brannigan, Hansen, Walsh, & Falco, 2005).

19
 The projects will report on those initiatives that they are reporting on for Program Measure 1.
 Each initiative should have a fidelity measure that notes the presence or absence of the core features of the innovation/program/system that the initiative is focused on.

20
 To develop fidelity criteria, researchers often reported starting with a curriculum profile or analysis that outlined the critical components of the intervention, along with an indication of the acceptable range of variation in use (Songer & Gotwals, 2005).
 A component checklist was then developed to record fidelity to these components (Hall & Loucks, 1977).

21 Professional Problem Solving: 9 Critical Components
 Parent Involvement
 Problem Statement
 Systematic Data Collection
 Problem Analysis
 Goal Development
 Intervention Plan Development
 Intervention Plan Implementation
 Progress Monitoring
 Decision Making
Source: Professional Practices in Problem Solving: Benchmarks and Innovation Configurations, Iowa Area Education Agency Directors of Special Education, 1994

22 Practice Profile: Building Leadership Team Example

23 The project will set its own benchmarks for professional development participants 1 year into training/assistance, 2 years in, 3 years in, and 4 years in (e.g., 1-year benchmark = 40% of core features in place; 4-year benchmark = 80% of features in place).
The project will then determine what percentage of participants it expects to reach each benchmark (e.g., 80% of participants).
a. Participants could be individual teachers (if working with just a few teachers or another type of professional per school or district) or could be a school (if working on a school-wide basis, such as RTI or PBIS).
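The two-step benchmark arithmetic on this slide can be sketched as a short script. This is a hypothetical illustration, not part of the OSEP guidance: the school names, checklist data, and thresholds below are invented to show how a project might check whether a year-1 benchmark (40% of core features in place, reached by 80% of participants) has been met.

```python
# Hypothetical sketch of the benchmark check described above.
# Each participant (a teacher or a school) has a fidelity checklist:
# True = core feature observed in place, False = absent.
checklists = {
    "School A": [True, True, False, True, False],   # 3/5 = 60% of features
    "School B": [True, False, False, True, False],  # 2/5 = 40%
    "School C": [False, False, False, True, False], # 1/5 = 20%
}

FEATURE_BENCHMARK = 0.40   # e.g., year-1 target: 40% of core features in place
PARTICIPANT_TARGET = 0.80  # e.g., 80% of participants should reach it

def features_in_place(checklist):
    """Fraction of core features present for one participant."""
    return sum(checklist) / len(checklist)

def benchmark_met(checklists):
    """True if enough participants reach the feature benchmark."""
    reaching = [p for p, c in checklists.items()
                if features_in_place(c) >= FEATURE_BENCHMARK]
    return len(reaching) / len(checklists) >= PARTICIPANT_TARGET

# Schools A and B reach 40% of features; C does not. That is 2/3 of
# participants, below the 80% target, so the benchmark is not met.
print(benchmark_met(checklists))  # False
```

The same two thresholds would simply be raised for the later-year benchmarks (e.g., 80% of features by year 4).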

24
 Self-assessment is acceptable, but projects will need to sample from the group to validate the self-assessment.
a. For example, if 15 schools were being measured, someone from the project would observe at least 3 (1/5) of the schools and compare their assessment with the self-assessment.
A baseline wouldn't be necessary.
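The validation sampling in the example above can also be sketched in code. Again a hypothetical illustration: the one-fifth sampling fraction comes from the slide, but the school names, scores, and comparison logic are invented.

```python
import math
import random

# Hypothetical sketch of the self-assessment validation described above:
# sample at least one fifth of the measured schools for direct observation
# and compare the observer's rating with each school's self-assessment.

def validation_sample(schools, fraction=0.2, seed=0):
    """Randomly pick at least `fraction` of the schools (rounded up)."""
    k = math.ceil(len(schools) * fraction)
    return random.Random(seed).sample(schools, k)

schools = [f"School {i}" for i in range(1, 16)]  # 15 schools being measured
to_observe = validation_sample(schools)
print(len(to_observe))  # 3, i.e., at least 1/5 of 15 schools

# Comparing self-assessment with observation for the sampled schools
# (scores here are invented percentages of core features in place):
self_scores = {"School 2": 80, "School 5": 60, "School 9": 40}
observed    = {"School 2": 70, "School 5": 60, "School 9": 20}

for school in self_scores:
    gap = self_scores[school] - observed[school]
    print(f"{school}: self-report minus observation = {gap} points")
```

A consistently large positive gap would suggest the self-assessments are inflated and the project should rely more on direct observation.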

25 Questions?
 What kind of guidance would be helpful to you?
 What challenges do you foresee in capturing these data?

