Welcome to the Directors’ Program Measure Webinar
Presenter: Jennifer Coffey, Ph.D., OSEP Project Officer, SPDG Program Lead
Click the Person icon to: Raise Your Hand, Agree/Disagree, Other. Click Full Screen to maximize the presentation screen.
Click this icon to send private messages or change text size or chat color. To post chat messages: type your message in the box, then hit Enter on your keyboard to send.
Webinar Ground Rules
Mute your phones: press *6 to mute or un-mute; please do not put your phones on hold.
For webinar technical difficulties: send email to adesjarl@uoregon.edu
Q & A process (audio/chat): ask questions in two ways: (1) audio/voice, or (2) type your question in the Chat Pod.
Archive recording, PPT, and materials will be posted to http://signetwork.org/events/108
SAVE THE DATES!
SPDG National Meeting: March 6 & 7, 2012, Washington, DC
OSEP Project Directors’ Conference: July 23-25, 2012, Washington, DC
SPDG National Meeting March 5 – MEET UP Night
Upcoming Calls
December 1, 2011, 3:00-4:30pm ET: Program Measure Directors' Webinar Series #3: Ongoing Technical Assistance and Teacher Retention. Jennifer Coffey, PhD, OSEP.
January 11, 2012, 3:00-4:30pm ET: Adult Learning Principles. Carol Trivette, Ph.D., Orelena Hawks Puckett Institute.
Building a Leadership Support Network from Identification of Need to the Launch: the Oklahoma Story
November 7, 2011, 12pm EST
The Personnel Improvement Center @ NASDSE and the Oklahoma State Department of Education will share how they are building a support network for new special education directors, which includes leadership institutes, webinars with follow-up blog discussions, and mentoring by veteran directors. Network goals include the development and retention of special education directors to ensure that they are able to attract, support, and retain highly qualified, highly effective special education teachers.
Registration: http://click.icptrack.com/icp/relay.php?r=17436393&msgid=408221&act=W7IT&c=572590&destination=https%3A%2F%2Ftadnet.ilinc.com%2Fregister%2Ffmcbsby
Professional Learning Communities Updates
Disband: Secondary Education & Transition PLC (includes Adolescent Literacy)
Potential merge: NCRTI RTI CoP calls & SPDG RtI/Multi-Tiered Models of Intervention PLC
State Grantees Profile – Desktop Share
Jennifer Coffey, Ph.D., SPDG Program Lead
August 30, 2011
Performance Measurement 1: Projects use evidence-based professional development practices to support the attainment of identified competencies.
Performance Measurement 2: Participants in SPDG professional development demonstrate improvement in implementation of SPDG-supported practices over time.
Performance Measurement 3: Projects use SPDG professional development funds to provide follow-up activities designed to sustain the use of SPDG-supported practices. (Efficiency Measure)
Performance Measurement 4: Highly qualified special education teachers who have participated in SPDG-supported special education teacher retention activities remain as special education teachers two years after their initial participation in these activities.
2007 grantees will not be using the new program measures. Everyone else will have 1 year for practice:
› Grantees will use the revised measures this year for their APR
› This continuation report will be a pilot
OSEP will learn from this round of reports and make changes as appropriate.
Performance Measurement 2: Participants in SPDG professional development demonstrate improvement in implementation of SPDG-supported practices over time.
Fidelity of implementation is traditionally defined as “the extent to which the user’s current practice matches the ideal” (Loucks, 1983).
Dusenbury, Brannigan, Falco, & Hansen (2003)
Dane & Schneider (1998)
O’Donnell (2005)
Blase, “Innovation Fluency” presentation: http://signetwork.org/content_pages/154
Mowbray, Holter, Teague, & Bybee (2003)
“All five studies consistently showed statistically significantly higher outcomes when the program was implemented with greater fidelity. The studies reviewed here suggest that fidelity of implementation is more probable when an intervention manual is in place that clearly defines the critical components of the intervention and articulates a theory. Distinctions should be made between measuring fidelity to the structural components of a curriculum intervention and fidelity to the processes that guide its design.”
Evaluation Drives ERIA’s Evidence-Based Practices
The Program Guide, a 16-page booklet, explicitly addresses both implementation and intervention practices to guide the design of a site-based program.
The Implementation Rubric is a 10-item instrument which provides a framework for trainers, coaches, site team members, and teachers to evaluate and discuss implementation, fidelity, and next steps.
Some additional tools include:
end-of-event training surveys and three-month follow-ups
feedback and support from cohort coaches and site team
fidelity observations
student data
ERIA’s Evidence-Based Practices
The Program Guide articulates a comprehensive set of practices for all stakeholders.
Implementation Practices: Initial Training; Team-based Site-level Practice and Implementation; Implementation Rubric facilitates self-evaluation; Ongoing Coaching; Booster Trainings; Implementation Rubric reflection on next steps.
Intervention Practices: The 5 Steps of ERIA; Data-informed Decision-making; Screening and Assessment; Progress Monitoring; Tiered Interventions and Learning Supports; Enhanced Literacy Instruction.
The projects will report on those initiatives that they are reporting on for Program Measure 1. Each initiative should have a fidelity measure that notes the presence or absence of the core features of the innovation/program/system that the initiative is focused on.
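To make the idea concrete, here is a minimal sketch of how such a presence/absence fidelity measure could be scored. The feature names and values are hypothetical placeholders, not items from any actual SPDG instrument:

```python
# Hypothetical core features for one initiative; a real project would take
# these from its own fidelity instrument. True = feature observed in place.
core_features = {
    "universal_screening": True,
    "progress_monitoring": True,
    "tiered_interventions": False,
    "data_based_decision_making": True,
}

def fidelity_score(checklist):
    """Return the percent of core features observed to be in place."""
    return 100 * sum(checklist.values()) / len(checklist)

print(f"{fidelity_score(core_features):.0f}% of core features in place")
# -> 75% of core features in place
```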
Use implementation measures that have already been created:
› For example, the new RTI implementation measure presented to the RTI PLC
› Literacy implementation: Planning and Evaluation Tool – Revised (PET-R)
› Schoolwide Evaluation Tool (SET)
› Others?
To develop fidelity criteria, researchers often reported starting with a curriculum profile or analysis that outlined the critical components of the intervention along with an indication of the range of variations for acceptable use. The researcher or developer then outlined acceptable ranges of variation (Songer & Gotwals, 2005). A component checklist was then developed to record fidelity to these components (Hall & Loucks, 1977).
What is “it”?
Operationalize (verb): to define a concept or variable so that it can be measured or expressed quantitatively. (Webster's New Millennium Dictionary of English, Preview Edition v 0.9.7, © 2003-2008 Lexico Publishing Group, LLC)
The “it” must be operationalized whether it is:
» An Evidence-Based Practice or Program
» A Best Practice Initiative or New Framework
» A Systems Change Initiative
Practice Profiles help operationalize practice, program, and systems features.
Searching for “It”
Research findings, materials, manuals, and journal articles do not necessarily provide clarity around core intervention elements. Current and new evidence-based practices, frameworks, and programs will have a range of operational specificity. Developing clarity around the “it” is critical.
Practice Profile: Defining “it” Through the Development and Use of Practice Profiles
Guiding Principles identified
Critical Components articulated
(Hall and Hord, 2010, Implementing Change: Patterns, Principles, and Potholes, 3rd Edition; adapted from work of the Iowa Area Education Agency)
Practice Profile: Defining “it” Through the Development and Use of Practice Profiles
Guiding Principles identified
Critical Components articulated
For each critical component:
Identified gold standard
Identified acceptable variations in practice
Identified ineffective practices and undesirable practices
(Hall and Hord, 2010, Implementing Change: Patterns, Principles, and Potholes, 3rd Edition; adapted from work of the Iowa Area Education Agency)
Poll: Have you ever developed or helped to develop a Practice Profile or Innovation Configuration?
Vote now: » Yes » No
Practice Profiles: Pay Now or Pay Later
Identifies Critical Components:
Guiding Principles
Critical Components match the Guiding Principles
Core activities to achieve the Critical Components
For each Critical Component:
Identified “gold standard” activities
Identified acceptable variations in practice
Identified ineffective practices and undesirable practices
Your implementation support:
» Identify and support the Implementation Team
» Provide conceptual overview and rationales
» Provide resources, worksheets, templates
» Facilitate consensus building
» Capacity building
Resources for Building Practice Profiles
National Centers
Experts in Your State
National Purveyors
Manuals and Materials
Implementing Districts and Schools
Other States
Consensus Building in Your State
Example: Problem-Solving Practice Profiles in an RtI Framework
RESOURCE: Professional Practices in Problem Solving: Benchmarks and Innovation Configurations. Iowa Area Education Agency Directors of Special Education, 1994.
Practice Profiles
Each Critical Component is a heading. Each level of implementation specifies the activities necessary to operationalize that Critical Component. The profile is laid out as a table: Critical Component | Ideal Implementation | Acceptable Variation | Unacceptable Variation | Drastic Mutation, with a description of implementer behavior in each cell (e.g., Critical Component 1: Description).
(Hall and Hord, 2010, Implementing Change: Patterns, Principles, and Potholes, 3rd Edition; adapted from work of the Iowa Area Education Agency)
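One way to picture that table layout is as a record per critical component. The sketch below is illustrative only; the level descriptions are invented examples, not text from the Iowa AEA materials:

```python
from dataclasses import dataclass

@dataclass
class CriticalComponent:
    """One row of a practice profile: a component plus its implementation levels."""
    name: str
    ideal_implementation: str    # "gold standard" implementer behavior
    acceptable_variation: str    # still-effective variation in practice
    unacceptable_variation: str  # ineffective or undesirable practice
    drastic_mutation: str        # change that abandons the practice itself

# Invented example content for illustration only.
example = CriticalComponent(
    name="Parent Involvement",
    ideal_implementation="Parents participate in every problem-solving meeting",
    acceptable_variation="Parents are consulted before and after each meeting",
    unacceptable_variation="Parents receive a written summary only",
    drastic_mutation="Parents are never informed of the process",
)
```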
Professional Problem Solving: 9 Critical Components
Parent Involvement
Problem Statement
Systematic Data Collection
Problem Analysis
Goal Development
Intervention Plan Development
Intervention Plan Implementation
Progress Monitoring
Decision Making
(Professional Practices in Problem Solving: Benchmarks and Innovation Configurations. Iowa Area Education Agency Directors of Special Education, 1994)
Professional Problem Solving: Parent Involvement as a Critical Component
(Professional Practices in Problem Solving: Benchmarks and Innovation Configurations. Iowa Area Education Agency Directors of Special Education, 1994)
Professional Problem Solving: Parent Involvement – Critical Components
Things to Think About
Think about your SPDG effort and your involvement and guidance at the State, District, and School levels. Rate: “Currently, our SPDG work is well operationalized…”
…at the classroom level (Strongly Agree / Agree / Disagree / Strongly Disagree)
…at the district level
…at the regional level
…at the State level
Michigan’s Practice Profile: Building Leadership Team Example
California’s Evaluation Tool: Implementation Rubric
The 10 items are mostly focused on intervention practices, with site team and fidelity items.
The overall tool, and the process of how the rubric is used, drives the implementation practices.
Sites self-evaluate and reflect on learning and implementation; results are shared with coaches and trainers to guide activities.
The rubric evaluates the fidelity of implementation of both the PD model and the interventions.
The former 26-item, 3-point ERIA Checklist lacked the specificity to be meaningful and useful.
Implementation Rubric, Adapted from “Goal Attainment Scales”
Amy Gaumer Erickson and Monica Ballay presented “goal attainment scales” on a June 17 SIG Network webinar: http://www.signetwork.org/content_pages/78
The rubric explicitly describes 5 implementation levels for each of 10 items:
Levels 1, 2, and 3 reflect the “Not started,” “In progress,” and “Achieved” implementation levels of the former checklist.
Levels 4 and 5 detail concrete steps toward optimal implementation, beyond the basics.
Each implementation level for each item is explicitly described, building more meaning into the tool than our previous checklist format allowed.
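As a rough sketch of how a 5-level, 10-item rubric like this might be summarized for one site (the item names and levels below are made up, not ERIA's actual items):

```python
# Hypothetical 1-5 rubric levels for ten items at one site.
rubric_scores = {
    "item_01": 3, "item_02": 4, "item_03": 2, "item_04": 5, "item_05": 3,
    "item_06": 3, "item_07": 4, "item_08": 2, "item_09": 3, "item_10": 4,
}

def summarize(scores):
    """Mean level and count of items at level 3 ('Achieved') or above."""
    achieved = sum(1 for level in scores.values() if level >= 3)
    mean = sum(scores.values()) / len(scores)
    return mean, achieved

mean, achieved = summarize(rubric_scores)
print(f"Mean level: {mean:.1f}; items at 'Achieved' or above: {achieved}/10")
# -> Mean level: 3.3; items at 'Achieved' or above: 8/10
```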
Implementation Rubric Excel File: Multi-Year Tracking and Automated Reports
The same file is used in all three years of ERIA, reporting both the trend and the most recent entries.
ERIA on the Web: http://calstat.org/effectivereading.html
Li Walter: li@sonic.net
Alan Wood: alan.wood@calstat.org, (707) 287-0054
Observations may be crucial because teachers are known to be biased in their reports (Hansen and McNeal, 1999). Given the frequency with which adaptations are observed in research and practice, program developers need to anticipate how and when teachers will modify programs, and to develop guidelines and recommendations to ensure program goals are met (Dusenbury, Brannigan, Hansen, Walsh, & Falco, 2005).
The project will set its own benchmarks for professional development participants 1 year into training/assistance, 2 years in, 3 years in, and 4 years in. For example: 1-year benchmark = 40% of core features in place; 4-year benchmark = 80% of features in place. The project will then determine what percentage of participants they expect to reach this benchmark (e.g., 80% of participants).
Participants could be individual teachers (if working with just a few teachers or another type of professional per school or district) or could be a school (if working on a school-wide basis, such as RTI or PBIS).
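A minimal sketch of that benchmark check follows. The year-1 and year-4 thresholds come from the example above; the year-2 and year-3 thresholds and the school scores are invented for illustration:

```python
# Benchmarks: percent of core features expected in place after each year.
# Years 1 and 4 follow the example in the slide; years 2 and 3 are invented.
benchmarks = {1: 40, 2: 55, 3: 70, 4: 80}
expected_share = 0.80  # e.g., 80% of participants expected to meet the benchmark

# Hypothetical year-1 fidelity scores (% of features in place), one per school.
year1_scores = [45, 40, 60, 42, 35, 50, 41, 47, 33, 55]

met = sum(1 for score in year1_scores if score >= benchmarks[1])
share = met / len(year1_scores)
print(f"{share:.0%} of participants met the year-1 benchmark "
      f"(expected: {expected_share:.0%})")
# -> 80% of participants met the year-1 benchmark (expected: 80%)
```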
Self-assessment is acceptable, but projects will need to sample from the group to validate the self-assessment. For example, if 15 schools were being measured, someone from the project would observe at least 3 (1/5th) of the schools and compare their assessment with the self-assessment. A baseline wouldn't be necessary.
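As a rough sketch of drawing that validation sample (the school names and self-assessment scores are invented placeholders):

```python
import random

# Hypothetical self-assessed fidelity scores (% of features) for 15 schools.
self_scores = {f"school_{i:02d}": score for i, score in enumerate(
    [45, 62, 38, 70, 55, 48, 66, 40, 58, 73, 35, 50, 61, 44, 67], start=1)}

# Observe at least one fifth of the schools (here 3 of 15), chosen at random.
k = max(3, len(self_scores) // 5)
validation_sample = random.sample(sorted(self_scores), k)

for school in validation_sample:
    # In practice, an observer's fidelity rating would be collected on site
    # and compared against the school's self-assessment.
    print(f"Observe {school} (self-assessment: {self_scores[school]}%)")
```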
Questions?
What kind of guidance would be helpful to you?
What challenges do you foresee in capturing these data?