Welcome to Directors’ Program Measure Webinar - Presenter: Jennifer Coffey, Ph.D., OSEP Project Officer, SPDG Program Lead.

Presentation transcript:

Welcome to Directors’ Program Measure Webinar - Presenter: Jennifer Coffey, Ph.D., OSEP Project Officer, SPDG Program Lead

Click the Person icon to:
- Raise Your Hand
- Agree/Disagree
- Other…
Click the Full Screen icon to maximize the presentation screen.

Click This Icon to:
- Send private messages
- Change text size or chat color
To Post Chat Messages: type your message in this box, then hit ‘enter’ on your keyboard to send.

Webinar Ground Rules
Mute your phones:
- To Mute or Un-Mute, press *6
- Please do not put your phones on ‘Hold’
For webinar technical difficulties:
- Send to
Q & A Process (audio/chat) – ask questions in two ways:
1. Audio/Voice
2. Type your question in the Chat Pod
Archive Recording, PPT, & Materials:
- To be posted to

SAVE THE DATES!
SPDG National Meeting: March 6 & 7, 2012, Washington, DC
OSEP Project Directors’ Conference: July 23-25, 2012, Washington, DC

SPDG National Meeting March 5 – MEET UP Night

Upcoming Calls
Date: December 1, 2011; Time: 3:00-4:30pm ET
Program Measure Directors’ Webinar Series #3: Ongoing Technical Assistance and Teacher Retention
Jennifer Coffey, PhD, OSEP
Date: January 11, 2012; Time: 3:00-4:30pm ET
Adult Learning Principles
Carol Trivette, Ph.D., Orelena Hawks Puckett Institute

Building a Leadership Support Network from Identification of Need to the Launch – the Oklahoma Story
November 7, pm EST
The Personnel Improvement Center (NASDSE) and the Oklahoma State Department of Education will share how they are building a support network for new special education directors, which includes leadership institutes, webinars with follow-up blog discussions, and mentoring by veteran directors. Network goals include the development and retention of special education directors to ensure that they are able to attract, support, and retain highly qualified, highly effective special education teachers.
Registration: https://tadnet.ilinc.com/register/fmcbsby

Professional Learning Communities Updates
- Disband: Secondary Education & Transition PLC (includes Adolescent Literacy)
- Potential Merge: NCRTI RTI CoP Calls & SPDG RtI/Multi-Tiered Models of Intervention PLC

State Grantees Profile – Desktop Share

Jennifer Coffey, Ph.D.
SPDG Program Lead
August 30,

- Performance Measurement 1: Projects use evidence-based professional development practices to support the attainment of identified competencies.
- Performance Measurement 2: Participants in SPDG professional development demonstrate improvement in implementation of SPDG-supported practices over time.

- Performance Measurement 3: Projects use SPDG professional development funds to provide follow-up activities designed to sustain the use of SPDG-supported practices. (Efficiency Measure)
- Performance Measurement 4: Highly qualified special education teachers who have participated in SPDG-supported special education teacher retention activities remain as special education teachers two years after their initial participation in these activities.

- 2007 grantees will not be using the new program measures
- Everyone else will have 1 year for practice
› Grantees will use the revised measures this year for their APR
› This continuation report will be a pilot
- OSEP will learn from this round of reports and make changes as appropriate

Performance Measurement 2: Participants in SPDG professional development demonstrate improvement in implementation of SPDG-supported practices over time.

Fidelity of implementation is traditionally defined as “the extent to which the user’s current practice matches the ideal” (Loucks, 1983).

- Dusenbury, Brannigan, Falco, & Hansen (2003)
- Dane & Schneider (1998)
- O’Donnell (2005)
- Blase, “Innovation Fluency” presentation
- Mowbray, Holter, Teague, & Bybee (2003)

- “All five studies consistently showed statistically significantly higher outcomes when the program was implemented with greater fidelity.”
- “The studies reviewed here suggest that fidelity of implementation is more probable when an intervention manual is in place that clearly defines the critical components of the intervention and articulates a theory.”
- “Distinctions should be made between measuring fidelity to the structural components of a curriculum intervention and fidelity to the processes that guide its design.”

Evaluation Drives ERIA’s Evidence-Based Practices
The Program Guide, a 16-page booklet, explicitly addresses both implementation and intervention practices to guide the design of a site-based program.
The Implementation Rubric is a 10-item instrument which provides a framework for trainers, coaches, site team members, and teachers to evaluate and discuss implementation, fidelity, and next steps.
Some additional tools include:
- end-of-event training surveys and three-month follow-ups
- feedback and support from cohort coaches and site team
- fidelity observations
- student data

ERIA’s Evidence-Based Practices
The Program Guide articulates a comprehensive set of practices for all stakeholders.
Implementation Practices:
- Initial Training
- Team-based Site-level Practice and Implementation
- Implementation Rubric facilitates self-evaluation
- Ongoing Coaching
- Booster Trainings
- Implementation Rubric reflection on next steps
Intervention Practices:
- The 5 Steps of ERIA
- Data-informed Decision-making
- Screening and Assessment
- Progress Monitoring
- Tiered Interventions and Learning Supports
- Enhanced Literacy Instruction

- Projects will report on the same initiatives they report on for Program Measure 1
- Each initiative should have a fidelity measure that notes the presence or absence of the core features of the innovation/program/system that the initiative is focused on

- Use implementation measures that have already been created
› For example – the new RTI implementation measure presented to the RTI PLC
› Literacy implementation – Planning and Evaluation Tool – Revised (PET-R)
› Schoolwide Evaluation Tool (SET)
› Others?

- To develop fidelity criteria, researchers often reported starting with a curriculum profile or analysis that outlined the critical components of the intervention along with an indication of the range of variations for acceptable use. The researcher or developer then outlined acceptable ranges of variation (Songer & Gotwals, 2005).
- A component checklist was then developed to record fidelity to these components (Hall & Loucks, 1977).
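For illustration, a checklist like this is straightforward to represent and score; the sketch below is hypothetical (Python, with placeholder component names), not a tool from the presentation:

    # A minimal sketch of a component checklist in the Hall & Loucks sense:
    # record the presence or absence of each critical component, then report
    # the share in place. Component names are hypothetical placeholders.
    checklist = {
        "Critical component 1 observed": True,
        "Critical component 2 observed": True,
        "Critical component 3 observed": False,
        "Critical component 4 observed": True,
    }

    in_place = sum(checklist.values())
    print(f"{in_place}/{len(checklist)} core components in place "
          f"({in_place / len(checklist):.0%})")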

What is “it”?
Operationalize
Part of Speech: verb
Definition: to define a concept or variable so that it can be measured or expressed quantitatively
(Webster’s New Millennium™ Dictionary of English, Preview Edition (v 0.9.7), Copyright © Lexico Publishing Group, LLC)
The “it” must be operationalized whether it is:
» An Evidence-Based Practice or Program
» A Best Practice Initiative or New Framework
» A Systems Change Initiative
Practice Profiles
» Help Operationalize Practice, Program, and Systems Features

Searching for “It”
- Research findings, materials, manuals, and journal articles do not necessarily provide clarity around core intervention elements
- Current and new evidence-based practices, frameworks, and programs will have a range of operational specificity
- Developing clarity around the “it” is critical

Practice Profile
Defining “it” Through the Development and Use of Practice Profiles
- Guiding Principles identified
- Critical Components articulated
(Hall and Hord, 2010, Implementing Change: Patterns, Principles, and Potholes (3rd Edition); adapted from work of the Iowa Area Education Agency)

Practice Profile
Defining “it” Through the Development and Use of Practice Profiles
- Guiding Principles identified
- Critical Components articulated
- For each critical component:
  - Identified gold standard
  - Identified acceptable variations in practice
  - Identified ineffective practices and undesirable practices
(Hall and Hord, 2010, Implementing Change: Patterns, Principles, and Potholes (3rd Edition); adapted from work of the Iowa Area Education Agency)

????
Have you ever developed or helped to develop a Practice Profile or Innovation Configuration?
Vote Now:
» Yes
» No

Practice Profiles: Pay Now or Pay Later
- Identifies Critical Components
  - Guiding Principles
  - Critical Components Match the Guiding Principles
  - Core Activities to Achieve the Critical Components
- For each Critical Component:
  - Identified “gold standard” activities
  - Identified acceptable variations in practice
  - Identified ineffective practices and undesirable practices
- Your Implementation Support
» Identify and Support Implementation Team
» Provide Conceptual Overview and Rationales
» Provide Resources, Worksheets, Templates
» Facilitate Consensus Building
» Capacity Building

Resources for Building Practice Profiles
- National Centers
- Experts in Your State
- National Purveyors
- Manuals and Materials
- Implementing Districts and Schools
- Other States
- Consensus Building in Your State

Example
Problem-Solving Practice Profiles in an RtI Framework
RESOURCE: Professional Practices in Problem Solving: Benchmarks and Innovation Configurations ~ Iowa Area Education Agency Directors of Special Education, 1994

Practice Profiles  Each Critical Component is a heading  Each level of implementation specifies the activities necessary to operationalize that Critical Component Critical Component Ideal Implementation Acceptable Variation Unacceptable Variation Critical Component 1: Description Description of implementer behavior Drastic Mutation Hall and Hord, 2010, Implementing Change: Patterns, Principles, and Potholes (3rd Edition) and Adapted from work of the Iowa Area Education Agency 34

Professional Problem Solving: 9 Critical Components
- Parent Involvement
- Problem Statement
- Systematic Data Collection
- Problem Analysis
- Goal Development
- Intervention Plan Development
- Intervention Plan Implementation
- Progress Monitoring
- Decision Making
(Professional Practices in Problem Solving: Benchmarks and Innovation Configurations, Iowa Area Education Agency Directors of Special Education, 1994)

Professional Problem Solving: Parent Involvement as a Critical Component
(Professional Practices in Problem Solving: Benchmarks and Innovation Configurations, Iowa Area Education Agency Directors of Special Education, 1994)

Professional Problem Solving: Parent Involvement – Critical Components

Things to Think About
- Think about your SPDG effort and your involvement and guidance at the State, District, and School levels.
- Currently, our SPDG work is well operationalized?
…At the Classroom level
» _Strongly Agree _Agree _Disagree _Strongly Disagree
…At the District level
…At the Regional level
…At the State level

Michigan’s Practice Profile: Building Leadership Team Example

California’s Evaluation Tool: Implementation Rubric
- The 10 items focus mostly on intervention practices, with site-team and fidelity items as well
- The overall tool, and the process of how the rubric is used, drives the implementation practices
- Sites self-evaluate and reflect on learning and implementation; results are shared with coaches and trainers to guide activities
- Evaluates the fidelity of implementation of both the PD model and the interventions
- The former 26-item, 3-point ERIA Checklist lacked the specificity to be meaningful and useful

Implementation Rubric, Adapted from “Goal Attainment Scales”
Amy Gaumer Erickson and Monica Ballay presented “goal attainment scales” on a June 17 SIGnetwork webinar.
The rubric explicitly describes 5 implementation levels for each of 10 items:
- Levels 1, 2, and 3 reflect the “Not started,” “In progress,” and “Achieved” implementation levels of the former checklist.
- Levels 4 and 5 detail concrete steps toward optimal implementation, beyond the basics.
Each implementation level for each item is explicitly described, building more meaning into the tool than the previous checklist format allowed.
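To make the structure concrete, here is a hypothetical Python sketch of scoring such a rubric (the ratings and the labels for levels 4 and 5 are illustrative, not California’s actual wording):

    # A minimal sketch of a 10-item, 5-level goal-attainment-style rubric.
    # Labels for levels 4 and 5 are illustrative placeholders.
    LEVEL_LABELS = {1: "Not started", 2: "In progress", 3: "Achieved",
                    4: "Beyond basics", 5: "Optimal implementation"}

    ratings = [3, 4, 2, 5, 3, 3, 4, 2, 3, 4]  # one 1-5 rating per rubric item
    assert len(ratings) == 10 and all(1 <= r <= 5 for r in ratings)

    mean_level = sum(ratings) / len(ratings)
    print(f"Mean implementation level: {mean_level:.1f} "
          f"(~{LEVEL_LABELS[round(mean_level)]})")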

Implementation Rubric Excel File: Multi-Year Tracking and Automated Reports
The same file is used in all three years of ERIA, reporting both the trend and the most recent entries.
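As a rough illustration of that kind of multi-year tracking (a hypothetical Python/pandas sketch with invented data, not ERIA’s actual Excel file):

    # A minimal sketch of multi-year tracking: one row per site per year,
    # reporting both the trend and the most recent entry per site.
    import pandas as pd

    df = pd.DataFrame({
        "site": ["Site A", "Site A", "Site A", "Site B", "Site B"],
        "year": [1, 2, 3, 1, 2],
        "rubric_score": [2.1, 3.0, 3.8, 1.8, 2.6],
    })

    trend = df.pivot(index="site", columns="year", values="rubric_score")
    latest = df.sort_values("year").groupby("site").tail(1)
    print(trend)    # score by year, per site
    print(latest)   # most recent entry per site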

ERIA on the Web:
Li Walter:
Alan Wood: (707)

- Observations may be crucial because teachers are known to be biased in their reports (Hansen and McNeal, 1999).
- Given the frequency with which adaptations are observed in research and practice, program developers need to anticipate how and when teachers will modify programs, and develop guidelines and recommendations to ensure program goals are met (Dusenbury, Brannigan, Hansen, Walsh, & Falco, 2005).

The project will set its own benchmarks for professional development participants 1 year into training/assistance, 2 years in, 3 years in, and 4 years in.
- For example: 1-year benchmark = 40% of core features in place; 4-year benchmark = 80% of features in place
The project will then determine what percentage of participants it expects to reach each benchmark (e.g., 80% of participants).
a. Participants could be individual teachers (if working with just a few teachers or another type of professional per school or district) or could be a school (if working on a school-wide basis, such as RTI or PBIS)
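Checking a benchmark of this kind is simple arithmetic; here is a hypothetical Python sketch with invented data:

    # A minimal sketch of the benchmark check described above: the share of
    # participants with at least 40% of core features in place at year 1,
    # compared against an expected 80% of participants. Data are invented.
    def share_meeting_benchmark(scores, threshold):
        """scores: (features_in_place, features_total) per participant."""
        met = sum(1 for in_place, total in scores if in_place / total >= threshold)
        return met / len(scores)

    year1 = [(4, 10), (5, 10), (4, 10), (6, 10), (4, 10),
             (2, 10), (7, 10), (4, 10), (5, 10), (3, 10)]
    share = share_meeting_benchmark(year1, 0.40)
    print(f"{share:.0%} met the year-1 benchmark")  # 80% here, meeting the target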

- Self-assessment is acceptable, but projects will need to sample from the group to validate the self-assessment.
a. For example, if 15 schools were being measured, someone from the project would observe at least 3 (one-fifth) of the schools and compare their assessment with the self-assessment.
- A baseline wouldn’t be necessary.
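A hypothetical Python sketch of that validation step, with simulated scores standing in for real self-assessments and observations:

    # A minimal sketch of the validation step above: sample one-fifth of the
    # self-assessing schools for an independent observation, then compare the
    # two fidelity scores. All scores here are simulated placeholders.
    import random

    random.seed(0)
    self_scores = {f"School {i}": round(random.uniform(0.3, 0.9), 2)
                   for i in range(1, 16)}  # 15 self-assessed schools

    sampled = random.sample(sorted(self_scores), k=len(self_scores) // 5)  # 3 of 15

    # In practice the observed score comes from a project staff visit; it is
    # simulated here so the sketch runs end to end.
    observed = {s: round(min(1.0, max(0.0, self_scores[s] + random.uniform(-0.1, 0.1))), 2)
                for s in sampled}

    for school in sampled:
        gap = abs(self_scores[school] - observed[school])
        print(f"{school}: self={self_scores[school]:.2f} "
              f"observed={observed[school]:.2f} gap={gap:.2f}")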

Questions?
- What kind of guidance would be helpful to you?
- What challenges do you foresee in capturing these data?