The Revised SPDG Program Measures: An Overview

The Revised SPDG Program Measures: An Overview
Jennifer Coffey, Ph.D., SPDG Program Lead
August 30, 2011

Performance Measures
Performance Measurement 1: Projects use evidence-based professional development practices to support the attainment of identified competencies.
Performance Measurement 2: Participants in SPDG professional development demonstrate improvement in implementation of SPDG-supported practices over time.

Performance Measurement 3: Projects use SPDG professional development funds to provide follow-up activities designed to sustain the use of SPDG-supported practices. (Efficiency Measure)
Performance Measurement 4: Highly qualified special education teachers who have participated in SPDG-supported special education teacher retention activities remain as special education teachers two years after their initial participation in these activities.
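Program Measure 3 is labeled an efficiency measure. As an illustrative sketch only (it assumes the measure is reported as the share of SPDG professional development funds devoted to follow-up activities; confirm the exact calculation in the OSEP measurement guidance), the arithmetic could look like this:

```python
def follow_up_efficiency(pd_funds_total: float, pd_funds_follow_up: float) -> float:
    """Proportion of SPDG professional development funds spent on follow-up
    activities designed to sustain SPDG-supported practices (illustrative only)."""
    return pd_funds_follow_up / pd_funds_total

# Hypothetical figures for one reporting year
print(f"{follow_up_efficiency(200_000, 58_000):.1%} of PD funds spent on follow-up")
```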

Continuation Reporting
2007 grantees will not be using the new program measures
Everyone else will have 1 year for practice
Grantees will use the revised measures this year for their APR
This continuation report will be a pilot: OSEP will learn from this round of reports and make changes as appropriate
Your feedback will be appreciated
You may continue to report on the old program measures, if you like

Change is hard! This pace is much easier.

The Year in Review & Looking Forward
Opportunities for OSEP and the Office of Planning, Evaluation, and Policy Development to hear from you
4+ calls for you to give feedback
Small group to discuss revising the program measures and creating methodology

Thank You
To all who joined the large group discussions
To the small working group members: Patti Noonan, Jim Frasier, Susan Williamson, Nikki Sandve, Li Walter, Ed Caffarella, Jon Dyson, Julie Morrison

Opportunities to Learn and Share
Monthly webinars – “Directors’ Calls”
Professional Development Series
Evaluator Community of Practice
Resource Library
Regional Meetings
Project Directors’ Conference
PLCs

Looking Forward
Directors’ Calls focused on the specific measures and overall “how-to”
Please provide feedback about the information and assistance you need
Written guidance & tools to assist you
Continuation reporting guidance
Webinar that will focus on the program measures
We will be learning from you about necessary flexibility (feedback loop)

Performance Measure #1
Projects use evidence-based professional development practices to support the attainment of identified competencies.

SPDG Regional Meeting Resources
To view the SPDG Regional Meeting materials, go to: http://signetwork.org/content_pages/27

2011 Professional Development Series
Go to the home page to link to each webinar segment: http://signetwork.org

Evidence-based Professional Development
Models of and Evaluating Professional Development
Date: January 12, 3:00-4:30pm ET
Speakers: Julie Morrison, Alan Wood, & Li Walter (SPDG evaluators)
SPDG Regional Meetings, Topic: Evidence-based Professional Development

Evidence-based PD
Innovation Fluency
Date: March 24, 3:00-4:30pm ET
Speaker: Karen Blase, SISEP
Professional Development for Administrators
Date: April 19, 3:00-4:30pm ET
Speakers: Elaine Mulligan, NIUSI Leadscape; Rich Barbacane, National Association of Elementary School Principals
Using Technology for Professional Development
Date: May 18, 2:00-3:30pm ET
Speaker: Chris Dede, Ph.D., Learning Technologies at Harvard's Graduate School of Education

Evidence-Based Practices
Two types of evidence-based practices:
Evidence-Based Intervention Practices
  Insert your SPDG initiative here (identified competencies)
Evidence-Based Implementation Practices
  Professional development
  Staff competence: Selection, Training, Coaching, and Performance Assessment drivers
  Adult learning methods/principles
  Evaluation

How?
Adapted the Implementation Drivers model, adding evaluation tools as implementation drivers of the PD model
Needed to integrate all of the elements

CA: ERIA’s Evidence-based Practices
The Program Guide articulates a comprehensive set of practices for all stakeholders.
Implementation Practices:
  Initial training
  Team-based, site-level practice and implementation
  Implementation Rubric facilitates self-evaluation
  Ongoing coaching
  Booster trainings
  Implementation Rubric reflection on next steps
Intervention Practices (The 5 Steps of ERIA):
  Data-informed decision-making
  Screening and assessment
  Progress monitoring
  Tiered interventions and learning supports
  Enhanced literacy instruction

CA: Two Integrative Evaluation Tools Serve as Implementation Drivers
Program Guide:
  articulates the PD model
  introduces and illustrates
  contextualizes the training
  gets away from “you had to be there”
Implementation Rubric:
  operationalizes the PD model
  drives ongoing implementation
  enables fidelity checks
  can itself be evaluated
Everyone is on the same page
Sustainability (beyond funding, staff turnover)
Scale-up (recruit new sites/districts, beyond SPDG)
Diversity of approaches enabled

Best Practices in Training
Training must be:
  Timely
  Theory grounded (adult learning)
  Skill-based
Information from Training feeds back to Selection and feeds forward to Coaching
(Driver sequence: Selection, Training, Coaching)
(Blase, Van Dyke, & Fixsen, 2010)

Best Practices in Coaching
  Design a Coaching Service Delivery Plan
  Develop accountability structures for coaching – Coach the Coach!
  Identify ongoing professional development for coaches
(Driver sequence: Training, Coaching, Performance Assessment)
(Blase, Van Dyke, & Fixsen, 2010)

Best Practices in Performance Assessment (Fidelity)
  Must be a transparent process
  Use of multiple data sources
  Fidelity of implementation should be assessed at the local, regional, and state levels
  Tied to positive recognition
  Information from this driver feeds back to Selection, Training, and Coaching and feeds forward to the Organization Drivers
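The rubric and the measurement guidance define how fidelity is actually scored; purely as an illustrative sketch (the data sources, items, and equal weighting below are hypothetical, not a prescribed SPDG method), combining multiple data sources into one fidelity summary might look like this:

```python
def fidelity_score(sources):
    """Average, across data sources, of the proportion of checklist items
    rated 'in place'. Each source is a list of booleans (item observed or not)."""
    per_source = [sum(items) / len(items) for items in sources.values()]
    return sum(per_source) / len(per_source)

# Hypothetical data sources for one school
sources = {
    "direct_observation": [True, True, False, True, True],
    "coach_log_review":   [True, False, True, True],
    "self_assessment":    [True, True, True, False, True, True],
}
print(f"Fidelity: {fidelity_score(sources):.0%}")
```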

Why focus on professional development?
“No intervention practice, no matter what its evidence base, is likely to be learned and adopted if the methods and strategies used to teach or train students, practitioners, parents, or others are not themselves effective.”
“Let's Be Pals: An Evidence-based Approach to Professional Development,” Dunst & Trivette, 2009

Using Research Findings to Inform Practical Approaches to Evidence-Based Practices
Carl J. Dunst, Ph.D., and Carol M. Trivette, Ph.D., Orelena Hawks Puckett Institute, Asheville and Morganton, North Carolina
Presentation prepared for a webinar with the Knowledge Transfer Group, U.S. Department of Health and Human Services, Children's Bureau Division of Research and Innovation, September 22, 2009

“Adult learning refers to a collection of theories, methods, and approaches for describing the characteristics of and conditions under which the process of learning is optimized.”

Six Characteristics Identified in How People Learn* Were Used to Code and Evaluate the Adult Learning Methods
Planning
  Introduce: Engage the learner in a preview of the material, knowledge, or practice that is the focus of instruction or training
  Illustrate: Demonstrate or illustrate the use or applicability of the material, knowledge, or practice for the learner
Application
  Practice: Engage the learner in the use of the material, knowledge, or practice
  Evaluate: Engage the learner in a process of evaluating the consequence or outcome of the application of the material, knowledge, or practice
Deep Understanding
  Reflection: Engage the learner in self-assessment of his or her acquisition of knowledge and skills as a basis for identifying “next steps” in the learning process
  Mastery: Engage the learner in a process of assessing his or her experience in the context of some conceptual or practical model or framework, or some external set of standards or criteria
* Donovan, M. et al. (Eds.) (1999). How people learn. Washington, DC: National Academy Press.

Additional Translational Synthesis Findings
  The smaller the number of persons participating in a training (<20), the larger the effect sizes for the study outcomes.
  The more hours of training over an extended number of sessions, the better the study outcomes.
  The practices are similarly effective when used in different settings with different types of learners.

Effect Sizes for Introducing Information to Learners
Mean effect size (d) with 95% confidence interval; counts are numbers of studies and effect sizes.
  Pre-class exercises (n = 9): d = 1.02, CI .63 to 1.41
  Out-of-class activities/self-instruction (12 studies, 20 effect sizes): d = .76, CI .44 to 1.09
  Classroom/workshop lectures (26 studies, 108 effect sizes): d = .68, CI .47 to .89
  Dramatic readings (18 studies, 40 effect sizes): d = .35, CI .13 to .57
  Imagery (n = 7): d = .34, CI .08 to .59
  Dramatic readings/imagery (4 studies, 11 effect sizes): d = .15, CI -.33 to .62

Effect Sizes for Illustrating/Demonstrating a Learning Topic
  Using learner input for illustration (n = 6): d = .89, CI .28 to 1.51
  Role playing/simulations (20 studies, 64 effect sizes): d = .87, CI .58 to 1.17
  Real life example/real life + role playing (n = 10): d = .67, CI .27 to 1.07
  Instructional video (5 studies, 49 effect sizes): d = .33, CI .09 to .59

Effect Sizes for Learner Application
  Real life application + role playing (5 studies, 20 effect sizes): d = 1.10, CI .48 to 1.72
  Problem solving tasks (16 studies, 29 effect sizes): d = .67, CI .39 to .95
  Real life application (17 studies, 83 effect sizes): d = .58, CI .35 to .81
  Learning games/writing exercises (9 studies, 11 effect sizes): d = .55, CI .11 to .99
  Role playing (skits, plays) (n = 35): d = .41, CI .21 to .62

Effect Sizes for Learner Evaluation
  Assess strengths/weaknesses (14 studies, 48 effect sizes): d = .96, CI .67 to 1.26
  Review experience/make changes (19 studies, 35 effect sizes): d = .60, CI .36 to .83

Effect Sizes for Learner Reflection
  Performance improvement (9 studies, 34 effect sizes): d = 1.07, CI .69 to 1.45
  Journaling/behavior suggestion (8 studies, 17 effect sizes): d = .75, CI .49 to 1.00
  Group discussion about feedback (16 studies, 29 effect sizes): d = .67, CI .39 to .95

Effect Sizes for Self-Assessment of Learner Mastery
  Standards-based assessment (13 studies, 44 effect sizes): d = .76, CI .42 to 1.10
  Self-assessment (16 studies, 29 effect sizes): d = .67, CI .39 to .95
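For readers who want to see the arithmetic behind a “mean effect size with a 95% confidence interval,” here is a minimal sketch. It uses an unweighted mean, a normal approximation, and made-up d values; the synthesis summarized above may have used different aggregation methods.

```python
import math

def mean_effect_size_ci(d_values, z=1.96):
    """Unweighted mean of study-level effect sizes (d) with a normal-approximation
    95% confidence interval. Illustrative only."""
    n = len(d_values)
    mean_d = sum(d_values) / n
    # Standard error of the mean from the sample standard deviation
    variance = sum((d - mean_d) ** 2 for d in d_values) / (n - 1)
    se = math.sqrt(variance / n)
    return mean_d, (mean_d - z * se, mean_d + z * se)

# Hypothetical effect sizes for one practice (not data from the tables above)
mean_d, (lo, hi) = mean_effect_size_ci([0.6, 1.0, 0.8, 1.2, 0.9])
print(f"mean d = {mean_d:.2f}, 95% CI = {lo:.2f} to {hi:.2f}")
```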

Summary of Training Findings
  To be most effective, training needs to actively involve the learners in judging the consequences of their learning experiences (evaluate, reflection, & mastery)
  Need learner participation in learning new knowledge or practices
  Need learner engagement in judging his or her experience in learning and using new material

Innovation Fluency
Definition: Innovation Fluency refers to the degree to which we know the innovation with respect to:
  Evidence
  Program and practice features
  Implementation requirements

Implementation Pre-Requisites
After you have:
  Chosen based on student needs
  Looked for “best evidence” to address the need
    An evidence-based practice or program
    An evidence-informed initiative or framework
    Systems change and its elements
Then it's time to: Clearly identify and operationalize the elements
(c) Dean Fixsen and Karen Blase, 2010

Professional Problem Solving: 9 Critical Components
  Parent Involvement
  Problem Statement
  Systematic Data Collection
  Problem Analysis
  Goal Development
  Intervention Plan Development
  Intervention Plan Implementation
  Progress Monitoring
  Decision Making
Innovation configuration format: for each critical component, a description of implementer behavior for ideal implementation, acceptable variation, and unacceptable variation.
Professional Practices in Problem Solving: Benchmarks and Innovation Configurations, Iowa Area Education Agency Directors of Special Education, 1994; (c) Dean Fixsen and Karen Blase, 2004

Interaction of Leadership and Implementation Support Drivers Regarding Administrators
Purpose: To develop project Capacity (e.g., data systems, information resources, incentives) and Competency (e.g., selection, training, coaching) so administrators can implement practices with success
  Project level providing leadership: develop systems for district and building administrators to implement practices with success
  District level providing leadership: develop systems for building administrators to implement practices with success
  Building level providing leadership: develop systems for building staff to implement practices with success

MiBLSi Statewide Structure of Support (cascade: who is supported, and how support is provided)
  Michigan Department of Education/MiBLSi Leadership supports multiple district/building teams across the state: provides guidance, visibility, funding, and political support for MiBLSi
  ISD Leadership Team (regional technical assistance) supports multiple schools within an intermediate district: provides coaching for District Teams and technical assistance for Building Teams
  LEA District Leadership Team supports multiple schools within the local district: provides guidance and manages implementation
  Building Leadership Team supports all staff
  Building Staff support all students: provide effective practices to support students
  Students: improved behavior and reading

Developing Capacity Through “Manualization”
Manuals are created to provide information and tools for implementation, at various levels:
  District level
  Building level

Developing Capacity Through “Practice Profiles” (Implementation Guides)
Implementation Guides have been developed for:
  Positive Behavioral Interventions and Supports at the building level
  Reading supports at the building level
  Building Leadership Team
  District Leadership Team
Quick Guides have been developed for:
  Principals
  Coaches

Practice Profile: Building Leadership Team Example

To Capture These Professional Development Elements
  Created a rubric for evidence-based professional development
  Implementation drivers = domains
  Each domain has components
  Each component will be measured by a panel of external evaluators
  Evaluators will likely be chosen from the US Department of Education

Rules for developing good rubrics (Zhang & Fiore, 2011)
Process:
  Determine what products/practices will be evaluated.
  Define each dimension and associated indicators.
  Determine a scale for describing the range of products/practices.
  Write descriptors for each of the categories.
  Pilot test with users and revise (iterative process).
  Train the evaluators.
  Check inter-rater reliability.
  Finalize the rubric and share with evaluatees.
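“Check inter-rater reliability” can be operationalized in several ways. A minimal sketch (hypothetical scores, not part of the Zhang & Fiore guidance) computing percent agreement and Cohen's kappa for two external evaluators rating the same rubric components:

```python
from collections import Counter

def percent_agreement(rater_a, rater_b):
    """Share of components on which the two raters assigned the same score."""
    matches = sum(a == b for a, b in zip(rater_a, rater_b))
    return matches / len(rater_a)

def cohens_kappa(rater_a, rater_b):
    """Agreement corrected for chance, for categorical (1-4) rubric scores."""
    n = len(rater_a)
    observed = percent_agreement(rater_a, rater_b)
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    categories = set(rater_a) | set(rater_b)
    expected = sum((counts_a[c] / n) * (counts_b[c] / n) for c in categories)
    return (observed - expected) / (1 - expected)

# Hypothetical scores from two evaluators on eight rubric components
a = [4, 3, 3, 2, 4, 3, 1, 2]
b = [4, 3, 2, 2, 4, 3, 2, 2]
print(percent_agreement(a, b), cohens_kappa(a, b))
```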

Use of the Rubric
  4 domains, each with 6 components: Selection, Training, Coaching, Performance Assessment
  Components from the National Implementation Research Network, Learning Forward (NSDC), Guskey
  Each component of the domains will be rated from 1 to 4

Component Themes
  Assigning responsibility for major professional development functions (e.g., measuring fidelity and outcomes; monitoring coaching quality)
  Expectations stated for all roles and responsibilities (e.g., PD participants, trainers, coaches, school & district administrators)
  Data for each stage of PD (e.g., selection, training, implementation, coaching, outcomes)

PD Needed for SPDG Project Personnel
  Adult learning principles for coaches
  More on adult learning principles for training
  How to create a professional development plan
  How to describe the elements of the professional development plan

What Initiatives will you report on? (from project feedback) 1. “If a SPDG has 1 Goal/Initiative, they report on the performance measure for it. If a SPDG has two Goals/Initiatives, they report on one. If they have three, they report on two. If they have four, they report on two. The pattern would be that SPDGs would report on half of their Goals/Initiatives if they have an even number of them and would report on 2 of 3, 3 of 5, 4 of 7, etc for odd numbers. This is a simple method which could easily be applied.”

“An alternative would be to have SPDGs report on the measures for any Goals/Initiatives which involve PD which includes workshops/conferences designed not to just impart knowledge (Awareness Level of Systems Change Theory) but to implement an evidence based practice (e.g. SWPBIS, Reading Strategies, Math Strategies, etc.) …”

“I think having the OSEP Project Officers assigned to the various states negotiate with their respective state SPDG directors which of their SPDG initiatives are appropriate for this measure. This could be done each year immediately after the annual project report is submitted to OSEP by the SPDG Directors via a phone call and email exchanges. If the negotiation could be completed in the early summer, individual meetings (if necessary) could be conducted at the OSEP Project Directors’ Conference in July. This type of negotiation could provide OSEP Project Officers with information necessary for making informed decisions about each state SPDG award for the upcoming year.”

We will do…
A combination of the three ideas:
  We will only have you report on those initiatives that lead to implementation (of the practice/program you are providing training on).
  If you have 1 or 2 of these initiatives, you will report on both.
  If you have 3, you will report on 2. If you have 4, you will report on 2 (report on 3 if you have 5, and so on).
  This is all per discussion with your Project Officer.
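Read as a formula, the rule above amounts to: report on all implementation-focused initiatives when there are one or two, otherwise on roughly half, rounded up. This is an interpretation of the slide, not official OSEP guidance:

```python
import math

def initiatives_to_report(total_implementation_initiatives: int) -> int:
    """Number of implementation-focused initiatives to report on, following the
    pattern described above (1 or 2 -> all; 3 -> 2; 4 -> 2; 5 -> 3; ...)."""
    if total_implementation_initiatives <= 2:
        return total_implementation_initiatives
    return math.ceil(total_implementation_initiatives / 2)

print([initiatives_to_report(n) for n in range(1, 8)])  # [1, 2, 2, 2, 3, 3, 4]
```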

Setting Benchmarks
“Perhaps an annual target could be that at least 60% of all practices receive at least a score of 3 or 4 in the first year of a five year SPDG funding cycle; at least 70% of all practices receive at least a score of 3 or 4 in the second year of funding; at least 80% of all practices receive at least a score of 3 or 4 in the third year of funding; at least 90% of all practices receive at least a score of 3 or 4 in the fourth and fifth year of funding.”

We will do…
This basic idea:
  1st year of funding: baseline
  2nd year: 50% of components will have a score of 3 or 4
  3rd year: 70% of components will have a score of 3 or 4
  4th year: 80% of components will have a score of 3 or 4
  5th year: 80% of components will have a score of 3 or 4 (maintenance year)
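A minimal sketch of how a project might check its rubric scores against the annual benchmark (the component scores below are hypothetical; with 4 domains of 6 components each, there would be 24 scores):

```python
def meets_benchmark(component_scores, target_proportion):
    """Return the share of rubric components scored 3 or 4 and whether it meets
    the annual target (e.g., 0.50 in year 2, 0.70 in year 3, 0.80 in years 4-5)."""
    high = sum(1 for score in component_scores if score >= 3)
    proportion = high / len(component_scores)
    return proportion, proportion >= target_proportion

# Hypothetical year-2 ratings across the 24 components (4 domains x 6 components)
scores = [3, 4, 2, 3, 1, 4, 3, 2, 2, 3, 4, 3, 2, 1, 3, 4, 2, 3, 3, 2, 4, 3, 2, 2]
print(meets_benchmark(scores, 0.50))
```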

Other feedback
  Have projects fill out a worksheet with descriptions of the elements of their professional development system.
    We will do this, and have the panel of evaluators work from this worksheet and any supporting documents the project provides.
  Provide exemplars.
    We will create practice profiles for each component to demonstrate what would receive a 4, 3, 2, or 1 rating.

Ideas for Guidance
  “It would be helpful to have a few rows [of the rubric] completed as an example with rating scores provided.”
    We will do this.
  Other ideas?