The Revised SPDG Program Measures: An Overview


1 The Revised SPDG Program Measures: An Overview
Jennifer Coffey, Ph.D., SPDG Program Lead
August 30, 2011

2 Performance Measures
Performance Measure 1: Projects use evidence-based professional development practices to support the attainment of identified competencies.
Performance Measure 2: Participants in SPDG professional development demonstrate improvement in implementation of SPDG-supported practices over time.

3 Performance Measure 3: Projects use SPDG professional development funds to provide follow-up activities designed to sustain the use of SPDG-supported practices. (Efficiency Measure)
Performance Measure 4: Highly qualified special education teachers who have participated in SPDG-supported special education teacher retention activities remain as special education teachers two years after their initial participation in these activities.

4 Continuation Reporting
2007 grantees will not be using the new program measures; everyone else will have 1 year for practice.
Grantees will use the revised measures this year for their APR; this continuation report will be a pilot.
OSEP will learn from this round of reports and make changes as appropriate. Your feedback will be appreciated.
You may continue to report on the old program measures, if you like.

5 Change is hard! This pace is much easier:

6 The Year in Review & Looking Forward
Opportunities for OSEP and the Office of Planning, Evaluation, and Policy Development to hear from you:
4+ calls for you to give feedback
A small group to discuss revising the program measures and creating methodology

7 Thank You
To all who joined the large group discussions
To the small working group members: Patti Noonan, Jim Frasier, Susan Williamson, Nikki Sandve, Li Walter, Ed Caffarella, Jon Dyson, and Julie Morrison

8 Opportunities to Learn and Share
Monthly webinars (“Directors’ Calls”)
Professional Development Series
Evaluator Community of Practice
Resource Library
“Regional Meetings”
Project Directors’ Conference
PLCs

9 Looking Forward Directors’ Calls focused on the specific measures and overall “how-to” Please provide feedback about information and assistance you need Written guidance & tools to assist you Continuation reporting guidance Webinar that will focus on the program measures We will be learning from you about necessary flexibility (feedback loop)

10 Performance Measure #1 Projects use evidence-based professional development practices to support the attainment of identified competencies.

11 SPDG Regional Meeting Resources
To view the SPDG Regional Meeting materials, go to:

12 2011 Professional Development Series
Go to the home page for links to each webinar segment:

13 Evidence-based Professional Development
Models of and Evaluating Professional Development
Date: January 12, 3:00-4:30pm ET
Speakers: Julie Morrison, Alan Wood, & Li Walter (SPDG evaluators)
SPDG Regional Meetings topic: Evidence-based Professional Development

14 Evidence-based PD
Innovation Fluency
Date: March 24, 3:00-4:30pm ET
Speaker: Karen Blase, SISEP

Professional Development for Administrators
Date: April 19, 3:00-4:30pm ET
Speakers: Elaine Mulligan, NIUSI Leadscape; Rich Barbacane, National Association of Elementary School Principals

Using Technology for Professional Development
Date: May 18, 2:00-3:30pm ET
Speaker: Chris Dede, Ph.D., Learning Technologies at Harvard’s Graduate School of Education

15 Evidence-Based Practices
Two Types of Evidence-Based Practices
Evidence-Based Intervention Practices: insert your SPDG initiative here (identified competencies)
Evidence-Based Implementation Practices:
Professional development
Staff competence drivers: Selection, Training, Coaching, and Performance Assessment
Adult learning methods/principles
Evaluation

16 How?
Needed to integrate all of the elements: adapted the Implementation Drivers model to add evaluation tools as implementation drivers of the PD model.

17 CA: ERIA’s Evidence-based Practices
The Program Guide articulates a comprehensive set of practices for all stakeholders.
Implementation Practices:
Initial Training (team-based, site-level)
Practice and Implementation (the Implementation Rubric facilitates self-evaluation)
Ongoing Coaching
Booster Trainings (Implementation Rubric reflection on next steps)
Intervention Practices (the 5 Steps of ERIA):
Data-informed Decision-making
Screening and Assessment
Progress Monitoring
Tiered Interventions and Learning Supports
Enhanced Literacy Instruction

18 CA: Two Integrative Evaluation Tools Serve as Implementation Drivers
Program Guide: articulates the PD model; introduces and illustrates; contextualizes the training; gets away from “you had to be there”
Implementation Rubric: operationalizes the PD model; drives ongoing implementation; enables fidelity checks; is possible to evaluate
Everyone is on the same page
Sustainability (beyond funding, staff turnover)
Scale-up (recruiting new sites/districts, beyond SPDG)
Diversity of approaches enabled

19 How?
Needed to integrate all of the elements: adapted the Implementation Drivers model to add evaluation tools as implementation drivers of the PD model.

20 Best Practices in Training
Training must be:
Timely
Theory grounded (adult learning)
Skill-based
Information from Training feeds back to Selection and feeds forward to Coaching (Selection → Training → Coaching).
(Blase, VanDyke, & Fixsen, 2010)

21 Best Practices in Coaching
Design a Coaching Service Delivery Plan
Develop accountability structures for coaching: coach the coach!
Identify ongoing professional development for coaches
(Training → Coaching → Performance Assessment)
(Blase, VanDyke, & Fixsen, 2010)

22 Best Practices in Performance Assessment (Fidelity)
Must be a transparent process
Use multiple data sources
Fidelity of implementation should be assessed at the local, regional, and state levels
Tied to positive recognition
Information from this driver feeds back to Selection, Training, and Coaching and feeds forward to the Organization Drivers.

23 Why focus on professional development?
“No intervention practice, no matter what its evidence base, is likely to be learned and adopted if the methods and strategies used to teach or train students, practitioners, parents, or others are not themselves effective.” (Dunst & Trivette, 2009, “Let’s Be Pals: An Evidence-Based Approach to Professional Development”)

24 Using Research Findings to Inform Practical Approaches to Evidence-Based Practices
Carl J. Dunst, Ph.D., and Carol M. Trivette, Ph.D.
Orelena Hawks Puckett Institute, Asheville and Morganton, North Carolina
Presentation prepared for a webinar with the Knowledge Transfer Group, U.S. Department of Health and Human Services, Children’s Bureau, Division of Research and Innovation, September 22, 2009

25 “Adult learning refers to a collection of theories, methods, and approaches for describing the characteristics of and conditions under which the process of learning is optimized.”

26 Six Characteristics Identified in How People Learn [a] Were Used to Code and Evaluate the Adult Learning Methods

Planning
Introduce: Engage the learner in a preview of the material, knowledge, or practice that is the focus of instruction or training.
Illustrate: Demonstrate or illustrate the use or applicability of the material, knowledge, or practice for the learner.

Application
Practice: Engage the learner in the use of the material, knowledge, or practice.
Evaluate: Engage the learner in a process of evaluating the consequence or outcome of the application of the material, knowledge, or practice.

Deep Understanding
Reflection: Engage the learner in self-assessment of his or her acquisition of knowledge and skills as a basis for identifying “next steps” in the learning process.
Mastery: Engage the learner in a process of assessing his or her experience in the context of some conceptual or practical model or framework, or some external set of standards or criteria.

[a] Donovan, M. et al. (Eds.) (1999). How people learn. Washington, DC: National Academy Press.

27 Additional Translational Synthesis Findings
The smaller the number of persons participating in a training (<20), the larger the effect sizes for the study outcomes. The more hours of training over an extended number of sessions, the better the study outcomes. The practices are similarly effective when used in different settings with different types of learners.

28 Effect Sizes for Introducing Information to Learners
Practice | Studies | Effect Sizes | Mean Effect Size (d)
Pre-class exercises | 9 | | 1.02
Out-of-class activities/self-instruction | 12 | 20 | .76
Classroom/workshop lectures | 26 | 108 | .68
Dramatic readings | 18 | 40 | .35
Imagery | 7 | | .34
Dramatic readings/imagery | 4 | 11 | .15
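A note for reading this and the following effect-size tables: the mean effect size d is a standardized mean difference. The deck does not state which estimator the synthesis used, so the formula below is the standard Cohen's d, shown only as reference:

```latex
% Standardized mean difference (Cohen's d) with a pooled standard deviation.
d = \frac{\bar{X}_{\text{trained}} - \bar{X}_{\text{comparison}}}{s_{\text{pooled}}},
\qquad
s_{\text{pooled}} = \sqrt{\frac{(n_1 - 1)s_1^2 + (n_2 - 1)s_2^2}{n_1 + n_2 - 2}}
```

Read this way, d = .68 for classroom/workshop lectures would mean the average trained participant scored about two-thirds of a pooled standard deviation above the comparison group.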

29 Effect Sizes for Illustrating/Demonstrating Learning Topic
Practice | Studies | Effect Sizes | Mean Effect Size (d)
Using learner input for illustration | 6 | | .89
Role playing/simulations | 20 | 64 | .87
Real-life example/real-life + role playing | 10 | | .67
Instructional video | 5 | 49 | .33

30 Effect Sizes for Learner Application
Practice | Studies | Effect Sizes | Mean Effect Size (d)
Real-life application + role playing | 5 | 20 | 1.10
Problem-solving tasks | 16 | 29 | .67
Real-life application | 17 | 83 | .58
Learning games/writing exercises | 9 | 11 | .55
Role playing (skits, plays) | 35 | | .41

31 Effect Sizes for Learner Evaluation
Practice | Studies | Effect Sizes | Mean Effect Size (d)
Assess strengths/weaknesses | 14 | 48 | .96
Review experience/make changes | 19 | 35 | .60

32 Effect Sizes for Learner Reflection
Practice | Studies | Effect Sizes | Mean Effect Size (d)
Performance improvement | 9 | 34 | 1.07
Journaling/behavior suggestion | 8 | 17 | .75
Group discussion about feedback | 16 | 29 | .67

33 Effect Sizes for Self-Assessment of Learner Mastery
Practice | Studies | Effect Sizes | Mean Effect Size (d)
Standards-based assessment | 13 | 44 | .76
Self-assessment | 16 | 29 | .67

34 Summary of Training Findings
To be most effective, training needs to:
Actively involve the learners in judging the consequences of their learning experiences (evaluate, reflection, and mastery)
Include learner participation in learning new knowledge or practices
Engage the learner in judging his or her experience in learning and using new material

35 Innovation Fluency
Definition: Innovation Fluency refers to the degree to which we know the innovation with respect to:
Evidence
Program and practice features
Implementation requirements

36 Implementation Pre-Requisites
After you have:
Chosen, based on student needs, an Evidence-Based Practice or Program, an Evidence-Informed Initiative or Framework, or Systems Change and Its Elements
Looked for “best evidence” to address the need
(c) Dean Fixsen and Karen Blase, 2010

37 Implementation Pre-Requisites
After you have:
Chosen, based on student needs, an Evidence-Based Practice or Program, an Evidence-Informed Initiative or Framework, or Systems Change and Its Elements
Looked for “best evidence” to address the need
Then it’s time to:
Clearly identify and operationalize the elements
(c) Dean Fixsen and Karen Blase, 2010

38 Professional Problem Solving: 9 Critical Components
Parent Involvement
Problem Statement
Systematic Data Collection
Problem Analysis
Goal Development
Intervention Plan Development
Intervention Plan Implementation
Progress Monitoring
Decision Making

Innovation configuration template: for each critical component, describe implementer behavior under Ideal Implementation, Acceptable Variation, and Unacceptable Variation.

Source: Professional Practices in Problem Solving: Benchmarks and Innovation Configurations. Iowa Area Education Agency Directors of Special Education, 1994. (c) Dean Fixsen and Karen Blase, 2004

39 Interaction of Leadership and Implementation Support Drivers Regarding Administrators
Purpose: to develop project capacity (e.g., data systems, information resources, incentives) and competency (e.g., selection, training, coaching) so administrators can implement practices with success.
Project level providing leadership: develop systems for district and building administrators to implement practices with success.
District level providing leadership: develop systems for building administrators to implement practices with success.
Building level providing leadership: develop systems for building staff to implement practices with success.

40 MiBLSi Statewide Structure of Support
Who supports whom, and how:
Michigan Department of Education/MiBLSi Leadership: supports multiple district/building teams across the state; provides guidance, visibility, funding, and political support for MiBLSi.
ISD Leadership Team (regional technical assistance): supports multiple schools within the intermediate district; provides coaching for District Teams and technical assistance for Building Teams.
LEA District Leadership Team: supports multiple schools within the local district; provides guidance and manages implementation.
Building Leadership Team: supports all staff in the building.
Building Staff: support all students; provide effective practices to support students.
Students: improved behavior and reading.

41 Developing Capacity Through “Manualization”
Manuals are created to provide information and tools for implementation at various levels:
District level
Building level

42 Developing Capacity Through “Practice Profiles” (Implementation Guides)
Implementation Guides have been developed for:
Positive Behavioral Interventions and Supports at the building level
Reading supports at the building level
The Building Leadership Team
The District Leadership Team
Quick Guides have been developed for:
Principals
Coaches

43 Practice Profile: Building Leadership Team Example

44 To Capture These Professional Development Elements
Created a rubric for evidence-based professional development:
Implementation drivers = domains
Each domain has components
Each component will be measured by a panel of external evaluators
Evaluators will likely be chosen from the US Department of Education


46 Rules for Developing Good Rubrics (Zhang & Fiore, 2011)
Process:
1. Determine what products/practices will be evaluated.
2. Define each dimension and associated indicators.
3. Determine a scale for describing the range of products/practices.
4. Write descriptors for each of the categories.
5. Pilot test with users and revise (an iterative process).
6. Train the evaluators.
7. Check inter-rater reliability (see the sketch below).
8. Finalize the rubric and share it with evaluatees.
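Step 7 calls for checking inter-rater reliability but does not prescribe a statistic. As an illustration only, here is a minimal Python sketch of two common choices, percent agreement and Cohen's kappa, applied to two raters scoring the same components on the rubric's 1-4 scale; the function names and example scores are hypothetical:

```python
from collections import Counter

def percent_agreement(rater_a, rater_b):
    """Share of items on which the two raters gave the same score."""
    assert len(rater_a) == len(rater_b)
    return sum(a == b for a, b in zip(rater_a, rater_b)) / len(rater_a)

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: observed agreement corrected for chance agreement."""
    n = len(rater_a)
    observed = percent_agreement(rater_a, rater_b)
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    # Chance agreement: probability both raters give the same score if each
    # rated at random according to their own marginal score frequencies.
    expected = sum(counts_a[s] * counts_b[s] for s in counts_a) / (n * n)
    return (observed - expected) / (1 - expected)

# Two evaluators scoring the same eight rubric components (1-4 scale).
a = [4, 3, 3, 2, 4, 1, 3, 2]
b = [4, 3, 2, 2, 4, 1, 3, 3]
print(percent_agreement(a, b))  # 0.75
print(cohens_kappa(a, b))       # ~0.65
```

Kappa discounts the agreement two raters would reach by chance, which is why it runs lower than raw percent agreement on the same data.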

47 Use of the Rubric
4 domains, each with 6 components:
Selection
Training
Coaching
Performance Assessment
Components are drawn from the National Implementation Research Network, Learning Forward (NSDC), and Guskey.
Each component of the domains will be rated from 1 to 4.
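To make the rubric's arithmetic concrete, a minimal sketch of one way such ratings could be represented and summarized; the domain names come from this slide, while the dictionary layout and function names are hypothetical:

```python
# A rating of 1-4 for each of the six components in each of the four domains.
DOMAINS = ["Selection", "Training", "Coaching", "Performance Assessment"]
COMPONENTS_PER_DOMAIN = 6

def validate(scores):
    """scores: dict mapping each domain to its six component ratings (1-4)."""
    assert set(scores) == set(DOMAINS)
    for ratings in scores.values():
        assert len(ratings) == COMPONENTS_PER_DOMAIN
        assert all(r in (1, 2, 3, 4) for r in ratings)

def domain_means(scores):
    """Average rating per domain: a simple way to spot a weak driver."""
    return {d: sum(rs) / len(rs) for d, rs in scores.items()}

example = {
    "Selection": [3, 4, 2, 3, 3, 4],
    "Training": [4, 4, 3, 3, 2, 3],
    "Coaching": [2, 2, 3, 3, 1, 2],
    "Performance Assessment": [3, 3, 4, 2, 3, 3],
}
validate(example)
print(domain_means(example))  # Coaching averages lowest in this made-up data
```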

48 Component Themes
Assigning responsibility for major professional development functions (e.g., measuring fidelity and outcomes; monitoring coaching quality)
Expectations stated for all roles and responsibilities (e.g., PD participants, trainers, coaches, school and district administrators)
Data for each stage of PD (e.g., selection, training, implementation, coaching, outcomes)

49 PD Needed for SPDG Project Personnel
Adult learning principles for coaches
More on adult learning principles for training
How to create a professional development plan
How to describe the elements of the professional development plan

50 What Initiatives will you report on? (from project feedback)
1. “If a SPDG has 1 Goal/Initiative, they report on the performance measure for it. If a SPDG has two Goals/Initiatives, they report on one. If they have three, they report on two. If they have four, they report on two. The pattern would be that SPDGs would report on half of their Goals/Initiatives if they have an even number of them and would report on 2 of 3, 3 of 5, 4 of 7, etc. for odd numbers. This is a simple method which could easily be applied.”

51 “An alternative would be to have SPDGs report on the measures for any Goals/Initiatives which involve PD which includes workshops/conferences designed not to just impart knowledge (Awareness Level of Systems Change Theory) but to implement an evidence based practice (e.g. SWPBIS, Reading Strategies, Math Strategies, etc.) …”

“I think having the OSEP Project Officers assigned to the various states negotiate with their respective state SPDG directors which of their SPDG initiatives are appropriate for this measure. This could be done each year immediately after the annual project report is submitted to OSEP by the SPDG Directors, via a phone call and exchanges. If the negotiation could be completed in the early summer, individual meetings (if necessary) could be conducted at the OSEP Project Directors’ Conference in July. This type of negotiation could provide OSEP Project Officers with information necessary for making informed decisions about each state SPDG award for the upcoming year.”

53 We will do… A combination of the three ideas:
We will only have you report on those initiatives that lead to implementation (of the practice/program you are providing training on).
If you have 1 or 2 of these initiatives, you will report on all of them. If you have 3, you will report on 2; if you have 4, on 2; if you have 5, on 3; and so on (see the sketch below).
This is all per discussion with your Project Officer.
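Since the counting rule above is purely mechanical, a one-line sketch may help; the helper name is hypothetical, and the rule is taken from this slide (1 or 2 initiatives: all of them; otherwise half, rounded up):

```python
import math

def initiatives_to_report(n):
    """Number of implementation-focused initiatives to report on:
    1 or 2 -> all of them; 3 -> 2, 4 -> 2, 5 -> 3, ... (half, rounded up)."""
    return n if n <= 2 else math.ceil(n / 2)

# initiatives_to_report(3) == 2, initiatives_to_report(4) == 2, initiatives_to_report(5) == 3
```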

54 Setting Benchmarks “Perhaps an annual target could be that at least 60% of all practices receive at least a score of 3 or 4 in the first year of a five-year SPDG funding cycle; at least 70% of all practices receive at least a score of 3 or 4 in the second year of funding; at least 80% of all practices receive at least a score of 3 or 4 in the third year of funding; at least 90% of all practices receive at least a score of 3 or 4 in the fourth and fifth year of funding.”

55 We will do… this basic idea:
1st year of funding: baseline
2nd year: 50% of components will have a score of 3 or 4
3rd year: 70% of components will have a score of 3 or 4
4th year: 80% of components will have a score of 3 or 4
5th year: 80% of components will have a score of 3 or 4 (maintenance year)
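As an illustration of how a project might check itself against these benchmarks, a small Python sketch; the target table transcribes this slide, while the function name and example ratings are hypothetical:

```python
# Year of funding -> share of rubric components that must score 3 or 4.
TARGETS = {1: None, 2: 0.50, 3: 0.70, 4: 0.80, 5: 0.80}

def meets_benchmark(ratings, funding_year):
    """ratings: all component ratings (1-4) across the four domains.
    Year 1 is baseline (no target); later years compare against TARGETS."""
    target = TARGETS[funding_year]
    if target is None:
        return None  # baseline year
    share = sum(r >= 3 for r in ratings) / len(ratings)
    return share >= target

# 18 of 24 components at 3 or 4 -> share 0.75, which meets the year-3 target of 0.70.
print(meets_benchmark([4, 3, 2, 3] * 6, funding_year=3))  # True
```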

56 Other Feedback
Have projects fill out a worksheet with descriptions of the elements of their professional development system.
We will do this, and have the panel of evaluators work from this worksheet and any supporting documents the project provides.
Provide exemplars.
We will create practice profiles for each component to demonstrate what would receive a 4, 3, 2, or 1 rating.

57 Ideas for Guidance
“It would be helpful to have a few rows [of the rubric] completed as an example with rating scores provided.” We will do this.
Other ideas?

