SPDG Bidders’ Webinar: The Revised SPDG Program Measures. Presented by Jennifer Coffey, July 2012.
Click the Person icon to: Raise Your Hand, Agree/Disagree, Other. Click Full Screen to maximize the presentation screen.
Webinar Ground Rules. Mute your phones: to mute or un-mute, press *6; please do not put your phones on ‘Hold.’ For webinar technical difficulties: send to . Q&A process (audio/chat): ask questions in two ways: (1) audio/voice, or (2) type your question in the Chat pod. Archive recording, PPT, and materials: to be posted to .
SPDG 2nd Annual National Meeting, March 5 & 6, 2013, Washington, DC.
OSEP Project Directors’ Conference – SPDG Program Area Meeting
Next Bidders’ Webinar. SPDG Competition Bidders’ Webinar #2: SPDG Management Plan. Date: Friday, July 13, 2012. Time: 2:00–3:30 PM Eastern / 1:00–2:30 PM Central / 12:00–1:30 PM Mountain / 11:00 AM–12:30 PM Pacific.
Program Measures Web page: pages/205.
Performance Measurement 1: Projects use evidence-based professional development practices to support the attainment of identified competencies. Performance Measurement 2: Participants in SPDG professional development demonstrate improvement in implementation of SPDG-supported practices over time.
Performance Measurement 3: Projects use SPDG professional development funds to provide follow-up activities designed to sustain the use of SPDG-supported practices. (Efficiency Measure) Performance Measurement 4: Highly qualified special education teachers that have participated in SPDG supported special education teacher retention activities remain as special education teachers two years after their initial participation in these activities.
Projects use evidence-based professional development practices to support the attainment of identified competencies.
Fixsen and colleagues; Trivette and Dunst; Guskey; Learning Forward (formerly the National Staff Development Council).
Fixsen, D. L., Naoom, S. F., Blase, K. A., Friedman, R. M., & Wallace, F. (2005). Implementation Research: A Synthesis of the Literature. Tampa, FL: University of South Florida, Louis de la Parte Florida Mental Health Institute, The National Implementation Research Network (FMHI Publication #231). The monograph can be downloaded in whole or in part for free.
Review and synthesis of the implementation research and evaluation literature (1970–2004): multi-disciplinary, multi-sector, multi-national.
Two Types of Evidence-Based Practices. (1) Evidence-Based Intervention Practices: insert your SPDG initiative here (the identified competencies). (2) Evidence-Based Implementation Practices: professional development; the staff competence drivers (Selection, Training, Coaching, and Performance Assessment); adult learning methods/principles; evaluation.
The Program Guide articulates a comprehensive set of practices for all stakeholders.
Implementation Practices: Initial Training (team-based, site-level); Practice and Implementation (Implementation Rubric facilitates self-evaluation); Ongoing Coaching; Booster Trainings (Implementation Rubric reflection on next steps).
Intervention Practices: The 5 Steps of ERIA: Data-informed Decision-making; Screening and Assessment; Progress Monitoring; Tiered Interventions and Learning Supports; Enhanced Literacy Instruction.
Program Guide articulates the PD model › introduces and illustrates › contextualizes the training › gets away from “you had to be there.” Implementation Rubric operationalizes the PD model › drives ongoing implementation › enables fidelity checks › is possible to evaluate. Everyone is on the same page. Sustainability (beyond funding, staff turnover). Scale-up (recruit new sites/districts, beyond SPDG). Diversity of approaches enabled.
HOW?
“No intervention practice, no matter what its evidence base, is likely to be learned and adopted if the methods and strategies used to teach or train students, practitioners, parents, or others are not themselves effective.”
Job or role descriptions should be explicit about expectations and accountability for all positions (e.g., teachers, coaches, staff, administrators). Use readiness measures to select at the school-building or school-district level. Use an interactive interview process. (Blase, VanDyke, & Fixsen, 2010)
Training must be: › Timely › Theory grounded (adult learning) › Skill-based. Information from Training feeds back to Selection and feeds forward to Coaching (Selection → Training → Coaching). (Blase, VanDyke, & Fixsen, 2010)
Using Research Findings to Inform Practical Approaches to Evidence-Based Practices. Carl J. Dunst, Ph.D., and Carol M. Trivette, Ph.D., Orelena Hawks Puckett Institute, Asheville and Morganton, North Carolina. Presentation prepared for a webinar with the Knowledge Transfer Group, U.S. Department of Health and Human Services, Children’s Bureau Division of Research and Innovation, September 22. Recording and resources are available online.
“Adult learning refers to a collection of theories, methods, and approaches for describing the characteristics of and conditions under which the process of learning is optimized.”
Planning › Introduce: Engage the learner in a preview of the material, knowledge, or practice that is the focus of instruction or training. › Illustrate: Demonstrate or illustrate the use or applicability of the material, knowledge, or practice for the learner.
Application › Practice: Engage the learner in the use of the material, knowledge, or practice. › Evaluate: Engage the learner in a process of evaluating the consequence or outcome of the application of the material, knowledge, or practice.
Deep Understanding › Reflection: Engage the learner in self-assessment of his or her acquisition of knowledge and skills as a basis for identifying “next steps” in the learning process. › Mastery: Engage the learner in a process of assessing his or her experience in the context of some conceptual or practical model or framework, or some external set of standards or criteria.
Note: Donovan, M. et al. (Eds.) (1999). How people learn. Washington, DC: National Academy Press.
The smaller the number of persons participating in a training (<20), the larger the effect sizes for the study outcomes. The more hours of training over an extended number of sessions, the better the study outcomes.
Effect Sizes for Introducing Information to Learners (table): practices compared were pre-class exercises, out-of-class activities/self-instruction, classroom/workshop lectures, dramatic readings, imagery, and dramatic readings/imagery; for each practice, the table reported the number of studies, number of effect sizes, mean effect size (d), and 95% confidence interval.
Effect Sizes for Self-Assessment of Learner Mastery (table): practices compared were standards-based assessment and self-assessment; for each practice, the table reported the number of studies, number of effect sizes, mean effect size (d), and 95% confidence interval.
To be most effective, training needs to actively involve the learners in judging the consequences of their learning experiences (evaluate, reflection, and mastery): › Learners need to participate in learning the new knowledge or practice. › Learners need to be engaged in judging their own experience in learning and using the new material.
› Design a Coaching Service Delivery Plan. › Develop accountability structures for Coaching: coach the coach! › Identify ongoing professional development for coaches. (Training → Coaching → Performance Assessment) (Blase, VanDyke, & Fixsen, 2010)
› Must be a transparent process. › Use of multiple data sources. › Fidelity of implementation should be assessed at the local, regional, and state levels. › Tied to positive recognition. Information from this driver feeds back to Selection, Training, and Coaching and feeds forward to the Organization Drivers.
Assess fidelity of implementation at all levels and respond accordingly. Identify outcome measures that are: › intermediate and longer-term, › socially valid, › technically adequate (reliable and valid), › relevant, feasible to gather, useful for decision making, widely shared, and reported frequently.
A Building/District Leadership and Implementation Team is formed: › The Team uses feedback and data to improve the Implementation Drivers. › Policies and procedures are developed and revised to support the new ways of work. › The Team solicits and analyzes feedback from staff and stakeholders.
Leadership analyzes feedback from staff and makes changes to alleviate barriers and facilitate implementation, revising policies and procedures to support the new way of work.
Guskey’s Five Critical Levels of Professional Development Evaluation (reprinted from: Guskey, T. R. (2000). Evaluating professional development. Thousand Oaks, CA: Corwin Press).

1. Participants’ reactions. Questions addressed: Did they like it? Was their time well spent? Did the material make sense? Will it be useful? Was the leader knowledgeable and helpful? Were the refreshments fresh and tasty? Was the room the right temperature? Were the chairs comfortable? How information is gathered: questionnaires administered at the end of the session; focus groups; interviews; personal learning logs. What is measured: initial satisfaction with the experience. How information is used: to improve program design and delivery.

2. Participants’ learning. Questions addressed: Did participants acquire the intended knowledge and skills? How information is gathered: paper-and-pencil instruments; simulations and demonstrations; participant reflections (oral and/or written); participant portfolios; case study analyses. What is measured: new knowledge and skills of participants. How information is used: to improve program content, format, and organization.

3. Organization support and change. Questions addressed: What was the impact on the organization? Did it affect organizational climate and procedures? Was implementation advocated, facilitated, and supported? Was the support public and overt? Were problems addressed quickly and efficiently? Were sufficient resources available? How information is gathered: district and school records; minutes from follow-up meetings; questionnaires; focus groups; structured interviews with participants and school or district administrators; participant portfolios. What is measured: the organization’s advocacy, support, accommodation, facilitation, and recognition. How information is used: to document and improve organizational support; to inform future change efforts.

4. Participants’ use of new knowledge and skills. Questions addressed: Did participants effectively apply the new knowledge and skills? How information is gathered: questionnaires; structured interviews with participants and their supervisors; participant reflections (oral and/or written); participant portfolios; direct observations; video- or audiotapes. What is measured: degree and quality of implementation. How information is used: to document and improve the implementation of program content.

5. Student learning outcomes. Questions addressed: What was the impact on students? Did it affect student performance or achievement? Did it influence students’ physical or emotional well-being? Are students more confident as learners? Is student attendance improving? Are dropouts decreasing? How information is gathered: student records; school records; questionnaires; structured interviews with students, parents, teachers, and/or administrators; participant portfolios. What is measured: student learning outcomes: cognitive (performance and achievement), affective (attitudes and dispositions), psychomotor (skills and behavior). How information is used: to focus and improve all aspects of program design, implementation, and follow-up; to demonstrate the overall impact of professional development.
SPDG Professional Development Rubric: 5 domains, each with components: Selection; Training; Coaching; Performance Assessment/Data-based decision making; Facilitative administration/Systems intervention. Components are drawn from the National Implementation Research Network, Learning Forward (NSDC), Guskey, and Trivette. Each component of the domains will be rated (the grantee benchmarks below refer to component scores of 3 or 4).
Component themes: › Assigning responsibility for major professional development functions (e.g., measuring fidelity and outcomes; monitoring coaching quality). › Expectations stated for all roles and responsibilities (e.g., PD participants, trainers, coaches, school and district administrators). › Data for each stage of PD (e.g., selection, training, implementation, coaching, outcomes).
SPDG Initiatives and Evidence-Based Professional Development: evidence-based PD should be applied to those initiatives that lead to implementation (i.e., implementation of the practice or program on which training is being provided).
Grantee Benchmarks: › 1st year of funding: baseline. › 2nd year: 50% of components will have a score of 3 or 4. › 3rd year: 70% of components will have a score of 3 or 4. › 4th year: 80% of components will have a score of 3 or 4. › 5th year: 80% of components will have a score of 3 or 4 (maintenance year).
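As a rough illustration only, a minimal sketch of how a project might tally its rubric ratings against these benchmarks is shown below; the component names and scores are invented and do not come from the SPDG rubric itself.

```python
# Minimal sketch: checking SPDG PD rubric ratings against the grantee benchmarks.
# Component names and ratings below are hypothetical examples.

BENCHMARKS = {2: 0.50, 3: 0.70, 4: 0.80, 5: 0.80}  # project year -> required share of 3s and 4s

def share_meeting_criterion(scores):
    """Return the fraction of rubric components rated 3 or 4."""
    return sum(1 for s in scores.values() if s >= 3) / len(scores)

def meets_benchmark(scores, project_year):
    """Year 1 is baseline only; later years are compared to the benchmark table."""
    if project_year == 1:
        return True  # baseline year: no benchmark to meet
    return share_meeting_criterion(scores) >= BENCHMARKS[project_year]

# Example (hypothetical component ratings):
rubric_scores = {"selection: roles defined": 4, "training: adult learning": 3,
                 "coaching: service delivery plan": 2,
                 "performance assessment: fidelity data": 3,
                 "facilitative administration": 1}
print(round(share_meeting_criterion(rubric_scores), 2))  # 0.6
print(meets_benchmark(rubric_scores, project_year=2))    # True (0.6 >= 0.5)
```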
Performance Measurement 2: Participants in SPDG professional development demonstrate improvement in implementation of SPDG-supported practices over time.
Fidelity of implementation is traditionally defined as “the extent to which the user’s current practice matches the ideal” (Loucks, 1983).
Dusenbury, Brannigan, Falco, & Hansen (2003); Dane & Schneider (1998); O’Donnell (2005); Blase, “Innovation Fluency” presentation; Mowbray, Holter, Teague, & Bybee (2003).
“All five studies consistently showed statistically significantly higher outcomes when the program was implemented with greater fidelity. The studies reviewed here suggest that fidelity of implementation is more probable when an intervention manual is in place that clearly defines the critical components of the intervention and articulates a theory. Distinctions should be made between measuring fidelity to the structural components of a curriculum intervention and fidelity to the processes that guide its design.”
EVALUATION DRIVES ERIA’S EVIDENCE-BASED PRACTICES. The Program Guide, a 16-page booklet, explicitly addresses both implementation and intervention practices to guide the design of a site-based program. The Implementation Rubric is a 10-item instrument that provides a framework for trainers, coaches, site team members, and teachers to evaluate and discuss implementation, fidelity, and next steps. Additional tools include: end-of-event training surveys and three-month follow-ups; feedback and support from cohort coaches and the site team; fidelity observations; and student data.
ERIA’S EVIDENCE-BASED PRACTICES. The Program Guide articulates a comprehensive set of practices for all stakeholders.
Implementation Practices: Initial Training (team-based, site-level); Practice and Implementation (Implementation Rubric facilitates self-evaluation); Ongoing Coaching; Booster Trainings (Implementation Rubric reflection on next steps).
Intervention Practices: The 5 Steps of ERIA: Data-informed Decision-making; Screening and Assessment; Progress Monitoring; Tiered Interventions and Learning Supports; Enhanced Literacy Instruction.
Each initiative should have a fidelity measure that notes the presence or absence of the core features of the innovation/program/system that the initiative is focused on.
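One way to picture such a measure is a simple presence/absence checklist; the sketch below uses invented feature names, whereas an actual measure would list the core features of the specific program the initiative targets.

```python
# Minimal sketch of a presence/absence fidelity checklist for one initiative.
# Feature names and the observation are hypothetical examples.

CORE_FEATURES = ["universal screening", "progress monitoring",
                 "tiered interventions", "data-based decision rules",
                 "team meeting schedule"]

def fidelity_score(observed):
    """Fraction of core features observed to be in place (0.0 to 1.0)."""
    present = sum(1 for feature in CORE_FEATURES if observed.get(feature, False))
    return present / len(CORE_FEATURES)

# Example observation from one school site:
site_observation = {"universal screening": True, "progress monitoring": True,
                    "tiered interventions": False, "data-based decision rules": True,
                    "team meeting schedule": False}
print(fidelity_score(site_observation))  # 0.6 (3 of 5 core features in place)
```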
Use implementation measures that have already been created. › For example: the new RTI implementation measure from the National RTI Center. › Literacy implementation: Planning and Evaluation Tool – Revised (PET-R). › PBIS: Schoolwide Evaluation Tool (SET). › Others…
To develop fidelity criteria, researchers often reported starting with a curriculum profile or analysis that outlined the critical components of the intervention along with an indication of the range of variations for acceptable use. The researcher or developer then outlined acceptable ranges of variation (Songer & Gotwals, 2005). A component checklist was then developed to record fidelity to these components (Hall & Loucks, 1977).
What is “it”? Operationalize (verb): to define a concept or variable so that it can be measured or expressed quantitatively (Webster’s New Millennium Dictionary of English, Preview Edition, v 0.9.7, Lexico Publishing Group, LLC). The “it” must be operationalized whether it is: » an evidence-based practice or program, » a best-practice initiative or new framework, » a systems change initiative. Practice Profiles help operationalize practice, program, and systems features.
Searching for “It”: Research findings, materials, manuals, and journal articles do not necessarily provide clarity around core intervention elements. Current and new evidence-based practices, frameworks, and programs will have a range of operational specificity. Developing clarity around the “it” is critical.
Defining “it” through the development and use of Practice Profiles: › Guiding Principles identified. › Critical Components articulated. › For each critical component: the gold standard is identified; acceptable variations in practice are identified; ineffective and undesirable practices are identified. (Hall and Hord, 2010, Implementing Change: Patterns, Principles, and Potholes, 3rd Edition; adapted from work of the Iowa Area Education Agency.)
Resources for Building Practice Profiles: national centers; experts in your state; national purveyors; manuals and materials; implementing districts and schools; other states; consensus building in your state.
Example: Problem-Solving Practice Profiles in an RtI Framework. Resource: Professional Practices in Problem Solving: Benchmarks and Innovation Configurations, Iowa Area Education Agency Directors of Special Education, 1994.
Practice Profiles: each Critical Component is a heading, and each level of implementation specifies the activities necessary to operationalize that Critical Component. The profile is laid out as a table: for each Critical Component (e.g., Critical Component 1), the columns describe implementer behavior under Ideal Implementation, Acceptable Variation, Unacceptable Variation, and Drastic Mutation. (Hall and Hord, 2010, Implementing Change: Patterns, Principles, and Potholes, 3rd Edition; adapted from work of the Iowa Area Education Agency.)
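To make that layout concrete, here is a small sketch of one practice profile entry; the component name borrows the parent-involvement example shown on the following slides, and the level descriptions are invented for illustration rather than taken from an actual state profile.

```python
# Minimal sketch of one practice profile entry in the Hall & Hord style.
# The level descriptions below are invented examples, not real profile content.

practice_profile = {
    "Critical Component 1: Parent Involvement": {
        "ideal implementation": "Parents are active members of the problem-solving team at every meeting",
        "acceptable variation": "Parents are informed of decisions and invited to key meetings",
        "unacceptable variation": "Parents are notified only after decisions are made",
        "drastic mutation": "Parents are not contacted at all",
    }
}

def describe_level(component, observed_level):
    """Look up the behavioral description for an observed implementation level."""
    return f"{observed_level}: {practice_profile[component][observed_level]}"

print(describe_level("Critical Component 1: Parent Involvement", "acceptable variation"))
```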
Professional Problem Solving: 9 Critical Components: Parent Involvement; Problem Statement; Systematic Data Collection; Problem Analysis; Goal Development; Intervention Plan Development; Intervention Plan Implementation; Progress Monitoring; Decision Making. (Professional Practices in Problem Solving: Benchmarks and Innovation Configurations, Iowa Area Education Agency Directors of Special Education, 1994)
Professional Problem Solving: Parent Involvement as a Critical Component. (Professional Practices in Problem Solving: Benchmarks and Innovation Configurations, Iowa Area Education Agency Directors of Special Education, 1994)
Michigan’s Practice Profile: Building Leadership Team Example.
CALIFORNIA’S EVALUATION TOOL: IMPLEMENTATION RUBRIC. The 10 items are mostly focused on intervention practices, with site-team and fidelity items. The overall tool, and the process for how the rubric is used, drives the implementation practices. Sites self-evaluate and reflect on learning and implementation; results are shared with coaches and trainers to guide activities. The rubric evaluates the fidelity of implementation of both the PD model and the interventions. The former 26-item, 3-point ERIA Checklist lacked the specificity to be meaningful and useful.
IMPLEMENTATION RUBRIC, ADAPTED FROM “GOAL ATTAINMENT SCALES.” Amy Gaumer Erickson and Monica Ballay presented “goal attainment scales” on a SIG Network webinar. The rubric explicitly describes 5 implementation levels for each of 10 items: levels 1, 2, and 3 reflect the “Not started,” “In progress,” and “Achieved” implementation levels of the former checklist; levels 4 and 5 detail concrete steps toward optimal implementation, beyond the basics. Each implementation level for each item is explicitly described, building more meaning into the tool than our previous checklist format allowed.
IMPLEMENTATION RUBRIC EXCEL FILE: MULTI-YEAR TRACKING AND AUTOMATED REPORTS. The same file is used in all three years of ERIA, reporting both the trend and the most recent entries.
ERIA on the Web. Contacts: Li Walter; Alan Wood, (707).
Observations may be crucial because teachers are known to be biased in their reports (Hansen & McNeal, 1999). Given the frequency with which adaptations are observed in research and practice, program developers need to anticipate how and when teachers will modify programs and develop guidelines and recommendations to ensure program goals are met (Dusenbury, Brannigan, Hansen, Walsh, & Falco, 2005).
The project will set its own benchmarks for professional development participants 1 year into training/assistance, 2 years in, 3 years in, and 4 years in. For example: 1-year benchmark = 40% of core features in place; 4-year benchmark = 80% of features in place. The project will then determine what percentage of participants it expects to reach each benchmark (e.g., 80% of participants). a. Participants could be individual teachers (if working with just a few teachers or another type of professional per school or district) or could be a school (if working on a school-wide basis, such as RTI or PBIS).
Self-assessment is acceptable, but projects will need to sample from the group to validate the self-assessments. a. For example, if 15 schools were being measured, someone from the project would observe at least 3 (one-fifth) of the schools and compare their assessment with the self-assessment. A baseline wouldn’t be necessary.
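As one hedged illustration of this bookkeeping, the sketch below tallies benchmark attainment across participating schools and draws a one-in-five validation sample; the school names, scores, and the 0.40 benchmark are all invented.

```python
import random

# Minimal sketch: Measure 2 bookkeeping for a project measuring schools.
# All names, scores, and thresholds below are hypothetical examples.

self_assessments = {   # school -> share of core features reported in place
    "School A": 0.55, "School B": 0.35, "School C": 0.80,
    "School D": 0.45, "School E": 0.60,
}

def percent_meeting_benchmark(scores, benchmark):
    """Share of participants (here, schools) at or above the yearly benchmark."""
    return sum(1 for s in scores.values() if s >= benchmark) / len(scores)

def validation_sample(scores, fraction=0.2, seed=0):
    """Pick roughly one-fifth of sites for project-staff observation,
    so self-assessments can be compared against an independent rating."""
    k = max(1, round(len(scores) * fraction))
    return random.Random(seed).sample(sorted(scores), k)

print(percent_meeting_benchmark(self_assessments, benchmark=0.40))  # 0.8
print(validation_sample(self_assessments))                          # e.g. ['School B']
```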
Performance Measurement 3: Projects use SPDG professional development funds to provide follow-up activities designed to sustain the use of SPDG-supported practices. (Efficiency Measure)
Professional development funds = a minimum of 90% of the overall budget being used for activities from subsection “a” of the notice/statute. › This applies only to the initiatives reported on for Program Measures 1 & 2. Follow-up activities = the professional development assistance provided following training. A list of follow-up activities that are correlated with sustainability will be provided.
Follow-up activities: › Coaching/mentoring* › Implementation fidelity measurement and other types of observation* › Mini-workshops* › Determining needs through data and providing guidance or tools to meet those needs* › Maintaining data systems* › Peer sharing*
› Model demonstration site activities › Creating and disseminating enduring documents (procedural manuals)* › Communities of Practice › TA Networks (support from internal state/local TA&D systems) › Regional PD partnerships* (* = evidence-based)
Research has demonstrated that “train and hope” does not work. Instead, ongoing support is needed for those who attend training. Despite this evidence, most professional development is one-time only, which is inefficient and largely a waste of money.
The purpose of this measure is to demonstrate that SPDG projects are using their money efficiently by providing the appropriate ongoing TA services that may lead to sustained use of the SPDG-supported practices.
For each initiative, the grantee should report the cost of activities designed to sustain learning of scientific or evidence-based instructional practices, divided by the total cost of all professional development activities carried out for the initiative.
Efficiency measure = Cost of ongoing TA ÷ Cost of all PD activities for the initiative.
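Put as a small calculation, the ratio for one initiative looks like the sketch below; the dollar figures are made up purely for illustration.

```python
# Minimal sketch of the Measure 3 efficiency ratio for one initiative.
# Dollar figures are invented for illustration only.

def efficiency_measure(ongoing_ta_cost, total_pd_cost):
    """Share of an initiative's PD spending that went to follow-up/ongoing TA."""
    return ongoing_ta_cost / total_pd_cost

# Example: $120,000 of coaching, fidelity observation, and other follow-up
# out of $400,000 of total PD spending for the initiative.
print(efficiency_measure(120_000, 400_000))  # 0.3 (30% of PD funds on follow-up)
```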
Projects only need to report on the initiatives they report on for Measures 1 & 2. Projects will set their own targets.
Consider what is happening in each year of your project: › Are you providing training for an entire year before you begin providing coaching? › In the final year of your project, are you no longer providing training and only providing follow-up support?
Your initiative would help build local coaching capacity. Projects would match/modify their training with (a) coaching, (b) performance feedback, and (c) student outcomes.
Performance Measurement 4: Highly qualified special education teachers that have participated in SPDG supported special education teacher retention activities remain as special education teachers two years after their initial participation in these activities.
Divide the number of teachers who remain in a teaching position by the number of all teachers who received SPDG assistance.
Measure = (# of personnel retained for at least two years following participation in an SPDG teacher retention activity) ÷ (# of personnel participating in an SPDG activity designed to retain highly qualified special education teachers).
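As a small illustrative calculation, the same fraction can be expressed over participating teachers as in the sketch below; the counts are hypothetical.

```python
# Minimal sketch of the Measure 4 retention calculation.
# Counts are hypothetical; a real project would pull them from its tracking
# system or from agreements with individual teachers.

def retention_rate(retained_two_years, total_participants):
    """Share of participating highly qualified special education teachers
    still teaching special education two years after initial participation."""
    return retained_two_years / total_participants

# Example: 45 of 60 participating teachers are still in special education
# teaching positions two years later.
print(retention_rate(45, 60))  # 0.75
```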
This is only for projects that have teacher retention as an objective.
This applies to in-service teachers only. Initial participation is defined as beginning at the time someone receives funding or services from the SPDG grant.
If the SPDG state does not have a tracking system for highly qualified special education teachers, it will need to put an agreement in place with each individual receiving funds or services. › This agreement will require information from that individual for the life of the grant.
The Program Measures web page (pages/205) has relevant presentations, tools, links, etc. › There is a tab for it at the top of the SIGnetwork homepage.