SPDG: Potential Applicants’ Webinar #1 Jennifer Coffey, PhD State Personnel Development Grants (SPDG) Program Lead

Housekeeping  Please mute your phones if you are not speaking (you can use *6 to mute and *6 to unmute)  Please note that these Webinars are not a substitute for reading through the entire notice and application package.  You can always reach Jennifer Coffey with questions

Webinar Schedule  1 st Webinar: Thursday, February 11, 3:00 PM-4:00 PM ET  2 nd Webinar: Thursday, February 18, 3:00 PM-4:00 PM ET  3 rd Webinar: Thursday, February 25, 3:00 PM-4:00 PM ET All Webinars will be recorded and can be found at on the homepage.

Today’s Agenda  Review the notice  Review the application package  Describe the program measures  Lessons learned

SPDG Notice Inviting Applications: Areas of Note

SPDG Application Package: Things to Remember

SPDG Program Measures: Considerations for Your Project Services and Your Evaluation Plan

Capturing Performance  Performance Measurement 1: Projects use evidence-based professional development practices to support the attainment of identified competencies.  Performance Measurement 2: Participants in SPDG professional development demonstrate improvement in implementation of SPDG-supported practices over time.

 Performance Measurement 3: Projects use SPDG professional development funds to provide follow-up activities designed to sustain the use of SPDG-supported practices. (Efficiency Measure)  Performance Measurement 4: Highly qualified special education teachers that have participated in SPDG-supported special education teacher retention activities remain as special education teachers two years after their initial participation in these activities.

Performance Measure #1  Projects use evidence-based professional development practices to support the attainment of identified competencies.

References  Fixsen and colleagues  Trivette and Dunst  Guskey  Learning Forward (formerly the National Staff Development Council)

Two Types of Evidence-Based Practices Evidence-Based Intervention Practices  Insert your SPDG initiative here (identified competencies) Evidence-Based Implementation Practices  Professional Development  Staff Competence: Selection, Training, Coaching, and Performance Assessment Drivers  Adult learning methods/principles  Evaluation

HOW?

Evidence-Based Professional Development Rubric: Rubric A, the EB-PD Rubric, can be found at:

Using Research Findings to Inform Practical Approaches to Evidence-Based Practices Carl J. Dunst, Ph.D. Carol M. Trivette, Ph.D. Orelena Hawks Puckett Institute Asheville and Morganton, North Carolina Recording and resources: Presentation prepared for a Webinar with the Knowledge Transfer Group, U.S. Department of Health and Human Services, Children’s Bureau Division of Research and Innovation, September 22.

 “Adult learning refers to a collection of theories, methods, and approaches for describing the characteristics of and conditions under which the process of learning is optimized.”

Planning
› Introduce: Engage the learner in a preview of the material, knowledge, or practice that is the focus of instruction or training
› Illustrate: Demonstrate or illustrate the use or applicability of the material, knowledge, or practice for the learner

Application
› Practice: Engage the learner in the use of the material, knowledge, or practice
› Evaluate: Engage the learner in a process of evaluating the consequence or outcome of the application of the material, knowledge, or practice

Deep Understanding
› Reflection: Engage the learner in self-assessment of his or her acquisition of knowledge and skills as a basis for identifying “next steps” in the learning process
› Mastery: Engage the learner in a process of assessing his or her experience in the context of some conceptual or practical model or framework, or some external set of standards or criteria

Donovan, M. et al. (Eds.) (1999). How people learn. Washington, DC: National Academy Press.

 The smaller the number of persons participating in a training (<20), the larger the effect sizes for the study outcomes.  The more hours of training over an extended number of sessions, the better the study outcomes.

Effect Sizes for Introducing Information to Learners: a table reporting, for each practice, the number of studies, number of effect sizes, mean effect size (d), and 95% confidence interval. Practices compared: pre-class exercises; out-of-class activities/self-instruction; classroom/workshop lectures; dramatic readings; imagery; dramatic readings/imagery. (The numeric values did not survive in this transcript.)

Effect Sizes for Self-Assessment of Learner Mastery: a table reporting the number of studies, number of effect sizes, mean effect size (d), and 95% confidence interval for standards-based assessment and self-assessment. (The numeric values did not survive in this transcript.)

 To be most effective, training needs to actively involve the learners in judging the consequences of their learning experiences (evaluate, reflection, and mastery) › Need learner participation in learning new knowledge or practice › Need learner engagement in judging his or her experience in learning and using new material

 Design a Coaching Service Delivery Plan  Develop accountability structures for coaching – Coach the Coach!  Identify ongoing professional development for coaches (Blase, Van Dyke, & Fixsen, 2010)

 Must be a transparent process  Use of multiple data sources  Fidelity of implementation should be assessed at the local, regional, and state levels  Tied to positive recognition  Information from this driver feeds back to Selection, Training, and Coaching and feeds forward to the Organization Drivers

 Assess fidelity of implementation at all levels and respond accordingly  Identify outcome measures that are… › Intermediate and longer-term › Socially valid › Technically adequate: reliable and valid › Relevant data that are feasible to gather, useful for decision making, widely shared, and reported frequently

 A Building/District Leadership and Implementation Team is formed › The Team uses feedback and data to improve Implementation Drivers  Policies and procedures are developed and revised to support the new ways of work  The Team solicits and analyzes feedback from staff and stakeholders

 Leadership analyzes feedback from staff and makes changes to alleviate barriers and facilitate implementation, revising policies and procedures to support the new way of work.


Guskey’s Five Critical Levels of Professional Development Evaluation (Reprinted from: Guskey, T. R. (2000). Evaluating professional development. Thousand Oaks, CA: Corwin Press.)

1. Participants’ reactions
› Questions addressed: Did they like it? Was their time well spent? Did the material make sense? Will it be useful? Was the leader knowledgeable and helpful? Were the refreshments fresh and tasty? Was the room the right temperature? Were the chairs comfortable?
› How information is gathered: Questionnaires administered at the end of the session; focus groups; interviews; personal learning logs
› What is measured: Initial satisfaction with the experience
› How information is used: To improve program design and delivery

2. Participants’ learning
› Questions addressed: Did participants acquire the intended knowledge and skills?
› How information is gathered: Paper-and-pencil instruments; simulations and demonstrations; participant reflections (oral and/or written); participant portfolios; case study analyses
› What is measured: New knowledge and skills of participants
› How information is used: To improve program content, format, and organization

3. Organization support and change
› Questions addressed: What was the impact on the organization? Did it affect organizational climate and procedures? Was implementation advocated, facilitated, and supported? Was the support public and overt? Were problems addressed quickly and efficiently? Were sufficient resources available?
› How information is gathered: District and school records; minutes from follow-up meetings; questionnaires; focus groups; structured interviews with participants and school or district administrators; participant portfolios
› What is measured: The organization’s advocacy, support, accommodation, facilitation, and recognition
› How information is used: To document and improve organizational support; to inform future change efforts

4. Participants’ use of new knowledge and skills
› Questions addressed: Did participants effectively apply the new knowledge and skills?
› How information is gathered: Questionnaires; structured interviews with participants and their supervisors; participant reflections (oral and/or written); participant portfolios; direct observations; video- or audiotapes
› What is measured: Degree and quality of implementation
› How information is used: To document and improve the implementation of program content

5. Student learning outcomes
› Questions addressed: What was the impact on students? Did it affect student performance or achievement? Did it influence students’ physical or emotional well-being? Are students more confident as learners? Is student attendance improving? Are dropouts decreasing?
› How information is gathered: Student records; school records; questionnaires; structured interviews with students, parents, teachers, and/or administrators; participant portfolios
› What is measured: Student learning outcomes: cognitive (performance and achievement); affective (attitudes and dispositions); psychomotor (skills and behavior)
› How information is used: To focus and improve all aspects of program design, implementation, and follow-up; to demonstrate the overall impact of professional development

 5 Domains, each with components › Selection › Training › Coaching › Performance Assessment/Data-based decision making › Facilitative administration/Systems intervention  Components from the National Implementation Research Network, Learning Forward (NSDC), Guskey, and Trivette  Each component of the domains will be rated on a numeric scale

 Assigning responsibility for major professional development functions (e.g., measuring fidelity and outcomes; monitoring coaching quality)  Expectations stated for all roles and responsibilities (e.g., PD participants, trainers, coaches, school & district administrators)  Data for each stage of PD (e.g., selection, training, implementation, coaching, outcomes)
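One way a project might organize these responsibilities and data sources is sketched below (a hypothetical structure, not a required format; the owners and data sources are illustrative only):

```python
# Hypothetical mapping of each PD stage to a responsible party and a data source.
pd_data_plan = {
    "selection":      {"owner": "project director",  "data": "applicant screening rubric"},
    "training":       {"owner": "trainers",          "data": "pre/post knowledge assessments"},
    "coaching":       {"owner": "coach coordinator", "data": "coaching logs and observation notes"},
    "implementation": {"owner": "evaluator",         "data": "fidelity checklists"},
    "outcomes":       {"owner": "evaluator",         "data": "student achievement records"},
}

for stage, plan in pd_data_plan.items():
    print(f"{stage}: {plan['owner']} -> {plan['data']}")
```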

 EB-PD should be applied to those initiatives that lead to implementation of the practice or program on which training is provided

 1st year of funding: baseline  2nd yr: 50% of components will have a score of 3 or 4  3rd yr: 70% of components will have a score of 3 or 4  4th yr: 80% of components will have a score of 3 or 4  5th yr: 80% of components will have a score of 3 or 4 (maintenance yr)
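As a worked illustration of the arithmetic behind these targets (a minimal sketch with hypothetical component ratings, not data from any real project):

```python
# Hypothetical EB-PD rubric ratings for five components; scores of 3 or 4 count
# toward the yearly target (50% in yr 2, 70% in yr 3, 80% in yrs 4-5).
ratings = {
    "selection": 4,
    "training": 3,
    "coaching": 2,
    "performance_assessment": 3,
    "facilitative_administration": 1,
}

met = sum(1 for score in ratings.values() if score >= 3)
percent = 100 * met / len(ratings)
print(f"{percent:.0f}% of components scored 3 or 4")
# 60% here: meets the yr-2 target (50%) but not the yr-3 target (70%).
```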

Performance Measurement 2: Participants in SPDG professional development demonstrate improvement in implementation of SPDG-supported practices over time.

 Fidelity of implementation is traditionally defined as “the extent to which the user’s current practice matches the ideal” (Loucks, 1983).

 Each initiative should have a fidelity measure that notes the presence or absence of the core features of the innovation/program/system that the initiative is focused on  Be clear about the name of the measure and state that it is a fidelity measure.  Choose one fidelity measure only for your program measure. › You can use other measures as project measures.

 Use implementation measures that have already been created › For example: the new RTI implementation measure from the National RTI Center › Literacy implementation: Planning and Evaluation Tool – Revised (PET-R) › PBIS: Schoolwide Evaluation Tool (SET) › Others…

 Start here: Module 6  Then go here: Module 7

The project will set its own benchmarks for professional development participants 1 year into training/assistance, 2 years in, 3 years in, and 4 years in. For example: 1-year benchmark = 40% of core features in place; 4-year benchmark = 80% of features in place. The project will then determine what percentage of participants it expects to reach this benchmark (e.g., 80% of participants). Participants could be individual teachers (if working with just a few teachers or another type of professional per school or district) or could be a school (if working on a school-wide basis, such as RTI or PBIS).
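A minimal sketch of that computation, assuming hypothetical year-1 fidelity scores (percent of core features in place per participating school):

```python
# Hypothetical year-1 fidelity scores (percent of core features in place).
fidelity_scores = {"School A": 55, "School B": 38, "School C": 72, "School D": 45}

benchmark = 40      # year-1 benchmark: 40% of core features in place
target_share = 80   # project expects 80% of participants to reach the benchmark

reached = [s for s, pct in fidelity_scores.items() if pct >= benchmark]
share = 100 * len(reached) / len(fidelity_scores)
print(f"{share:.0f}% of schools met the benchmark (target: {target_share}%)")
# Here 3 of 4 schools (75%) met it, so the 80% target was not reached.
```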

 Self-assessment is acceptable, but projects will need to sample from the group to validate the self-assessment. For example, if 15 schools were being measured, someone from the project would observe at least 3 (one-fifth) of the schools and compare their assessment with the self-assessment.  A baseline wouldn’t be necessary

Performance Measurement 3: Projects use SPDG professional development funds to provide follow-up activities designed to sustain the use of SPDG-supported practices.

 Professional development funds = a minimum of 90% of the overall budget being used for activities from subsection "a" of the notice/statute › Only for the initiatives from Program Measures 1 & 2  Follow-up activities = the professional development assistance provided following training. A list of follow-up activities that are correlated with sustainability is provided.

 Coaching/mentoring  Implementation fidelity measurement & other types of observation  Mini-workshops  Determining needs through data and providing guidance or tools to meet those needs  Maintaining data systems  Peer sharing

 Model demonstration site activities  Creating and disseminating enduring documents (procedural manuals)  Communities of Practice  TA Networks (support from internal state/local TA&D systems)  Regional PD partnerships

 Research has demonstrated that “train and hope” does not work. Instead, ongoing support is needed for those who attend training.  Despite this evidence, most professional development is one-time only, which is inefficient and largely a waste of money.

 To demonstrate that the SPDG projects are using their money efficiently  by providing the appropriate ongoing TA services  that may lead to sustained use of the SPDG-supported practices.

 For each initiative, the grantee should report the cost of activities designed to sustain learning of scientific or evidence-based instructional practices, divided by the total cost of all professional development activities carried out for the initiative.

Cost of ongoing TA ÷ Cost of all PD activities for an initiative
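For example (hypothetical figures): if an initiative spends $60,000 on follow-up activities such as coaching and fidelity measurement out of $100,000 in total professional development costs, the efficiency measure is $60,000 ÷ $100,000 = 0.60, i.e., 60% of the initiative’s PD dollars support follow-up.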

 Only report on those initiatives reported for Measures 1 & 2  Use dollar amounts in the equation. › Otherwise your measure may not be counted in the external review  Projects will set their own targets

 Consider what is happening each year of your project › Are you providing training for an entire year before you begin providing coaching? › In the final year of your project, are you no longer providing training and only providing follow-up support?

 Your initiative would help build local coaching capacity  Projects would match/modify their training with (a) coaching, (b) performance feedback, and (c) student outcomes

Performance Measurement 4: Highly qualified special education teachers that have participated in SPDG-supported special education teacher retention activities remain as special education teachers two years after their initial participation in these activities.

 Divide the number of teachers who remain in a teaching position by all teachers who received SPDG assistance.

 # of personnel retained for at least two years following participation in a SPDG teacher retention activity ÷ # of personnel participating in a SPDG activity designed to retain highly qualified special education teachers
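For example (hypothetical numbers): if 50 teachers participated in a SPDG-supported retention activity and 45 of them are still teaching special education two years later, the measure is 45 ÷ 50 = 0.90, or 90% retained.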

 This is only for projects that have teacher retention as an objective.  In-service activities only  Initial participation is defined as beginning at the time someone receives funding or services from the SPDG grant.

 If the SPDG State does not have a tracking system for highly qualified special education teachers, it will need to put an agreement in place with the individual receiving funds or services › This agreement will require information from that individual for the life of the grant

 Can be found at: pages/205 › Called “2014 Recent Program Measures Guidance (3/27/2014)” at the top of the page
