SPDG New Grantee Orientation

Presentation transcript:

SPDG New Grantee Orientation September 15, 2016 September 22, 2016 Jennifer Coffey & John Lind

Today’s Agenda
- Introductions
- Resources
- Program Measures

Congratulations! Welcome to the SIGnetwork
Introductions:
- State & your position in the grant
- Previous grant? New to the SPDG?
- Describe your initiatives and the populations you are serving
- What would success look like for these initiatives and for your grant generally?
- What would you like to learn from other SPDG grantees?

Opportunities to learn from other SPDGs
- SPDG National Meeting (Oct 11 & 12)
- Directors’ Webinars
- Communities: Coaching, Evaluators, SSIP, Family Engagement, Mentoring
- Sending questions to the SIGnetwork listserv
- OSEP Project Directors’ Conference (July 2018)

The SIGnetwork Website
- Topical resources (e.g., coaching tools)
- SPDG contacts
- SPDG project abstracts
- Recorded Webinars
- Program Measure exemplars and how-to
- Grant management support
- Events calendar

Annual Performance Report: http://www.signetwork.org/content_pages/10#apcr
- Provides 2 Webinars: (1) Program Measures how-to; (2) Annual Performance Report how-to
- Gives you the continuation reporting package
Program Measures how-to: http://www.signetwork.org/content_pages/205
- Provides step-by-step directions, examples, exemplars, Webinars
- Has the document you must fill out for Program Measure 1 (EB-PD Worksheet)

Recommendation
- Review the Webinars
- Then create a draft APR and send it to your project officer to review with you
- This will help you focus your data collection in this first year
- Your evaluator may be able to help

Speaking of evaluation…
The Center to Improve Project Performance (CIPP) resource on working with a 3rd party evaluator: http://signetwork.org/content_page_assets/content_page_177/Guidelines%20for%20Working%20with%20Third-Party%20Evaluators--Section%20508%20Compliant.pdf
Evaluation resource page: http://signetwork.org/content_pages/177

Other Resources: www.OSEPIDEASThatWork.org
- Federal Resources for Stakeholders: Looking for guidance about specific OSEP priorities? Access this set of resources to learn more about OSEP recommendations relevant to all special education stakeholders.
- Resources for Grantees: This section is designed to help OSEP Grantees accomplish their goals, including guidance on technical assistance (TA), collaboration tools, and resources and funding opportunities for doctoral students.
- Find a Center or Grant

Regional Education Laboratories (RELs): http://ies.ed.gov/ncee/edlabs/
- Webinars
- Ability to do research for SEAs
- Guidance documents
- Tools

Uniform Guidance: http://www2.ed.gov/policy/fund/guid/uniform-guidance/index.html
- Webinar
- FAQs

New G5 Hotline Self-Help Portal
On Aug. 3, 2015, the G5 Hotline will debut a Self-Help Portal to all external users of G5, the U.S. Department of Education's Grants Management System. The Self-Help Portal, accessed through the EDCAPS.FORCE.COM web address from any location at any time, allows G5 users to:
- Submit and track their own tickets
- Access knowledge articles 24/7 to find answers to issues and questions about the G5 application

The SPDG Program/GPRA Measures: An Overview
Program Measures Web page: http://www.signetwork.org/content_pages/205

Capturing Performance Performance Measurement 1: Projects use evidence-based professional development practices to support the attainment of identified competencies. Performance Measurement 2: Participants in SPDG professional development demonstrate improvement in implementation of SPDG-supported practices over time.

Performance Measurement 3: Projects use SPDG professional development funds to provide follow-up activities designed to sustain the use of SPDG-supported practices. (Efficiency Measure)

Performance Measurement 4: Highly qualified special education teachers who have participated in SPDG-supported special education teacher retention activities remain as special education teachers two years after their initial participation in these activities.

Performance Measure #1 Projects use evidence-based professional development practices to support the attainment of identified competencies.

Best Practices in Selection
- Job or role descriptions should be explicit about expectations and accountability for all positions (e.g., teachers, coaches, staff, administrators)
- Use readiness measures to select at the school-building or school-district level
- Use an interactive interview process
(Blase, VanDyke, & Fixsen, 2010)

Best Practices in Training
Training must be:
- Timely
- Theory grounded (adult learning)
- Skill-based
Information from Training feeds back to Selection and feeds forward to Coaching (Selection → Training → Coaching).
(Blase, VanDyke, & Fixsen, 2010)

Using Research Findings to Inform Practical Approaches to Evidence-Based Practices Carl J. Dunst, Ph.D., Carol M. Trivette, Ph.D. Orelena Hawks Puckett Institute Asheville and Morganton, North Carolina Recording and resources: http://www.signetwork.org/event_calendar/events/396 Presentation Prepared for a Webinar with the Knowledge Transfer Group, U.S. Department of Health and Human Services, Children’s Bureau Division of Research and Innovation, September 22, 2009

“Adult learning refers to a collection of theories, methods, and approaches for describing the characteristics of and conditions under which the process of learning is optimized.”

Six Characteristics Identified in How People Learn(a) Were Used to Code and Evaluate the Adult Learning Methods

Planning
- Introduce: Engage the learner in a preview of the material, knowledge, or practice that is the focus of instruction or training
- Illustrate: Demonstrate or illustrate the use or applicability of the material, knowledge, or practice for the learner

Application
- Practice: Engage the learner in the use of the material, knowledge, or practice
- Evaluate: Engage the learner in a process of evaluating the consequence or outcome of the application of the material, knowledge, or practice

Deep Understanding
- Reflection: Engage the learner in self-assessment of his or her acquisition of knowledge and skills as a basis for identifying “next steps” in the learning process
- Mastery: Engage the learner in a process of assessing his or her experience in the context of some conceptual or practical model or framework, or some external set of standards or criteria

(a) Donovan, M. et al. (Eds.) (1999). How people learn. Washington, DC: National Academy Press.

Additional Translational Synthesis Findings The smaller the number of persons participating in a training (<20), the larger the effect sizes for the study outcomes. The more hours of training over an extended number of sessions, the better the study outcomes.

Effect Sizes for Introducing Information to Learners

Practices                                  No. Studies  No. Effect Sizes  Mean d  95% CI
Pre-class exercises                        9            …                 1.02    .63-1.41
Out of class activities/self-instruction   12           20                .76     .44-1.09
Classroom/workshop lectures                26           108               .68     .47-.89
Dramatic readings                          18           40                .35     .13-.57
Imagery                                    7            …                 .34     .08-.59
Dramatic readings/imagery                  4            11                .15     -.33-.62

Effect Sizes for Self-Assessment of Learner Mastery

Practices                   No. Studies  No. Effect Sizes  Mean d  95% CI
Standards-based assessment  13           44                .76     .42-1.10
Self-assessment             16           29                .67     .39-.95

Summary of Training Findings
To be most effective, training needs to:
- Actively involve the learners in judging the consequences of their learning experiences (evaluate, reflection, & mastery)
- Include learner participation in learning new knowledge or practice
- Engage the learner in judging his or her experience in learning and using new material

Best Practices in Coaching
- Design a Coaching Service Delivery Plan
- Develop accountability structures for Coaching: Coach the Coach!
- Identify ongoing professional development for coaches
Coaching feeds back to Training and feeds forward to Performance Assessment (Training → Coaching → Performance Assessment).
(Blase, VanDyke, & Fixsen, 2010)

Best Practices in Performance Assessment
- Must be a transparent process
- Use of multiple data sources
- Fidelity of implementation should be assessed at the local, regional, and state levels
- Tied to positive recognition
Information from this driver feeds back to Selection, Training, and Coaching and feeds forward to the Organization Drivers.

Best Practices in Decision Support Data Systems
- Assess fidelity of implementation at all levels and respond accordingly
- Identify outcome measures that are:
  - Intermediate and longer-term
  - Socially valid
  - Technically adequate: reliable and valid
- Gather relevant data that are feasible to collect, useful for decision making, widely shared, and reported frequently

Best Practices in Facilitative Administration
- A Building/District Leadership and Implementation Team is formed
- The Team uses feedback and data to improve Implementation Drivers
- Policies and procedures are developed and revised to support the new ways of work
- The Team solicits and analyzes feedback from staff and stakeholders

Best Practices in Systems Interventions
- Leadership analyzes feedback from staff and makes changes to alleviate barriers and facilitate implementation
- Policies and procedures are revised to support the new ways of work

Evaluating Professional Development

Guskey’s Five Critical Levels of Professional Development Evaluation
Reprinted from: Guskey, T. R. (2000). Evaluating professional development (pp. 79-81). Thousand Oaks, CA: Corwin Press.

1. Participants’ reactions
   Questions addressed: Did they like it? Was their time well spent? Did the material make sense? Will it be useful? Was the leader knowledgeable & helpful? Were the refreshments fresh & tasty? Was the room the right temperature? Were the chairs comfortable?
   How information is gathered: Questionnaires administered at the end of the session; focus groups; interviews; personal learning logs
   What is measured: Initial satisfaction with the experience
   How information is used: To improve program design and delivery

2. Participants’ learning
   Questions addressed: Did participants acquire the intended knowledge and skills?
   How information is gathered: Paper-and-pencil instruments; simulations & demonstrations; participant reflections (oral and/or written); participant portfolios; case study analyses
   What is measured: New knowledge and skills of participants
   How information is used: To improve program content, format, and organization

3. Organization support and change
   Questions addressed: What was the impact on the organization? Did it affect organizational climate and procedures? Was implementation advocated, facilitated, and supported? Was the support public and overt? Were problems addressed quickly and efficiently? Were sufficient resources available?
   How information is gathered: District and school records; minutes from follow-up meetings; questionnaires; structured interviews with participants and school or district administrators
   What is measured: The organization’s advocacy, support, accommodation, facilitation, and recognition
   How information is used: To document and improve organizational support; to inform future change efforts

4. Participants’ use of new knowledge and skills
   Questions addressed: Did participants effectively apply the new knowledge and skills?
   How information is gathered: Questionnaires; structured interviews with participants and their supervisors; participant reflection (oral and/or written); participant portfolios; direct observations; video- or audiotapes
   What is measured: Degree and quality of implementation
   How information is used: To document and improve the implementation of program content

5. Student learning outcomes
   Questions addressed: What was the impact on students? Did it affect student performance or achievement? Did it influence students’ physical or emotional well-being? Are students more confident as learners? Is student attendance improving? Are dropouts decreasing?
   How information is gathered: Student records; school records; structured interviews with students, parents, teachers, and/or administrators
   What is measured: Student learning outcomes: cognitive (performance & achievement); affective (attitudes and dispositions); psychomotor (skills & behavior)
   How information is used: To focus and improve all aspects of program design, implementation, and follow-up; to demonstrate the overall impact of professional development


SPDG Professional Development Rubric
5 Domains, each with components:
- Selection
- Training
- Coaching
- Performance Assessment/Data-based decision making
- Facilitative administration/Systems intervention
Components come from the National Implementation Research Network, Learning Forward (NSDC), Guskey, and Trivette. Each component of the domains will be rated from 1 to 4.

Component Themes
- Assigning responsibility for major professional development functions (e.g., measuring fidelity and outcomes; monitoring coaching quality)
- Expectations stated for all roles and responsibilities (e.g., PD participants, trainers, coaches, school & district administrators)
- Data for each stage of PD (e.g., selection, training, implementation, coaching, outcomes)

SPDG Initiatives and Evidence-Based Professional Development
EB-PD should be applied to those initiatives that lead to implementation of the practice or program on which training is provided.

Grantee Benchmarks
- 1st year of funding: baseline
- 2nd year: 50% of components will have a score of 3 or 4
- 3rd year: 70% of components will have a score of 3 or 4
- 4th year: 80% of components will have a score of 3 or 4
- 5th year: 80% of components will have a score of 3 or 4 (maintenance year)
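The yearly benchmarks above amount to a simple threshold check on rubric ratings. As a hypothetical illustration (the function name, benchmark table, and scores below are invented for this sketch, not prescribed by OSEP), a project evaluator might tally them like this:

```python
# Illustrative sketch only: check SPDG rubric component scores against
# the yearly grantee benchmarks. Names and data are hypothetical.

# Share of components that must score 3 or 4, by grant year
# (year 1 is baseline, so no target applies).
BENCHMARKS = {2: 0.50, 3: 0.70, 4: 0.80, 5: 0.80}

def meets_benchmark(scores, year):
    """scores: one 1-4 rubric rating per component."""
    if year == 1:
        return True  # baseline year, no target
    share = sum(1 for s in scores if s >= 3) / len(scores)
    return share >= BENCHMARKS[year]

# Example: 7 of 10 components rated 3 or 4 in year 3 (70% target).
print(meets_benchmark([4, 3, 3, 2, 4, 3, 2, 3, 4, 1], 3))  # True
```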

Implementation Fidelity
Fidelity of implementation is traditionally defined as “the extent to which the user’s current practice matches the ideal” (Loucks, 1983).

Measure 2 Methodology
- Each initiative should have a fidelity measure that notes the presence or absence of the core features of the innovation/program/system the initiative is focused on
- Be clear about the name of the measure, and state that it is a fidelity measure
- Choose one fidelity measure only for your program measure; you can use other measures as project measures

When possible, use implementation measures that have already been created. For example:
- The new RTI implementation measure from the National RTI Center
- Literacy implementation: Planning and Evaluation Tool - Revised (PET-R)
- PBIS: Schoolwide Evaluation Tool (SET)
- Others…

Developing a Fidelity Measure (O’Donnell, 2005) To develop fidelity criteria, researchers often reported starting with a curriculum profile or analysis that outlined the critical components of the intervention along with an indication of the range of variations for acceptable use. The researcher or developer then outlined acceptable ranges of variation (Songer & Gotwals, 2005). A component checklist was then developed to record fidelity to these components (Hall & Loucks, 1977).

Practice Profile: Defining “It” Through the Development and Use of Practice Profiles
- Guiding Principles identified
- Critical Components articulated
- For each critical component:
  - Identified gold standard
  - Identified acceptable variations in practice
  - Identified ineffective and undesirable practices
Sources: Hall & Hord (2010), Implementing Change: Patterns, Principles, and Potholes (3rd edition); adapted from work of the Iowa Area Education Agency.

Performance Measurement 3: Projects use SPDG professional development funds to provide follow-up activities designed to sustain the use of SPDG-supported practices.

Operational definition of terms
- Professional development funds = a minimum of 90% of the overall budget being used for activities from subsection “a” of the notice/statute; only for the initiatives from Program Measures 1 & 2
- Follow-up activities = the professional development assistance provided following training. A list of follow-up activities that are correlated with sustainability is provided.

Ongoing TA features:
- Coaching/mentoring
- Implementation fidelity measurement & other types of observation
- Mini-workshops
- Determining needs through data and providing guidance or tools to meet those needs
- Maintaining data systems
- Peer sharing

Ongoing TA (cont.):
- Model demonstration site activities
- Creating and disseminating enduring documents (e.g., procedural manuals)
- Communities of Practice
- TA Networks (support from internal state/local TA&D systems)
- Regional PD partnerships

Methodology
For each initiative, the grantee should report the cost of activities designed to sustain learning of scientific or evidence-based instructional practices, divided by the total cost of all professional development activities carried out for the initiative.

Equation:

    Cost of ongoing TA
    -------------------------------------------
    Cost of all PD activities for an initiative

Methodology (cont.)
- You only need to report on the initiatives you report on for Measures 1 & 2
- Use dollar amounts in the equation; otherwise your measure may not be counted in the external review
- Projects will set their own targets
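Putting the equation and the dollar-amount guidance together, the Measure 3 calculation for one initiative is a simple cost ratio. A minimal sketch (the function name and dollar figures are illustrative, not from an actual project):

```python
# Illustrative sketch of the Measure 3 efficiency calculation:
# dollars spent on ongoing TA (follow-up) divided by all PD dollars
# for the initiative. Figures are hypothetical.

def measure_3_ratio(ongoing_ta_cost, total_pd_cost):
    """Share of an initiative's PD spending devoted to follow-up activities."""
    return ongoing_ta_cost / total_pd_cost

# Example: $45,000 of a $100,000 PD budget went to coaching,
# peer sharing, and other follow-up activities.
print(measure_3_ratio(45_000, 100_000))  # 0.45
```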

Setting Targets
Consider what is happening each year of your project:
- Are you providing training for an entire year before you begin providing coaching?
- In the final year of your project, are you no longer providing training and only providing follow-up support?

Optimally:
- Your initiative would help build local coaching capacity
- Projects would match/modify their training with (a) coaching, (b) performance feedback, and (c) student outcomes

Performance Measurement 4: Highly qualified special education teachers who have participated in SPDG-supported special education teacher retention activities remain as special education teachers two years after their initial participation in these activities.

Methodology Divide the number of teachers who remain in a teaching position by all teachers who received SPDG assistance.

Equation:

    # of personnel retained for at least 2 yrs following participation in a SPDG teacher retention activity
    -------------------------------------------------------------------------------------------------------
    # of personnel participating in a SPDG activity designed to retain highly qualified special education teachers
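The retention equation above reduces to a head count ratio. Sketched below with hypothetical teacher IDs and an invented function name, purely for illustration:

```python
# Illustrative sketch of the Measure 4 retention calculation.
# Teacher IDs are hypothetical placeholders.

def measure_4_ratio(participants, retained):
    """participants: teachers in SPDG retention activities;
    retained: those still teaching special education two years later."""
    return len(retained & participants) / len(participants)

participants = {"t01", "t02", "t03", "t04", "t05"}
retained = {"t01", "t02", "t04", "t05"}
print(measure_4_ratio(participants, retained))  # 0.8
```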

Note
- This is only for projects that have teacher retention as an objective
- It applies to in-service teachers only
- Initial participation is defined as beginning at the time someone receives funding or services from the SPDG grant

Considerations
If the SPDG State does not have a tracking system for highly qualified special education teachers, the project will need to put an agreement in place with each individual receiving funds or services. This agreement will require information from that individual for the life of the grant.

Program Measures Webpage
Has relevant presentations, tools, links, etc.: http://www.signetwork.org/content_pages/205
There is a tab for it at the top of the SIGnetwork homepage.

Questions for us? To contact us:
John Lind: jlind@uoregon.edu
Jennifer Coffey: Jennifer.Coffey@ed.gov
Your Project Officer…