
1 2018 OSEP Project Directors’ Conference
OSEP Disclaimer
DISCLAIMER: The contents of this presentation were developed by the presenters for the 2018 Project Directors’ Conference. However, these contents do not necessarily represent the policy of the Department of Education, and you should not assume endorsement by the Federal Government. (Authority: 20 U.S.C. 1221e-3 and 3474)

2 Evaluating Implementation and Improvement: Getting Better Together
Moderator: Leslie Fox, OSEP Panelists: Wendy Sawtell, Bill Huennekens, David Merves

3 Colorado SSIP – Implementation & Improvement
Wendy Sawtell, State Systemic Improvement Plan Coordinator and State Transformation Specialist

4 ALIGNMENT IS OUR GOAL
The State-identified Measurable Result is based upon the second portion of the Theory of Action.

5 Colorado SSIP Plan, Do, Study, Act
MODEL FOR IMPROVEMENT
What are we trying to accomplish?
How will we know that a change is an improvement?
What change can we make that will result in improvement?
PDSA Cycle and Model for Improvement (1994)
Citation: Moen, R. (2009). Foundation and History of the PDSA Cycle. Detroit, MI: Associates in Process Improvement.
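The Model for Improvement pairs those three questions with the PDSA cycle. As a minimal illustration (ours, not from the slides), the Python sketch below records one PDSA iteration; every name and field here is hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class PDSACycle:
    """One Plan-Do-Study-Act iteration (hypothetical structure)."""
    aim: str          # What are we trying to accomplish?
    measure: str      # How will we know that a change is an improvement?
    change_idea: str  # What change can we make that will result in improvement?
    observations: list = field(default_factory=list)

    def study(self, baseline: float, result: float) -> str:
        """Compare the post-change measure to baseline and decide the next act."""
        self.observations.append((baseline, result))
        if result > baseline:
            return "adopt"  # the change is an improvement: standardize it
        return "adapt"      # revise the change idea and run another cycle

cycle = PDSACycle(
    aim="Improve early reading outcomes",
    measure="Percent of students at benchmark on DIBELS Next",
    change_idea="Structured literacy routine with coaching",
)
print(cycle.study(baseline=0.55, result=0.61))  # -> adopt
```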

6 6 Core Principles of Improvement
Organize as networks (social learning)
Learn through disciplined inquiry
Embrace measurement
See the system
Attend to variability
Be problem specific and user-centered
Citation: Bryk, A. S. (2018, April 3). Advancing quality in continuous improvement. Speech presented at the Carnegie Foundation Summit on Improvement in Education, San Francisco, CA.

7 Anthony Bryk, Keynote Speech: Advancing quality in continuous improvement
Lesson 1: Take time to really understand the problem you have to solve
Lesson 2: Develop evidence that truly informs improvement
Lesson 3: As problems increase in their complexity, engage and activate the diversity of expertise assembled in improvement networks
Lesson 4: It is a paradigm shift: a cultural transformation in the ways schools work

8 Practice to Policy Feedback Loop (National Implementation Research Network)
Professional learning, technical assistance, and time for educators to do the work the way it was intended. Changes in practice and policy to produce system change. NIRN - Systemic Change

9 Fidelity Assessment Measuring Implementation
“Fidelity assessment is defined as indicators of doing what is intended.”

10 SSIP Structured Literacy Project Fidelity Assessment
IMPLEMENTATION COACHES Colorado State Systemic Improvement Plan

11 Phase I, Year One - Appendix J Structured Literacy Routine Rubric
Practice Profile:
Overall Routine
Overall understanding of Language Structures
Knowledge of Early Reading development
Pacing
8 Instructional Components
All educators participate in ongoing learning opportunities for each element included on the practice profile.

12 Table 1: Method to categorize fidelity assessment items
Citation: Retrieved from Module 7 of the Active Implementation Hub on July 6, 2018

| Type of Assessment | Direct Observation | Record Review | Ask Others |
| --- | --- | --- | --- |
| Context | Organization of the classroom and student groups | Lesson plan is available for review | Interview the Master Teacher re: teacher’s planning and preparation activities |
| Content | Lesson plan followed during class period | Lesson plan contains key elements of an effective approach to teaching the subject | Interview Master Teacher re: use of agreed-upon curriculum; ratings of Master Teacher regarding reliable inclusion of effective instructional approaches in lesson plans |
| Competence | Observation of teachers’ engagement of students with information and questions, and teachers’ use of prompt and accurate feedback | Review coaching notes to see progress on instructional skill development | Interview students re: instruction methods used by the teacher |

13 Structured Literacy Routine Rubric Evaluation by Implementation Coaches (3 times per year)
CDE Literacy Specialists IMPLEMENTATION COACHES

14 Comparison of the percentage of students progressing from kindergarten ( SY) to first grade ( SY) in each of the DIBELS Next performance ranges (19 schools)
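A comparison like the one on this slide can be computed directly from benchmark records. The sketch below is illustrative only: the range labels follow DIBELS Next conventions, but the counts are invented, not the Colorado data.

```python
from collections import Counter

def range_percentages(counts):
    """Return each performance range's share of students as a percentage."""
    total = sum(counts.values())
    return {r: round(100 * n / total, 1) for r, n in counts.items()}

# Hypothetical counts of students in each DIBELS Next range for two cohorts
kindergarten = Counter({"Above Benchmark": 40, "At Benchmark": 80,
                        "Below Benchmark": 50, "Well Below Benchmark": 30})
first_grade = Counter({"Above Benchmark": 55, "At Benchmark": 85,
                       "Below Benchmark": 40, "Well Below Benchmark": 20})

for label, cohort in [("Kindergarten", kindergarten), ("First grade", first_grade)]:
    print(label, range_percentages(cohort))
```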

15 Table 2: Fidelity scores as a diagnostic tool.
Citation: Retrieved from Module 7 of the Active Implementation Hub on July 6, 2018

| | High Fidelity | Low Fidelity |
| --- | --- | --- |
| Good Outcomes | Celebrate! | Re-examine the innovation and modify the fidelity assessment |
| Poor Outcomes | Modify the intervention | Start over |
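The table’s logic is simple enough to encode as a lookup, which can be handy when screening many sites at once. This sketch is ours, not from the Active Implementation Hub, and the score cutoffs are hypothetical.

```python
# Decision table from the slide, keyed by (fidelity, outcomes)
ACTIONS = {
    ("high", "good"): "Celebrate!",
    ("low", "good"): "Re-examine the innovation and modify the fidelity assessment",
    ("high", "poor"): "Modify the intervention",
    ("low", "poor"): "Start over",
}

def recommend(fidelity_score: float, outcome_score: float,
              fidelity_cutoff: float = 0.8, outcome_cutoff: float = 0.6) -> str:
    """Map raw scores onto the slide's diagnostic quadrants (cutoffs are hypothetical)."""
    fidelity = "high" if fidelity_score >= fidelity_cutoff else "low"
    outcomes = "good" if outcome_score >= outcome_cutoff else "poor"
    return ACTIONS[(fidelity, outcomes)]

print(recommend(0.9, 0.4))  # high fidelity, poor outcomes -> Modify the intervention
```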

16 Our Next Steps Using the 6 Core Principles of Improvement
Organize as networks (social learning)
Learn through disciplined inquiry
Embrace measurement
See the system
Attend to variability
Be problem specific and user-centered
Citation: Bryk, A. S. (2018, April 3). Advancing quality in continuous improvement. Speech presented at the Carnegie Foundation Summit on Improvement in Education, San Francisco, CA.

17 EXPLORATION PHASE Consideration of a new evaluation tool
Network Leadership – Build, Manage, and Evaluate Effective Networks Citation: Retrieved from the Partner Tool website on July 6, 2018

18 How can I use the Partner tool data?
“PARTNER data are meant to be used as a Quality Improvement process, focused on strategic planning (to steer decision-making).” Citation: Retrieved from the Partner Tool website on July 6, 2018

19 Using the PDSA Cycle with the PARTNER tool
“To do this you will need to: identify your goals (plan), implement your collaborative activities (do), gather PARTNER data (study), and develop action steps to get you from where you “are” to where your goals indicate you “should be” (act).” Citation: Retrieved from the Partner Tool website on July 6, 2018

20 Reflection and Learning
To increase your awareness, please read the handout with information about the tool.
Based upon your knowledge at this point, would this tool be useful to you?
Discuss with your elbow partner what may be beneficial and why.
Or access the material directly from the PARTNER Tool website.
Citation: Retrieved from the PARTNER Tool website on July 6, 2018

21

22 Agenda
What is CIID
Goal of evaluation efforts
Ensure understanding of CIID approach and work; align the plans with that work
Execute the plan and collect the data
Tools used in the effort
Evaluate the data; review recommendations for making changes and improving our work

23 What is CIID
Center for the Integration of IDEA Data
Resolve data silos
Improve the quality and efficiency of EDFacts reporting
Develop and implement an automated reporting tool – Generate

24 Evaluation Effort Goal
Collaborative effort with an external evaluator to establish a comprehensive evaluation program to improve Technical Assistance (TA) services to SEAs

25 Understanding CIID Work
CIID work is a little different – a significant part of the work is IT project management
Focus on people, process, and systems of data management
Evaluation team included in the work – helps us shape the work
From leadership meetings, to calls with TA providers, to training
Access to all documentation

26 Data Collection Plan
Establish timeframes
Think about workflow in SEAs and with TA efforts
Determine methods for different activities
Focus groups
Manage burden with an annual survey
Ensure coverage of Universal, Targeted, and Intensive TA

27 Execution of the Plan
Collaborative effort
Communication – CIID leadership involved
Identify audiences
Challenge: coordinating across special education, IT, and EDFacts staff in SEAs

28 Evaluation Tools
Data System Framework
Adapted from the DaSy Data System Framework used in Part C
Administered before and after
Determines whether the goals and objectives of data integration were met
Intensive Technical Assistance Quality Rubric – David will cover in detail
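Because the framework is administered before and after, the core analysis is a pre/post comparison by framework section. A minimal sketch, assuming hypothetical section names and 1–5 self-ratings (the real DaSy framework has its own sections and scale):

```python
# Hypothetical pre/post self-ratings (1-5) by framework section
pre = {"Purpose & Vision": 2, "Data Governance": 2, "System Design": 3, "Data Use": 2}
post = {"Purpose & Vision": 4, "Data Governance": 3, "System Design": 4, "Data Use": 3}

def pre_post_change(pre, post):
    """Return the change in rating for each section, pre to post."""
    return {section: post[section] - pre[section] for section in pre}

for section, delta in pre_post_change(pre, post).items():
    print(f"{section}: {delta:+d}")  # e.g. 'Purpose & Vision: +2'
```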

29 Evaluation Results
Review reports in detail with the evaluation team
Reports include recommendations
Share reports with TA providers
Explore options for addressing areas that need improvement

30 Making Changes
Based on focus group information – changed engagement with stakeholders
Individual meetings with TA providers
Trainings with TA providers
Example – CEDS Tools

31 Discussion Question
How do we link our collaborative and comprehensive effort to improve services to SEAs with educational outcomes for students with disabilities?

32 Contact Information
Bill Huennekens
Follow us on
Sign up for our newsletter:
Visit the CIID website:
Follow us on LinkedIn:

33 Thank You!

34 Evaluating Implementation and Improvement: Getting Better Together
Evergreen Evaluation & Consulting, Inc.
David Merves, MBA, CAS

35 New Evaluation Tools to Inform Project Improvement
Intensive Technical Assistance Quality Rubric
PARTNER Tool
Measuring Fidelity as a Method for Evaluating Intensive Technical Assistance

36 Intensive Technical Assistance Quality Rubric
Developed by the IDEA Data Center (IDC)
It is still in DRAFT form
Guiding Principles for TA
Four Components of Effective Intensive TA

37 PARTNER TOOL
Designed by Danielle Varda with funding from the RWJ Foundation
Housed at the University of Colorado Denver
Program to Analyze, Record, and Track Networks to Enhance Relationships
Participatory Approach, PARTNER Framework, and FLEXIBLE

38 Measuring Fidelity as a Method for Evaluating Intensive Technical Assistance
Presentation at a TACC meeting in September 2015 by Jill Feldman and Jill Lammert of Westat:
1. Identify the key components of the intensive TA your Center provides
2. Define and operationalize measurable indicators for key intensive TA components in your logic model
3. Identify data sources and measures for each indicator
4. Create numeric thresholds for determining whether adequate fidelity has been reached for each indicator
5. Assign fidelity ratings (scores) for each indicator and combine indicator scores into a fidelity score for the key component
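Steps 4 and 5 amount to scoring each indicator against a numeric threshold and rolling indicator scores up into a component-level fidelity score. The sketch below is our illustration of that arithmetic; the indicators, thresholds, and simple averaging rule are hypothetical choices, not Westat’s specification.

```python
# Hypothetical indicators for one key intensive-TA component:
# indicator name -> (measured value, threshold for "adequate fidelity")
indicators = {
    "monthly_ta_contacts": (10, 8),        # contacts delivered vs. required
    "work_plan_milestones": (0.75, 0.80),  # proportion completed vs. required
    "state_team_attendance": (0.90, 0.80),
}

def indicator_score(value: float, threshold: float) -> int:
    """Score 1 if the indicator meets its fidelity threshold, else 0."""
    return int(value >= threshold)

def component_fidelity(indicators) -> float:
    """Combine indicator scores into one component score (simple mean here)."""
    scores = [indicator_score(v, t) for v, t in indicators.values()]
    return sum(scores) / len(scores)

print(f"Component fidelity score: {component_fidelity(indicators):.2f}")  # 0.67
```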

39 Discussion Question
How would you use any of these tools/instruments in your work?

40 THANK YOU
David Merves, Evergreen Evaluation & Consulting, Inc.
(802)

41 2018 OSEP Project Directors’ Conference
OSEP Disclaimer
DISCLAIMER: The contents of this presentation were developed by the presenters for the 2018 Project Directors’ Conference. However, these contents do not necessarily represent the policy of the Department of Education, and you should not assume endorsement by the Federal Government. (Authority: 20 U.S.C. 1221e-3 and 3474)

