Evaluating Practices Workshop Series


1 Evaluating Practices Workshop Series
Session 1: What are we measuring and why? Shoring up your foundation for evaluating practice change and practice fidelity: evaluation questions, outcomes, and performance indicators. This workshop series is sponsored by DaSy and ECTA, in collaboration with IDC and NCSI. February 1, 2018

2 Welcome! Who is with us? Part C and Part B staff from over 40 states!

3 Intended Outcomes of this Session
Understand the difference between evaluating practice change and evaluating practice fidelity.
Understand what good practice change and practice fidelity outcomes, evaluation questions, and performance indicators look like.
Understand how to align intended outcomes, evaluation questions, and their related performance indicators.
Identify improvements needed to state evaluation plans for measuring both practice change and practice fidelity.

4 Working Session Agenda
Ground Rules
Summary of Pre-work
Presentation – Evaluating Practice Change and Practice Fidelity
Large Group Activity – Aligning outcomes & performance indicators to activities and evaluation questions
Small Group Work – Developing practice change and practice fidelity outcomes and performance indicators
State identification of improvements/changes needed
Wrap-Up & Plan for Session 2

5 Ground Rules
Share openly (we are not recording)
What is shared in the room stays in the room
Listen actively
Ask one another questions
Participate to the fullest of your ability
Participate in all 3 sessions
Come prepared – complete pre-work
Let us know your needs

6 POLL EVERYWHERE: Word Cloud
Instructions:
Go to pollev.com/dasy on your computer, smartphone, or tablet, OR text "DASY" to 22333.
Once you have joined, text your one-word response to the question.
Question: In one word, what comes to mind when you think about evaluating practice change and practice fidelity?

8 Summary of Pre-work Most state teams reported measuring practice fidelity. Many reported measuring practice change. There is variation in the ways states are approaching measuring practice change and practice fidelity. Many states have collected data on practice change/fidelity.

9 Practice Implementation Improves Results
Implementation of effective practices leads to good outcomes for children with disabilities and their families:
Increase quantity (e.g., scaling up, more practices)
Increase quality
Sustain practice quality over time
You must have a strong infrastructure to support implementation of practices. It is the practices, implemented as intended over time, that yield good outcomes for children and families.

10 Definition of Practices
The teachable and doable behaviors that practitioners use with children and families, which can be used, replicated, and measured for fidelity.
A practice needs to be clearly defined in order to be measured. Practices are behaviors and actions of practitioners, not awareness, knowledge, confidence, or self-efficacy. These are important to measure, but they are not the practices being implemented; they are important precursors that set the stage for change in practitioner behavior. Practices need to be measurable and operationalized so everyone knows what they are and what effective implementation looks like. Only then can you measure implementation.

11 Definitions
Practice Change: An increase or decrease in the number, frequency, precision, or quality of practices implemented by a practitioner, compared across at least two points in time.
Fidelity: The extent to which practitioners implement an evidence-based program or practice as intended, measured against a threshold or predetermined level. Fidelity implies strict and continuing faithfulness to the original innovation or practice that is expected to lead to the desired child/family outcomes.
Note: these terms are often used in different ways. These are the definitions we will use throughout the workshops; they are consistent with the definitions from OSEP, DaSy, ECTA, and other TA centers.
Practice change is assessed across routines, activities, children, families, and settings. Fidelity needs to be maintained over time: it is assessed at a point in time, but once a practitioner reaches fidelity, do not stop assessing it. Fidelity should be measured periodically to ensure practitioners maintain it; when they do, positive outcomes can be expected. We will talk more about fidelity thresholds and measurement in Session 2.

12 Relationship between Practice Change and Fidelity
Both practice change and practice fidelity are about the implementation of evidence-based practices (EBPs):
Are service providers changing their implementation of EBPs?
Are service providers implementing EBPs as intended?
Service providers can demonstrate changes in practices without reaching fidelity.
Evaluating implementation of the evidence-based practice(s):
Fidelity – evaluating the degree to which practitioners implement the EBP as intended.
Practice change – evaluating the degree to which practitioners change their implementation of practices relative to fidelity: change in the frequency, intensity, or quality of implementation; progress toward fidelity.
Measuring fidelity is important, but don't forget to also examine your data for practice change. OSEP wants to see progress, and practice change data can demonstrate progress toward fidelity. Don't wait until practitioners reach fidelity to report on their implementation of the evidence-based practices. For example, on a fidelity checklist, even if a practitioner does not meet the threshold for fidelity, she/he can demonstrate practice change and progress toward fidelity by obtaining a higher score in May than in January.
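The January-versus-May checklist example can be expressed as a small calculation. This is a minimal sketch: the scores and the 80% fidelity threshold are hypothetical, not taken from any particular tool, and real fidelity measures define their own scoring rules.

```python
# Hypothetical fidelity-checklist scores for one practitioner,
# expressed as the percent of checklist items implemented as intended.
FIDELITY_THRESHOLD = 80  # assumed threshold; real tools set their own

january_score = 55
may_score = 70

# Practice change: a comparison across at least two points in time.
practice_change = may_score - january_score

# Fidelity: a comparison against a predetermined threshold at one point in time.
at_fidelity_in_may = may_score >= FIDELITY_THRESHOLD

print(f"Change from January to May: {practice_change:+d} points")
print(f"At fidelity in May: {at_fidelity_in_may}")
```

Here the practitioner shows clear practice change (+15 points) and progress toward fidelity, even though the May score is still below the threshold, which is exactly the kind of interim progress worth reporting.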

13 Measuring Practice Change and Fidelity
Use the same tools or methods to evaluate both fidelity and practice change; practice change can be evaluated in relation to fidelity. Data on practice change provide formative evidence that progress is being made in implementing the evidence-based practices before practitioners reach fidelity. Practice change data provide valuable information about your progress, so don't skip collecting them. We will talk more about measurement and analysis in Sessions 2 and 3.

14 Why measure practice change and practice fidelity?
To ensure practices are implemented in a way that makes improved outcomes likely
To determine whether practitioners maintain fidelity over time
To obtain information for monitoring progress: incremental increases in fidelity can indicate that improvement strategies are working and highlight areas where practitioners need additional support, so corrections/adjustments in practice can be made in a timely manner
Children and families do not benefit from programs/practices they don't receive; i.e., if EBPs are not being implemented, not being implemented well, or not being sustained over time, then improved outcomes should not be expected. When a practitioner reaches fidelity of practice implementation and maintains it over time, that yields achievement of outcomes. A change in practices does not by itself necessarily lead to improved outcomes. Evaluating practice change is an interim measure of progress in implementing practices prior to the longer-term outcome of practitioners reaching and maintaining fidelity.

15 Evaluation Plan Components
Outcomes – Statements of the benefit or change you expect as a result of the completed activities. Outcomes can vary along two dimensions: (1) when you would expect the outcome to occur, i.e., short-term, intermediate, or long-term (impact); and (2) the level at which you are defining your outcome, e.g., state level, local/program level, practitioner, or child/family.
Evaluation Questions – The key questions the state wants to answer with the evaluation, usually focused on two main areas. Process/implementation: How's it going? Are we effectively implementing our planned activities? Outcomes: What good did it do? Are we achieving the results we intended?
Performance Indicators – Items of information that provide evidence that a certain condition exists or that certain results have or have not been achieved. There are a number of types of indicators, including those that measure inputs, process, outputs, and outcomes. Good performance indicators identify specific, observable, and measurable pieces of information and use terms such as "number of", "percent of", "mean of", or similar phrases.
Measurement/Data Collection Methods – Identify the evaluation methods that will be used to collect data for each indicator and from whom the data will be collected.

16 Example Evaluation Plan
Type of Outcome: Intermediate – Practice Change
Outcome Description: Practitioners who receive coaching in implementing relationship-based intervention practices increase use of those practices with families and infants/toddlers with social-emotional needs.
Evaluation Question: Do practitioners who receive coaching increase implementation of relationship-based practices with families and infants/toddlers with social-emotional needs?
How Will We Know the Intended Outcome Was Achieved? (performance indicator): 80% of providers who receive coaching increase their use of the relationship-based practices from the checklist 6 months after training when working with children and families.

Type of Outcome: Intermediate – Practice Fidelity
Outcome Description: Practitioners implement relationship-based intervention practices with fidelity to support families in achieving goals for the child and the family.
Evaluation Question: Do practitioners who receive coaching implement practices with fidelity?
How Will We Know the Intended Outcome Was Achieved? (performance indicator): 80% of practitioners implement the targeted practices from the checklist with fidelity.

These are practice change and practice fidelity outcomes with their related evaluation questions and performance indicators. The practice change example measures an increase in the practitioners' use of the practices, whereas the fidelity measure focuses on the practitioner reaching fidelity. (Your state would need to define fidelity based on the tools you use to measure the performance indicator; we will talk about measurement in the next working session.) Note that in this example both practice change and practice fidelity are measured using the same tool. Reminder: practice change typically occurs prior to reaching fidelity and can serve as an interim measure to report progress toward fidelity and improving outcomes.
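The two "80% of providers" indicators in this example plan could be computed from checklist data roughly as follows. This is a hedged sketch: the practitioner scores and the fidelity threshold are invented for illustration, and a real state would substitute its own tool's scoring and definition of fidelity.

```python
# Hypothetical checklist scores (percent of items implemented as intended)
# for each practitioner: (before coaching, ~6 months after training).
scores = {
    "practitioner_a": (50, 85),
    "practitioner_b": (60, 75),
    "practitioner_c": (40, 90),
    "practitioner_d": (70, 70),
    "practitioner_e": (55, 82),
}
FIDELITY_THRESHOLD = 80  # assumed; states define their own threshold

n = len(scores)

# Practice-change indicator: percent of providers who increased
# their use of the checklist practices between the two time points.
increased = sum(1 for before, after in scores.values() if after > before)
pct_increased = 100 * increased / n

# Practice-fidelity indicator: percent of practitioners implementing
# the targeted practices at or above the fidelity threshold.
at_fidelity = sum(1 for _, after in scores.values() if after >= FIDELITY_THRESHOLD)
pct_at_fidelity = 100 * at_fidelity / n

print(f"Increased use: {pct_increased:.0f}% (target: 80%)")
print(f"At fidelity:   {pct_at_fidelity:.0f}% (target: 80%)")
```

Note that both indicators are computed from the same checklist data, and in this invented sample the practice-change target is met (80%) while the fidelity target is not yet (60%), mirroring the point that practice change typically precedes reaching fidelity.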

17 Performance Indicators
Alignment Considerations for Outcomes, Questions, and Performance Indicators
Outcomes: describe what you intend to achieve as a result of activities related to EBPs; are often interconnected; define steps toward achieving the SiMR.
Evaluation Questions: describe what you need to know to determine if you have achieved the outcome. If an outcome focuses on practice change, the question focuses on what you want to know about practice change.
Performance Indicators: describe how you will answer your evaluation question; are based on measurement data (e.g., extant or new).
The next working session will focus on measurement.

18 Large Group Activity: Alignment
Outcome Description: Providers use evidence-based practices to support social-emotional skills with parents and infants/toddlers identified with social-emotional needs.
Evaluation Question: Do providers know how to use evidence-based practices to support social-emotional skills with parents and infants/toddlers identified with social-emotional needs?
Performance Indicator: 80% of providers completed training on evidence-based practices to support social-emotional skills with parents and infants/toddlers identified with social-emotional needs.
Aligning outcomes, evaluation questions, and performance indicators: Start with the outcome. Does the evaluation question align with the outcome? What might be a better evaluation question? Does it ask what you want to know and learn about the outcome? Does the performance indicator align with the evaluation question? What would be a better performance indicator? In this example, the evaluation question and performance indicator are not aligned with the outcome; the next slide shows a corrected version.

19 Large Group Activity: Alignment
Outcome Description: Providers use evidence-based practices to support social-emotional skills with parents and infants/toddlers identified with social-emotional needs.
Evaluation Question: Do providers use evidence-based practices to support social-emotional skills with parents and infants/toddlers identified with social-emotional needs?
Performance Indicator: 80% of providers use evidence-based practices to support social-emotional skills with parents and infants/toddlers identified with social-emotional needs.
Aligning a practice change outcome, evaluation question, and performance indicator: The corrected information was bolded on the slide. The evaluation question and performance indicator are now aligned with the outcome.

20 Key Take-Away Points
Practices = behaviors
Alignment, alignment, alignment
Evaluate both practice change and practice fidelity using the same measures
Evaluate practices across children, settings, families, programs
Practices are BEHAVIORS, not knowledge, self-efficacy, confidence, etc.

21 Small Breakout Group Work
Introductions
Structured Activity: Identify a practice fidelity evaluation question and performance indicator based on a practice fidelity outcome
State Sharing:
Identify whether practice change and/or practice fidelity outcomes, questions, and performance indicators need to be added or modified
Identify whether evaluation plan alignment is needed
Share what supports are needed in making these changes
We know it will be tempting to share status updates on what has been happening, but with limited time, please keep your focus on evaluating practice change and fidelity.

22 Breakout Group Instructions
We will go into virtual breakout groups in Adobe Connect.
For visuals: stay in this Adobe Connect room; you will be automatically moved into breakout rooms.
For audio: call into the conference line for each specific breakout group (next slide & in Adobe note).
Spend about min in breakout groups.

23 Small Group Agenda
Who's with us in the small group?
Structured Activity: Identify a practice fidelity evaluation question and performance indicator based on a practice fidelity outcome
State Sharing:
Identify whether practice change and/or practice fidelity outcomes, questions, and performance indicators need to be added or modified
Identify whether evaluation plan alignment is needed
Share what supports are needed in making these changes

24 Practice Fidelity Outcomes/Measures
Outcome Description: Practitioners demonstrate fidelity in their use of evidence-based practices to support social-emotional skills.
Activity: Come up with an evaluation question and a performance indicator that align with the outcome listed here.

25 Evaluation Plan Worksheet Handout
The worksheet provided may be helpful to use as you think through your evaluation plan and any changes that are needed throughout the workshops. For this workshop, you can fill in the outcome, evaluation question, and performance indicator. In sessions 2 and 3 you can think through the other components.

26 State Sharing Re: Evaluation Plan Improvements
What do you need to add or modify in your evaluation plan to effectively evaluate practice change and practice fidelity and ensure alignment (e.g., add an outcome, revise a performance indicator)? What supports do you need in making these changes to your evaluation plan?

27 Your Take-Aways
What struck you today?
What did this get you thinking about?

28 Formative Evaluation
What worked well?
What could be improved for next time?

29 Wrap-Up
Session 2: February 15th from 3:30–5:00 pm Eastern – How are we measuring? Delving deeper into measurement strategies and data sources.
Pre-work (will be sent out next week):
Defining and operationalizing practices so they are measurable
Characteristics of a high-quality tool for measuring practice change and fidelity

30 See you in 2 weeks! Thank you
The contents of this presentation were developed under grants from the U.S. Department of Education, #H373Z120002, #H326P120002, #H326R140006, and #H373Y. However, the contents do not necessarily represent the policy of the U.S. Department of Education, and you should not assume endorsement by the Federal Government. Project Officers: Meredith Miceli, Richelle Davis, Julia Martin Eile, Perry Williams, and Shedeh Hajghassemali.

