OSEP Project Directors Meeting July 2018


A System Perspective on Examining the Effectiveness of Professional Development
Kathleen Hebbeler, SRI International
Megan Vinh, Frank Porter Graham Child Development Institute
OSEP Project Directors Meeting, July 2018

What we will cover
- What is systems thinking? What is an EI/ECSE/SE system?
- Implications of systems thinking for evaluating PD
- Evaluating the delivery of PD vs. the system for PD
- Asking system-level evaluation questions
- How to analyze data to address questions at different levels of the system

What is systems thinking? What is an EI/ECSE/SE system?

Systems Thinking
- Interconnectedness
- Feedback loops
- System structure and understanding systems at different scales
- Different perspectives

Interconnections
(Diagram: nested levels of the system, from federal policies to state systems to local/regional systems to practices, that is, the interactions of providers with children and families, all supporting good outcomes for children and families.)
- Recognizing interconnections
- Identifying and understanding feedback
- Understanding system structure
- Understanding systems at different scales
(Presenter note: discuss systems thinking.)

What is infrastructure in EI and ECSE?
- The state and local systems that support the provision of services
- The state system is what the state agency needs to put in place to support implementation of IDEA and the provision of evidence-based practices:
  - Policies
  - Procedures
  - Other supports (e.g., data system, leadership team, training modules, coaches, implementation teams)
- Infrastructure changes over time but has a (semi-)permanent quality: it is not person-dependent and continues when staff leave
- Providing a one-time training session is not infrastructure change; establishing procedures to repeat that training session every 6 months is infrastructure change

What does a state need to put into place to support implementation of effective practices?
(Diagram: the system framework components (Governance, Finance, Personnel/Workforce, Data System, Quality Standards, and Accountability & Quality Improvement), which together support implementation of effective practices and, as a result, good outcomes for children with disabilities and their families.)
Building High-Quality Systems: ectacenter.org/sysframe

Components of the state system
- Governance
- Finance
- Personnel/Workforce
- Accountability & Quality Improvement
- Quality Standards
Important: a well-functioning state system requires all the components to work well. It is not additive; it is not a hierarchy.

Personnel subcomponent
- Leadership, Coordination and Sustainability (leadership team, written plan for CSPD)
- State Personnel Standards (aligned with national standards, across disciplines)
- Preservice Professional Development (IHE programs aligned with standards; address EC content)
- In-service Professional Development (statewide systems for PD and TA, across disciplines; aligned with higher ed)
- Recruitment and Retention (have good strategies)
- Evaluation (evaluation plan looks at all subcomponents; data collected and used to drive improvements)

Implications of systems thinking for evaluating PD

One and done vs. "true" infrastructure change
- Nothing related to the inservice PD "parts" is one and done. You need to plan for ongoing PD and for the sustainability of the infrastructure.
- The evaluation issue: not just "did we do 'it'" (the training, the coaches, the module, etc.), but also "did we put procedures in place so 'it' can be repeated to reach new practitioners in later years?"

Making sustainable change
- Evaluation also needs to address whether the PD activity will be sustained in the future.
- Sustainability:
  - Requires that you use the evaluation findings from Round 1 of implementation to improve Round 2, and so on.
  - Is closely tied to the "reach" challenge: how long does it take for the state to "reach" all the practitioners?
- If you have only planned to implement a one-time or time-limited part of inservice PD, you are not making an infrastructure change.
- If your evaluation is only looking at a one-time or time-limited implementation of a PD part, you are not evaluating infrastructure change.
- Infrastructure change for personnel involves making (semi-)permanent changes to the personnel system.

Asking system-level evaluation questions

System-level Questions
Questions may be:
- Very global
- Somewhat less global
- Specific

Examples of evaluation questions for personnel/workforce
- (Very global) How good is our personnel infrastructure? Another way of saying "do we have all the subcomponents in place, and are they of high quality?"
- (Somewhat less global) How good is our inservice professional development? Another way of saying "do we have the quality indicators for inservice professional development in place?"
- (Getting specific) Does our inservice professional development employ evidence-based practices… Do we have one or two elements of the quality indicator in place?

Evaluation questions related to the inservice PD system
- What is our state's planned inservice personnel development system? How does this compare to our current system?
  - The answer is a description of the parts and intended changes (training modules, coaches, etc.)
- Was the planned inservice PD system implemented effectively and efficiently?
  - Questions about whether the training module is accessible, trainings are held, coaches are hired and supervised, etc.
- How effective was the new inservice PD system (i.e., the parts working together)? There are two dimensions to an effective PD system:
  - High quality (practitioners learn the intended content and skills): is the system effective in supporting practitioners to acquire the intended scope of new knowledge and skills?
  - Extensive reach (all or most practitioners are provided the opportunity to learn): have all of our practitioners acquired the new knowledge and skills?
- Reach is often overlooked. From a top-down perspective, the state needs to improve the skills of all of the practitioners.

Evaluation questions focused on one part of the inservice PD system
Was the professional development activity implemented effectively and efficiently?
- Was the training session held? How many people were expected? How many attended?
- How many coaches were to be hired? How many were hired? Did the coaches provide coaching? How many practitioners were assigned coaches? How many sessions did each receive? Etc.
- Was the training module made available? How many persons accessed it?
- Was a state implementation team formed? What was the membership? How many times did the team meet? Did the team function effectively?
In short: did we put it in place? Did it work, i.e., get the results we intended?

Evaluation questions focused on one part of the inservice PD system
How effective was the professional development activity in changing practitioner behavior?
- Training: Did practitioners find the training to be of high quality? Did [all targeted] practitioners acquire new knowledge as a result of the training? Did participants implement targeted practices with fidelity?
- Coaching: Did practitioners find the coaching to be of high quality? Did practitioners acquire new knowledge as a result of the coaching? Did participants implement targeted practices with fidelity? How much coaching did it take for practitioners to reach fidelity? How many months did it take?
- Training module: Did practitioners find the training module to be of high quality? Did practitioners acquire new knowledge? Did participants implement targeted practices with fidelity?
- State team: Did the state team achieve its objectives?*
*Note that this question is at a different level than the others.

And the evaluation question you should always ask…
If not, why not?
- You need to plan to collect the data to see if there has been a breakdown in implementation somewhere along the line:
  - The training was great, but not many people attended.
  - The training was not sufficient to bring about behavior change.
  - Coaches were hired, but practitioners didn't like working with them.
- Good news: these issues are fixable going forward.

Analyzing data to address system-level questions

Analyzing data at the right level
- How do we frame evaluation questions so they lead to the right level of data analysis?
- Working assumption: stakeholders at different levels of the system need information to support the kinds of decisions they can make and the resources they can bring to bear. A state administrator's responsibilities differ from those of a local administrator.
- Remember that a key element of systems thinking is taking multiple perspectives. Even when focused on exactly the same thing, for example the provision of coaching or a practitioner implementing a practice, the evaluation questions need to reflect the perspective of whoever needs to make decisions from the information.

Example of data types: "nested," "roll up," "unit of analysis"
Different levels of the system require different evaluation questions, which means different types of data for different levels.

A theory of action
(Diagram: state supports put in place and local supports put in place → coaching provided with fidelity → evidence-based practices implemented with fidelity → child outcomes improve.)
Presenter note: What you see on this slide is a very basic theory of action for how supports for coaching, and the provision of coaching, lead to improvements in child outcomes. The state and the local program put the appropriate supports for practice-based coaching in place, coaching is provided, the practitioner implements evidence-based practices with fidelity, and child outcomes improve. This example uses coaching, but any PD intervention could go in the middle box. This is simple, right? It is an accurate representation of the logic of systems change through coaching, but the representation is in fact overly simple.

Asking the right question
Different levels need to ask different questions about evidence-based practices. Note that benchmarks or performance measures are built into the questions.
- State leader(s): e.g., what % of local programs/districts have at least 80% of practitioners at fidelity?
- Local leader level: e.g., what % of our practitioners are implementing the EBP with fidelity?
- Practitioner/coach level: e.g., is the practitioner implementing the EBP with fidelity?

Negative examples
State-level questions:
- How many practitioners achieved fidelity by March?
- What percentage of practitioners achieved fidelity by March?
- How long did it take the average practitioner in the state to achieve fidelity?
Interesting, but what will the state do with this information? Is there a way to ask a more actionable question? Yes: make the unit of analysis the local program!
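The shift to the local program as the unit of analysis can be sketched in plain Python. This is not from the presentation; the records, program names, and the 80% benchmark are hypothetical, chosen only to mirror the example questions on these slides.

```python
# Hypothetical practitioner-level fidelity records (illustrative only).
records = [
    {"program": "A", "practitioner": "p1", "at_fidelity": True},
    {"program": "A", "practitioner": "p2", "at_fidelity": True},
    {"program": "B", "practitioner": "p3", "at_fidelity": True},
    {"program": "B", "practitioner": "p4", "at_fidelity": False},
    {"program": "B", "practitioner": "p5", "at_fidelity": False},
]

def pct_at_fidelity(rows):
    """Local-leader question: what % of our practitioners are at fidelity?"""
    return 100 * sum(r["at_fidelity"] for r in rows) / len(rows)

def pct_programs_meeting(rows, benchmark=80):
    """State-leader question: what % of programs have at least
    `benchmark`% of their practitioners at fidelity?"""
    by_program = {}
    for r in rows:
        by_program.setdefault(r["program"], []).append(r)
    meeting = [p for p, rs in by_program.items()
               if pct_at_fidelity(rs) >= benchmark]
    return 100 * len(meeting) / len(by_program)

print(pct_at_fidelity(records))       # 60.0: statewide practitioner-level figure
print(pct_programs_meeting(records))  # 50.0: program as the unit of analysis
```

In this toy data, the statewide practitioner-level figure (60%) hides that program A is fully at fidelity while program B is struggling; rolling the same data up to the program level points the state toward where to act.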

Different levels [should] ask different questions about coaching
- State leader(s): e.g., what % of local programs/districts have at least 80% of practitioners who received 7 of the 10 planned coaching visits?
- Local leader level: e.g., what % of our practitioners received at least 7 of the 10 planned coaching visits? What % of coaches provided 80% of their practitioners with 7 or more coaching visits?
- Practitioner/coach level: e.g., how much coaching did the practitioner receive?
(Presenter note: ask the audience to develop negative examples for the state and local levels.)
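The same nesting applies to coaching dosage: one data set can answer the local leader's two questions by switching the unit of analysis from practitioner to coach. A minimal sketch, using hypothetical visit logs and the 7-of-10 and 80% thresholds from the slide's example questions:

```python
# Hypothetical coaching logs: visits received per practitioner (illustrative).
visits = [
    {"coach": "c1", "practitioner": "p1", "visits": 9},
    {"coach": "c1", "practitioner": "p2", "visits": 7},
    {"coach": "c2", "practitioner": "p3", "visits": 8},
    {"coach": "c2", "practitioner": "p4", "visits": 3},
]

def pct_practitioners_dosed(rows, minimum=7):
    """Practitioner as unit: % of practitioners with at least `minimum` visits."""
    return 100 * sum(r["visits"] >= minimum for r in rows) / len(rows)

def pct_coaches_on_track(rows, minimum=7, share=80):
    """Coach as unit: % of coaches who gave at least `share`% of
    their practitioners `minimum` or more visits."""
    by_coach = {}
    for r in rows:
        by_coach.setdefault(r["coach"], []).append(r)
    on_track = [c for c, rs in by_coach.items()
                if pct_practitioners_dosed(rs, minimum) >= share]
    return 100 * len(on_track) / len(by_coach)

print(pct_practitioners_dosed(visits))  # 75.0: practitioner-level view
print(pct_coaches_on_track(visits))     # 50.0: coach-level view
```

Here the practitioner-level view (75% adequately dosed) looks acceptable, while the coach-level view shows that one of the two coaches is not delivering the planned dosage, which is the actionable finding for a local leader.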

Revisiting the theory of change
(Diagram, annotated: state supports put in place; local supports put in place …for each of the locals; coaching provided with fidelity …by each of the coaches in those localities; evidence-based practices implemented with fidelity …by each of the practitioners who received coaching; child outcomes improve …for each of the children who worked with those practitioners.)
Presenter note: Let's revisit the theory of change to see some of the assumptions in the model that lead to better questions and more appropriate analyses. Recall the notion of one-to-many we talked about earlier. The question about local supports is really a question about supports in each of the locals. The question about coaches is about each of the coaches in each of the local programs. Similarly, we want to know about each of the practitioners who worked with those coaches, and, finally, the outcomes experienced by each of the children. Notice also how this chain can be broken and what happens. If one of the locals never hires coaches, or hires unqualified coaches, this will have a major impact on the practitioners and children nested within that program: the chain is broken for that program. Similarly, if one of a program's coaches is ineffective, then the practitioners assigned to that coach are not likely to show improved practices. In thinking about PD evaluation from a systems lens, we want to develop a set of evaluation questions that provides actionable information at each point in the chain.

Professional development as a tool for systems change at the state level requires…
- good implementation in all of the targeted districts/local programs, which requires…
- all of the local coaches delivering effective coaching (high quality, sufficient dosage), which should lead to…
- all practitioners who received coaching implementing EBPs with fidelity, which should lead to…
- improved outcomes for the children served by those practitioners.

Good evaluation of PD
Good evaluation questions reflect an understanding of what has to happen for the intended outcomes to be achieved at each level. They also reflect an understanding of where and how things might go wrong.

Any questions for us?

Some questions for you
- Why is implementing coaching across the state not necessarily a change in personnel infrastructure? What has to happen for the change to be considered an infrastructure change?
- What are some alternative units of analysis for looking at data at the state level? What are the advantages and disadvantages of different ways of looking at the same data?
- Why is the level of information needed by coaches/mentors too much information for program directors?

For more information
- Visit the DaSy website: http://dasycenter.org/
- Follow DaSy on Twitter: @DaSyCenter
- Visit the ECTA website: http://ectacenter.org/
- Follow ECTA on Twitter: @ECTACenter

Thank you
The contents of this tool and guidance were developed under grants from the U.S. Department of Education, #H326P120002 and #H326P170001. However, those contents do not necessarily represent the policy of the U.S. Department of Education, and you should not assume endorsement by the Federal Government. Project Officers: Meredith Miceli, Richelle Davis, and Julia Martin Eile.
Instructions to presenters: This slide is to be included as the last slide in your deck, but you are not expected to show it to the audience. Please be sure to delete these instructions from this slide's notes page in your presentation.