Charting the Course: Monitoring Progress Towards SIMR Attainment

1 Charting the Course: Monitoring Progress Towards SIMR Attainment

2 Session Description In this session, participants will:
learn about considerations related to using local assessment data to measure progress toward the State-Identified Measurable Result (SIMR);
reflect on strategies for monitoring progress and making associated changes in infrastructure to support implementation of evidence-based practices (EBPs); and
hear how two states, Ohio and Utah, are monitoring progress in terms of both student results and systems changes, and how they know whether they are moving toward achieving their SIMRs.

3 Presenters
Sheryl Lazarus, Associate Director, National Center on Educational Outcomes (NCEO)
Susan Hayes, Senior Program Associate, WestEd/National Center for Systemic Improvement (NCSI)
Wendy Stoica, Assistant Director, Ohio Department of Education
Leah Voorhies, Coordinator, Utah State Board of Education
Moderator: Kerry Haag, Assistant Director, Kansas State Department of Education

4 Sheryl Lazarus National Center on Educational Outcomes (NCEO)
Considerations for Using Local Assessment Data for Monitoring Progress to the SIMR Sheryl Lazarus National Center on Educational Outcomes (NCEO)

5 Overview of Forthcoming Brief
Purpose: To outline key considerations for the collection, analysis, and use of local assessment data to monitor progress toward the SIMR and support improved decision-making at the state and local levels. Primary Audience: States and other TA providers

6 Acknowledgements
Other Workgroup Members: Cesar D’Agord, Susan Hayes, Carla Howe
Additional Team Members Contributing to this Document: Rorie Fitzpatrick, Maureen Hawes, Bill Huennekens, Kathy Strunk, Martha Thurlow

7 Considerations

8 Consideration 1: Appropriateness of Measure
The local assessment should be aligned to the SIMR; most states' SIMRs are performance on the state test.
Alignment to state standards: Is the local assessment consistent enough with the state standards to ensure that when the SIMR is met, students will be well on their way to meeting the state's standards?
Contextual factors: There is a need for local buy-in; cost can be an issue; and using one assessment for several purposes reduces testing.

9 Consideration 2: Purpose of Measure
Data from the assessment used to measure progress toward the SIMR may also be used for other purposes at the local level. It can be difficult to balance the desire to introduce new local measures that are good measures of progress toward the SIMR with the desire to use existing measures that are familiar to staff and students.

10 Consideration 3: Accessibility & Accommodations
Some students with disabilities need to use accessibility features and accommodations so that they have valid and reliable scores.
IDEA requirements (Sec. __ (b)):
(1) A State (or, in the case of a district-wide assessment, an LEA) must develop guidelines for the provision of appropriate accommodations.
(2) The State's (or, in the case of a district-wide assessment, the LEA's) guidelines must: (i) identify only those accommodations for each assessment that do not invalidate the score; and (ii) instruct IEP Teams to select, for each assessment, only those accommodations that do not invalidate the score.
Reminder: IDEA also addresses participation requirements (Sec. __ (a)).

11 Consideration 4: Assessment-Curriculum Literacy of Educators
Local assessments are typically used for multiple purposes; one purpose is to inform instruction. Some educators may not have a good understanding of how assessment data can be used throughout the year to strengthen instruction. Professional development has a role to play here.

12 Consideration 5: Transparency & Data Privacy
To measure progress toward the SIMR successfully, local assessment data need to be shared with the State:
Local measures need to get into the State data system.
Concerns about data privacy can be addressed through data-sharing agreements; the State only needs summary data.
Parents need information and materials that help them understand why local assessment data are used to measure progress toward the SIMR, the importance of participation, and how it helps ensure appropriate services for their child.

13 Consideration 6: Technical Issues
Challenges are inherent when different districts use different measures; how can comparisons be made? The Common Education Data Standards (CEDS) address some of these issues. It will also be important to analyze data over time to make sure that the local measure(s) selected actually do measure progress toward the SIMR, as illustrated in the sketch below.
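A minimal sketch of that last point, not part of the original presentation: with made-up multi-year results for one district, a state analyst could check whether a local benchmark tends to move with the state assessment used for the SIMR. The data, the plain Pearson correlation, and its interpretation are all assumptions for illustration only.

```python
# Minimal sketch, assuming hypothetical district data: does a local assessment
# track the state measure used for the SIMR? All numbers are invented.
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation between two equal-length score series."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var_x = sum((x - mean_x) ** 2 for x in xs)
    var_y = sum((y - mean_y) ** 2 for y in ys)
    return cov / sqrt(var_x * var_y)

# Hypothetical percent-proficient results for one district across four years.
local_measure = [38.0, 41.5, 44.0, 47.2]   # local benchmark assessment
state_measure = [30.1, 32.0, 35.4, 37.8]   # state test (the SIMR measure)

r = pearson(local_measure, state_measure)
print(f"Correlation between local and state trends: {r:.2f}")
# A consistently high correlation over time is one piece of evidence, not proof,
# that the local measure reflects progress toward the SIMR.
```

In practice a single correlation is only a screening check; a fuller alignment study would look at content coverage and comparability across districts as well.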

14 Questions How do these considerations resonate with you?
Are there similarities to what you have experienced in your state? What other issues have you encountered?

15 Susan Hayes National Center for Systemic Improvement
Measuring Changes in Infrastructure to Support Implementation of Evidence-Based Practices: The What, Why, and How
Susan Hayes, National Center for Systemic Improvement (NCSI)

16 Overview
Types of SSIP infrastructure activities (the "what")
Value of evaluating those efforts (the "why")
Measuring outcomes and impact (the "how")
NCSI resources

17 Importance of Infrastructure to the SSIP
In Phase I, you analyzed various facets of your infrastructure that could support or pose barriers to your planned improvement strategies. In Phase II, you developed a plan to improve your infrastructure and measure those changes over time. In Phase III, you implemented infrastructure improvements and began to measure the impact of these changes. States’ infrastructure improvements are foundational to the SSIP.

18 What? The “what”

19 SSIP Infrastructure Improvements
Enhancing infrastructure improvement capacity at all levels of the system: SEA, LEA, and practitioner. (Phase III analysis is forthcoming; this discussion draws on what we know so far about how state activities are reaching all levels of the system.)
According to Phase II submissions, most states (49, or 82%) were explicit in describing how infrastructure improvements would enhance their ability to support LEAs to implement and scale up EBPs to address SIMRs.
The two most frequently identified areas for improvement were professional development (52 states, 87%) and technical assistance (46 states, 77%).

20 SSIP Infrastructure Improvements
PD and TA were the most commonly cited infrastructure improvements in the Phase II analysis: professional development (52 states, 87%) and technical assistance (46 states, 77%). A smaller number of states (22 states, 37%) planned to make improvements or enhancements to their quality standards.

21 Infrastructure: Enhancing Professional Development
PD was the most frequently identified area of infrastructure improvement (52 states, 87%). Thirty-seven of those 52 states (71%) cited in-service offerings for educators/paraprofessionals, and 25 states (48%) cited in-service offerings for administrators. Thirty-six of the 52 states (69%) also identified improvements to the topics of PD currently being provided. Nineteen states (37%) plan to better support LEAs in the quality or fidelity of PD implementation; fifteen states (29%) intend to use online modules/technology; and four states (8%) identified pre-service training for teachers as an infrastructure improvement. See Figure 3.
Other planned improvements to states' PD systems included: providing parent trainers for families to address reading interventions at home; focusing on mindful coaching; bringing together Multi-Tiered Systems of Support (MTSS) with mental health and suicide prevention programs; partnering with institutions of higher education to provide training to schools; focusing on Universal Design for Learning (UDL); and developing coaches and mentors for dual-language instructors.

22 Infrastructure: Enhancing Technical Assistance
Forty-six states (77%) reported improvements to their TA infrastructures. Of those 46 states, 34 (74%) identified coaching as a key improvement to support LEAs in implementing and scaling up EBPs to improve results for students with disabilities. Thirteen states (28%) plan to use a variety of strategies, such as online chats, study groups, or phone conferencing, and 15 states (33%) plan to use some type of technology for TA. Other TA improvements reported by states include offering online book groups to promote EBPs, providing train-the-trainer models, developing a coaching network, and providing teacher mentoring for data-informed decision-making. See Figure 4.

23 Why? The “why”

24 Why Measure Infrastructure Improvements?
States’ infrastructure improvements support the implementation of selected evidence-based practices. Those evidence-based practices are designed to lead to changes in the SIMR. Infrastructure improvements are therefore woven into the fabric of the SSIP.

25 Why Measure Infrastructure Improvements?
Measuring your infrastructure improvements allows you to: Demonstrate progress Identify needed changes (continuous improvement) Ensure everything is on track to support implementation of EBPs and therefore improvement in the SIMR

26 How? The “how”

27 Measuring Infrastructure Implementation and Impact
How's it going? (Implementation)
Are we successfully accomplishing our planned activities?
Are we moving along appropriately so that we can achieve our goals?
Where are we experiencing challenges, and what changes can we make to address them?
What good did it do? (Outcome/Impact)
Did we accomplish our goals?
Can we show that what we did was responsible for the accomplishments?
Do the accomplishments matter?

28 Measuring Infrastructure Implementation and Impact

29 Logic Model
Inputs, activities, and outputs on the left side of the logic model depict a program's processes and implementation; the changes expected to result from these processes are called outcomes and are depicted on the right side.
Strategies can be viewed as: broad approaches to realizing the theory of action and addressing the goals.
Activities can be viewed as: specific actions that implement strategies; actionable plans based on the program's theory of action; concrete events and products that flesh out the strategies.
Outputs can be viewed as: program accomplishments; direct results of the activities; descriptions and counts of products and events; customer contacts with those products and events.
Short-term outcomes can be viewed as: what customers/clients learn as a result of outputs; the awareness, attitudes, or skills they develop.
Intermediate outcomes can be viewed as: changes in adult actions or behaviors based on knowledge or skills acquired; fidelity of the planned interventions; improved organizational and system functioning.
Long-term outcomes can be viewed as: the broadest program outcomes; the results that fulfill the program's goals; the impact on children or families; program sustainability.

30 Measuring Infrastructure Implementation and Impact
Defining evaluation questions at each level of the system (state, local, practitioner, family); collecting data; analyzing and interpreting data; using data to drive decisions.
Guiding questions that SSIP state teams are currently exploring (drawn from a recent Part B CSLC meeting):
What are some of the activities that have occurred (or that you intend to implement)? At what level are changes occurring: SEA, LEA, or practitioner? How do you know you are accomplishing the intended outcomes, and what evidence exists? This leads to understanding whether an activity accomplishes an intended outcome and how states measure practices and, subsequently, results. Most states have focused on PD and TA.
How do we measure state capacity improvements? How has the state infrastructure been improved to support implementation of the practices? Measures: state system changes (governance, data systems, TA, etc.). Data sources: statewide communication plan, new monitoring systems.
How do we measure LEA capacity improvements? How has the local infrastructure been improved to support implementation of the practices? Measures: local system changes (governance, data systems, TA, etc.). Data sources: local strategic planning that aligns to a state priority or plan; a supervisory system with a checklist or rubric for supporting practitioners' implementation.
How do we measure teacher-level practice improvements? Have practices improved? Are more practitioners implementing the desired practice(s)? Measure: percentage of practitioners implementing the desired practice (see the sketch below). Data sources: self-assessments, observation checklists, implementation rubrics.
Questions for evaluating PD: What PD strategies/activities will you evaluate this year? Why did you choose these activities? Why are these activities critical for making progress toward the SIMR? What are your short-term and long-term outcomes for PD activities? What are the critical evaluation questions to measure these outcomes? What data will be collected, and how will those data be analyzed, to measure implementation progress of PD activities? How will you determine that the data are valid and reliable? How will you measure the impact of your PD? How will you use the data to drive decisions about future PD efforts?
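The teacher-level measure named above ("% of practitioners implementing the desired practice") can be illustrated with a minimal sketch. The checklist structure, field names, and the 80% cut point below are hypothetical and not from the presentation; they simply show how observation-checklist data could be rolled up into a single percentage.

```python
# Minimal sketch with hypothetical data: rolling up observation-checklist results
# into "% of practitioners implementing the desired practice." Field names,
# item counts, and the 80% threshold are assumptions for illustration.

TOTAL_ITEMS = 10                 # items on the (hypothetical) observation checklist
IMPLEMENTATION_THRESHOLD = 0.80  # assumed cut point for counting as "implementing"

observations = [
    {"teacher": "T01", "items_in_place": 9},
    {"teacher": "T02", "items_in_place": 6},
    {"teacher": "T03", "items_in_place": 10},
    {"teacher": "T04", "items_in_place": 4},
]

# Keep only observations at or above the implementation threshold.
implementing = [
    o for o in observations
    if o["items_in_place"] / TOTAL_ITEMS >= IMPLEMENTATION_THRESHOLD
]
pct = 100 * len(implementing) / len(observations)
print(f"{pct:.0f}% of observed practitioners met the implementation threshold")
```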

31 Possible Data Sources Qualitative and Quantitative Data
Surveys, focus groups, interviews
Professional development/training data
Communication plans and improvement action plans
Observation checklists
Implementation fidelity rubrics
The purpose of this slide is to highlight the range of data states could be collecting to measure the impact of infrastructure improvement activities. For state capacity improvements, data sources include the statewide communication plan and new monitoring systems. For LEA capacity improvements, data sources include local strategic planning that aligns to a state priority or plan and supervisory systems with checklists or rubrics for supporting practitioners' implementation. For teacher-level practice improvements, data sources include self-assessments, observation checklists, and implementation rubrics.

32 Evaluating PD Activities

33 Questions to Consider When Evaluating PD Activities
How do you know your PD is being implemented as planned? (Audience, training delivery, content)
How do you know if the PD is achieving intended outcomes? (Knowledge/skills, behavior, implementation of learned practices, fidelity)
Implementation questions: Did you reach the intended audience? Did you deliver the training events as planned? Was the planned content delivered, and was it delivered as planned?
Outcome questions, from the perspective of trained staff: Did participants believe they acquired knowledge or skills? Did participants' behavior change ("I will implement...")? Did participants implement learned practices? Did they implement the practices with fidelity?

34 Questions to Consider in Evaluating PD Implementation
PD Activities: Did the PD occur? (Data: number of trainings held, schedule of PD offerings, number of participants who attended)
Quality: Was the PD delivered as intended and aligned to the designated content? (Data: review of PD materials, direct observation)
Consistency: Was the PD delivered consistently from site to site? (Data: PD evaluation surveys, direct observation)
How do we use data at each level to inform progress?

35 Questions to Consider in Evaluating PD Impact
Participant Reaction: Are the participants satisfied with the PD experience? (Data: survey, PD exit evaluation)
Participant Learning: Did the participants acquire the intended knowledge or skills? (Data: survey, PD exit evaluation, pretest/posttest)
Organization Support: Were resources made available to support participant knowledge development? (Data: course materials)
How do we use data at each level to inform progress?
Adapted from Guskey, T. R. (2000). Evaluating professional development. Thousand Oaks, CA: Corwin.
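As a minimal illustration of the pretest/posttest data source listed under Participant Learning, the sketch below uses invented scores to show how participant knowledge gains might be summarized; any real analysis would also check that the scores are valid and reliable, as the earlier evaluation questions note.

```python
# Minimal sketch with invented scores: summarizing pretest/posttest gains as one
# indicator of participant learning from a PD session.

pre_scores  = [55, 60, 48, 72, 65]   # percent correct before the session
post_scores = [70, 78, 66, 80, 81]   # percent correct after the session

gains = [post - pre for pre, post in zip(pre_scores, post_scores)]
mean_gain = sum(gains) / len(gains)
improved = sum(1 for g in gains if g > 0)

print(f"Mean gain: {mean_gain:.1f} percentage points")
print(f"{improved} of {len(gains)} participants improved")
```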

36 Questions to Consider in Evaluating PD Impact (continued)
Participant Use of Knowledge: Did the participants (effectively) apply the new knowledge? (Data: intervention fidelity rubrics)
Student Learning Outcomes: What was the impact on students? (Data: formative assessments, statewide assessments, screening tools)
How do we use data at each level to inform progress?

37 NCSI Resource: Implementation Evaluation Matrix

38 NCSI Resource: “Wins and Hiccups” Guide
Over the course of SSIP implementation, you will likely experience some successes (wins!) and some setbacks (hiccups). NCSI has developed a new resource, "Wins and Hiccups: A Collaborative Implementation Problem-Solving Guide for SSIP Teams," designed to support states in identifying both wins and hiccups at this stage of SSIP implementation in four key areas: Innovation, Implementers, Organizational Context, and Evaluation.

39 NCSI Resource: “Wins and Hiccups” Guide

40 NCSI Resource: “Wins and Hiccups” Guide

41 NCSI Resource: “Wins and Hiccups” Guide

42 Thank You!

43 Literacy in Ohio: Building a Strong Foundation
Focus: Building a System that Supports All Students in Learning to Read
Professional Development
Coaching
Student Outcomes
Special Education Leadership Summit, 2017

44 Literacy in Ohio: Building a Strong Foundation
Collaborative Effort: State, Regional, District, Building
Next we will discuss Ohio's aim to support school-based implementation of these critical skills, strategies, and instructional moves, including the selection of RELS and the criteria used.
Special Education Leadership Summit, 2017

45 Coaching Fidelity Tool
Literacy in Ohio: Building a Strong Foundation Coaching Fidelity Tool Like any other educational innovation, the coaching of teachers must be used with fidelity in order to achieve its intended outcomes. Although fidelity often is thought of as the adherence to the “key ingredients” of the innovation, it also includes aspects such as quality, responsiveness of the participant (i.e., teacher), and dose. Although we use the word teacher throughout this document, the term is used to denote individuals, such as early child care providers, interventionists, or parents, who work with learners in a less traditional educational setting (i.e., home). We also use the term to describe individuals, such as prekindergarten through grade 12 teachers, who work with learners in a more traditional setting (i.e., classroom). Similarly, we use learner to describe the infants, toddlers, children, and youth with whom these teachers work. Special Education Leadership Summit, 2017

46 Data Dashboard: Coaching Fidelity Tool
Literacy in Ohio: Building a Strong Foundation
Data Dashboard: Coaching Fidelity Tool
Next we will discuss Ohio's aim to support school-based implementation of these critical skills, strategies, and instructional moves.
Special Education Leadership Summit, 2017

47 Coaching Log: Coaching Intensity
Literacy in Ohio: Building a Strong Foundation
Coaching Log: Coaching Intensity
Coaching systems: Currently, within the coaching log, coaches can select "principal group" as a coaching group. This is where RELS have been documenting coaching on the RTFI as well as principal content coaching (i.e., what to look for during walk-throughs). We found this to be a limitation because it was not capturing the details of coaching systems. We are working with the data dashboard developer to add coaching RTFI data to this log.
Coaching content: A tab in the data dashboard has been created to document the intensity of coaching in terms of frequency and duration. In addition, the RELS and district coaches are documenting the coaching alliance-building strategies being used. There is a coaching fidelity tool; a limitation is that it measures alliance-building strategies and not content coaching. A team of RELS will be attending a coaching institute at the University of Kansas (Jim Knight) on instructional coaching. The team will plan and facilitate coaching PD for RELS cohorts 1 and 2 in order to better coach language and literacy content.
Limitation: data are not being entered consistently; we need to develop a data entry schedule with deadlines for monthly data entries.
Special Education Leadership Summit, 2017

48 Literacy in Ohio: Building a Strong Foundation
Coaching Logs
Special Education Leadership Summit, 2017

49 Coaching Logs Continued

50 State Systemic Improvement Plan: Utah’s SSIP
Utah will increase the percentage of students with Specific Learning Disabilities (SLD) or Speech/Language Impairment (SLI) in grades 6-8 who are proficient on the SAGE mathematics assessment by 11.11% over a 5-year period.

51 SSIP Baseline Data Analysis: 2013-14 Grades 6-8 SAGE Mathematics Assessment Percent Proficient
These data "pull out" and compare grades 6-8 Utah student proficiency results to those of students with disabilities (SWD) with SLD or SLI, showing a large 22.22% achievement gap in mathematics.

52 Utah’s State Systemic Improvement Plan (SSIP)

53 State-Identified Measureable Result (SIMR)
We calculated the SIMR targets the way we used to calculate AYP and Flexibility Waiver targets: we cut the gap in half and set goals leading to that level of achievement in 5 years. Each year we must move 2.22% of the SWD into proficiency while maintaining the prior levels of proficiency. The numbers at the bottom of the slide reflect the total number of SWD moved into and maintained in proficiency each year.
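The target-setting arithmetic described above can be shown as a short worked example. The 22.22-point gap, the five-year window, and the "cut the gap in half" rule come from the slides; the baseline proficiency value below is a placeholder, since the actual baseline figure is not given in this transcript.

```python
# Worked example of the target-setting arithmetic described above. The gap and
# the five-year window come from the slides; the baseline value is a placeholder.

gap = 22.22                       # percentage-point gap from the baseline analysis
total_increase = gap / 2          # "cut the gap in half" -> 11.11 points over 5 years
years = 5
annual_increment = total_increase / years   # about 2.22 points per year

baseline = 20.0                   # hypothetical baseline percent proficient (SLD/SLI)
targets = [round(baseline + annual_increment * (y + 1), 2) for y in range(years)]

print(f"Annual increment: {annual_increment:.2f} percentage points")
print("Yearly targets:", targets)
```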

54 Successful Activities to Improve Mathematics Outcomes
The Utah State Board of Education (USBE) is working with outside organizations/agencies to improve expectations/beliefs:
USBE-Public Relations campaign
Legislative roundtable at the State Capitol
Roundtable discussions in each LEA working intensively with USBE on the SSIP
Press releases on SSIP activities (co-teaching project, book studies)
USBE-Utah Parent Center Parent Book Study: Mindset by Carol S. Dweck
300 people registered from across the state (15 districts and 5 charter schools)
Over 100 participants were online during a session

55 Successful Activities to Improve Mathematics Outcomes Continued
USBE is providing supports to LEAs to improve content knowledge, effective instruction, and multi-tiered systems of supports in secondary settings:
Teacher Book Study: Mathematical Mindsets by Jo Boaler
Administrator Book Study: Mathematical Mindsets by Jo Boaler
Teacher Book Study: Principles to Actions by the National Council of Teachers of Mathematics
Administrator Book Study: Principles to Actions by the National Council of Teachers of Mathematics
Pilot projects in 9 "intensive" LEAs (I-9s) and about 10 "targeted" LEAs

56 SSIP Progress: SAGE Mathematics Proficiency for All Students with Disabilities

57 SSIP Progress: SIMR

