1
LCAP Support Meeting January 11, 2017
2
Evaluation Rubrics
3
Objectives
» Be familiar with the layout and content of the evaluation rubrics
» Know how indicators receive a performance category
» Be able to distinguish between status and change on a reference chart
» Be able to distinguish between state and local indicators
4
What Did LCFF Change?
BEFORE:
» Performance represented by a single performance indicator, i.e. API / a number
» Performance measured by achievement OR growth
NOW:
» Performance represented by multiple performance indicators
» Performance measured by both achievement AND growth
5
7 State Performance Indicators
» Chronic Absenteeism (not ready yet)
» Suspension Rate
» English Learner
» Graduation Rates
» College and Career Readiness (not ready yet)
» English Language Arts Assessment (Gr. 3-8)*
» Math Assessment (Gr. 3-8)*
* These indicators are based on the percent of students who met standard or above for the school year.
6
State Performance Categories
Blue (Highest) > Green > Yellow > Orange > Red (Lowest)
7
No Color?
Symbol : What this means
* : The student group consists of between 11 and 29 students.
--- : The student group consists of fewer than 11 students.
N/A : Data is not available.
8
4 Local Performance Indicators
» Basics (Teachers, Instructional Materials, Facilities)
» Implementation of Academic Standards
» Parent Engagement
» Local Climate Survey
9
Local Performance Indicators - no colors
» LEAs will evaluate progress on the LCFF evaluation rubrics' local performance indicators using self-assessments and/or a menu of local measures.
» LEAs will be provided with self-assessment tools and/or a menu of local measures and will report progress through the evaluation rubrics.
» Following the reporting of the self-assessment/local-measure options and progress, LEAs will assess performance using three criteria: MET / NOT MET / NOT MET for two or more years.
Facilitator note: These are the three major points of the evaluation rubrics, with the language that is included in the board action. In each case, we will clarify whether the self-assessment tools will be provided, will be a local decision, or will be a choice between the provided tools and local assessment tools.
10
Status and Change
» Status is the most recent data available.
» Change is the difference between the status year and the prior year, OR between the status year and the average of multiple prior years.
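The status/change computation described above can be sketched in Python. This is an illustrative helper under the assumption that rates arrive as a simple oldest-to-newest list of percentages; the function name and data layout are ours, not the Dashboard's.

```python
# Sketch of the status/change computation: status is the most recent value,
# change is status minus the average of the prior years (3 by default).
def status_and_change(rates, prior_years=3):
    """rates: yearly rates as percentages, ordered oldest -> newest."""
    status = rates[-1]                       # most recent year
    prior = rates[-1 - prior_years:-1]       # the years preceding the status year
    change = status - sum(prior) / len(prior)
    return status, round(change, 1)

# Worked example from the slides: prior rates 84.0, 88.0, 83.0 average to
# 85.0, so a status of 87.0 yields a change of +2.0.
print(status_and_change([84.0, 88.0, 83.0, 87.0]))  # (87.0, 2.0)
```

The same function handles the deck's negative-change example: prior rates averaging 89.0% against a status of 87.0% give a change of -2.0%.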
11
Status and Change Data Layout: All Students
12
4-Year Cohort Graduation Rate
Identify status (the most recent year): 87.0%
Prior three years: 84.0%, 88.0%, 83.0%
Calculate change using the prior 3-year average: (84.0% + 88.0% + 83.0%) / 3 = 85.0%
13
4-Year Cohort Graduation Rate
Determine Status & Change
2014-15 Rate (Status): 87.0%
Prior 3-Year Average: 85.0%
Change: +2.0%
14
Reference Chart: GRADUATION RATE (All LEAs and High Schools)
Columns are CHANGE (prior 3 years): Declined Significantly (by more than 5%) | Declined (by 1% to 5%) | Maintained (declined or improved by less than 1%) | Increased (by 1% to less than 5%) | Increased Significantly (by 5% or more)
STATUS (2014-15):
Very High (95% or greater):    (not shown) | Blue   | Blue   | Blue   | Blue
High (90% to less than 95%):   Orange      | Yellow | Green  | Green  | Blue
Medium (85% to less than 90%): Orange      | Orange | Yellow | Green  | Green
Low (67% to less than 85%):    Red         | Orange | Orange | Yellow | Yellow
Very Low (less than 67%):      Red         | Red    | Red    | Red    | Red
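The chart lookup can be sketched as a small Python table. The grid and band boundaries are hard-coded from the slide; the Very High / Declined Significantly cell is not legible in this copy of the chart, so it is left as `None` here, and the function names are ours.

```python
# Band the status and change values per the chart's labels, then look up
# the color in the hard-coded grid.
def status_band(status):
    if status >= 95.0: return "Very High"
    if status >= 90.0: return "High"
    if status >= 85.0: return "Medium"
    if status >= 67.0: return "Low"
    return "Very Low"

def change_band(change):
    if change >= 5.0:  return "Increased Significantly"  # 5% or more
    if change >= 1.0:  return "Increased"                # 1% to less than 5%
    if change > -1.0:  return "Maintained"               # moved by less than 1%
    if change >= -5.0: return "Declined"                 # declined 1% to 5%
    return "Declined Significantly"                      # declined more than 5%

CHANGE_ORDER = ["Declined Significantly", "Declined", "Maintained",
                "Increased", "Increased Significantly"]

GRID = {  # one row per status band, columns in CHANGE_ORDER
    "Very High": [None, "Blue", "Blue", "Blue", "Blue"],  # first cell not legible
    "High":      ["Orange", "Yellow", "Green", "Green", "Blue"],
    "Medium":    ["Orange", "Orange", "Yellow", "Green", "Green"],
    "Low":       ["Red", "Orange", "Orange", "Yellow", "Yellow"],
    "Very Low":  ["Red", "Red", "Red", "Red", "Red"],
}

def grad_rate_color(status, change):
    return GRID[status_band(status)][CHANGE_ORDER.index(change_band(change))]

# Slide example: status 87.0% (Medium) with change +2.0% (Increased) -> Green
print(grad_rate_color(87.0, 2.0))
```

The same lookup answers the upcoming negative-change question: a Medium status with a -2.0% change lands in the Declined column, which is Orange.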
15
Performance Category: Blue (Highest) > Green > Yellow > Orange > Red (Lowest)
16
What happens if the change over time is negative?
We’re glad you asked. Let’s do an example together. We’ll use the Graduation Rate indicator and a Reference Chart. But, first, we need data.
17
4-Year Cohort Graduation Rate
Identify status (the most recent year): 87.0%
Prior three years: 86.0%, 90.0%, 91.0%
Calculate change using the prior 3-year average: (86.0% + 90.0% + 91.0%) / 3 = 89.0%
18
4-Year Cohort Graduation Rate
Determine Status & Change
2014-15 Rate (Status): 87.0%
Prior 3-Year Average: 89.0%
Change: -2.0%
20
Your Turn! Determine Status & Change
2014-15 Rate (Status): 87.0%
Prior 3-Year Average: 80.0%
Change: ?
21
Reference Chart: GRADUATION RATE (All LEAs and High Schools), repeated for the exercise
Columns are CHANGE (prior 3 years): Declined Significantly (by more than 5%) | Declined (by 1% to 5%) | Maintained (declined or improved by less than 1%) | Increased (by 1% to less than 5%) | Increased Significantly (by 5% or more)
Rows are STATUS (2014-15): Very High (95% or greater) | High (90% to less than 95%) | Medium (85% to less than 90%) | Low (67% to less than 85%) | Very Low (less than 67%)
22
Single Indicator Data Layout: Graduation Rate
23
Single Student Group Data Layout: English Learners
24
Equity: How does the new system take equity into account?
25
Equity Data Layout: All Students
26
Equity Data Layout: All Students (State Indicators Only)
27
ANALYSIS AT THE LEA LEVEL
28
Objectives
» Understand the purpose of statements of model practice and external links
» Be able to identify patterns among indicators and student groups
» Begin the process of examining potential causes of performance for an indicator
29
What Did LCFF Change?
BEFORE:
» Performance represented by a single performance indicator, i.e. API / a number
» Performance measured by achievement OR growth
» Performance measured by student test scores
NOW:
» Performance represented by multiple performance indicators
» Performance measured by both achievement AND growth
» Multiple measures that go beyond student test scores
30
All Student Groups/All State Indicators Data Layout: Golden State Unified School District
31
Check for Understanding
How many student groups have a performance category (aka color) two or more levels below All Students?
32
Grad Rate Performance Category
Blue (Highest) > Green > Yellow > Orange > Red (Lowest)
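The check above amounts to ranking the five colors ordinally and comparing each group to All Students. A minimal Python sketch, with made-up group names and colors (not the district's real data):

```python
# Rank the five Dashboard colors ordinally, then list student groups whose
# color sits two or more levels below the All Students color.
COLOR_RANK = {"Blue": 5, "Green": 4, "Yellow": 3, "Orange": 2, "Red": 1}

def groups_two_or_more_below(colors_by_group):
    """colors_by_group maps student group -> color; must include 'All Students'."""
    all_rank = COLOR_RANK[colors_by_group["All Students"]]
    return [group for group, color in colors_by_group.items()
            if group != "All Students" and all_rank - COLOR_RANK[color] >= 2]

grad_rate = {  # illustrative example values only
    "All Students": "Green",
    "English Learners": "Orange",
    "Students with Disabilities": "Red",
    "Foster Youth": "Yellow",
}
print(groups_two_or_more_below(grad_rate))
# ['English Learners', 'Students with Disabilities']
```

Here Foster Youth (Yellow) is only one level below Green, so it is not flagged, while Orange and Red are two and three levels below respectively.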
33
All Student Groups/All State Indicators Data Layout: Golden State Unified School District
34
Analysis Finding Differences in Groups
How can examining the data in this way better inform an LEA and its community?
35
Analysis: across performance indicators and across student groups
One method: Count by Blue/Green and by Orange/Red
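The counting method named above can be sketched in a few lines of Python. The group colors below are invented for illustration:

```python
# Tally Blue/Green versus Orange/Red across one indicator's student groups
# (or, equally, across one group's indicators).
from collections import Counter

def high_low_counts(colors):
    tally = Counter(colors)
    return {"Blue/Green": tally["Blue"] + tally["Green"],
            "Orange/Red": tally["Orange"] + tally["Red"]}

ela_by_group = ["Green", "Green", "Yellow", "Orange", "Red", "Blue"]
print(high_low_counts(ela_by_group))  # {'Blue/Green': 3, 'Orange/Red': 2}
```

Yellow is deliberately counted in neither bucket, so the two totals highlight where performance clusters at the top or bottom of the scale.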
36
Analysis Where do you see differences in performance among student groups in the Graduation Rate indicator?
37
Independence School District
Data Subsets
38
What is the Nexus between the LCFF Evaluation Rubric and the LCAP Template Summary Section?
Give table groups a few minutes to review Independence data before proceeding to reveal possible connections. Table discussion in small groups of 3-4.
39
Ask participants the following: "At a first glance of the LEA data, what could be considered an area of success?" Pause and give participants a moment to share their response with a table partner.
40
Discuss that mathematics assessment performance might be viewed as a strength. Although African American students lag in performance, no student groups are performing in the orange or red categories.
41
Ask participants the following: "At a first glance of the LEA data, what could be considered an area of greatest need?" Pause and give participants a moment to share their response with a table partner.
43
Ask participants the following: "At a first glance of the LEA data, what could be considered a performance gap?" Pause and give participants a moment to share their response with a table partner.
44
A discrepancy of two performance levels between All Students and African American students on the Suspension Rate indicator is required for a Performance Gap; it may also inform a Greatest Need (it concerns a student group, not overall performance). Participants will have an opportunity to further discuss this topic in the next data scenario (Bright Creek).
45
Refer participants to take a deeper look at site-level reports. Note relatively similar demographics.
46
Reveal: Note the discrepancy between suspension rates at the two sites, especially for African American students. Also note the performance differences on the English Learner indicator between the two sites. Note: School-to-school comparisons inform LCAP template responses but are not required to be reported.
47
Making Meaning of the Data
» What is causing the progress?
» What is causing the areas of greatest need?
» What is causing the performance gap(s)?
*Identification of areas of greatest progress, areas of greatest need, and performance gaps.
Data in and of itself is neutral; it takes on significance only through our making meaning of it. As we focus on continuous improvement, the evaluation rubric and the state/local indicator results represent the first steps in a process of analysis and reflection. It is important to remember that the data points are only a small piece of a larger context. Educating the diverse students we have in our classrooms, within organizational structures that are just as diverse, is a very complex responsibility. As an example, we'd like to go back to the Independence Unified School District; you have identified some preliminary areas of progress, areas of need, and performance gaps that warrant a deeper look. We'd like to share additional information with you to consider (refer to handout). Please take a few minutes to read the story behind the data in Independence Unified School District.
48
Independence Unified School District Beyond the Data...
Review the additional context about this district and its school sites:
» Areas of Strength
» Areas of Need
» Discrepancies between student groups or sites
As we all know, numbers are not the whole story. Please take a minute to review the handout and read more about the story of the Independence School District and the Lincoln and Washington schools. As you read, note the strengths, needs, and discrepancies you see between sites.
49
How does this information change how you look at the data?
» What areas of strength and need exist?
» What are the discrepancies between sites and/or student groups?
» What questions might you ask now?
Use your notes to discuss the questions above as table groups. Given the areas of strength and need and the discrepancies that exist, what are some of the questions you might begin asking? Highlight changes in status for each school, staff strengths and concerns, and the analysis of effectiveness that needs to occur. The hope is that participants identify that the suspension of African American and EL students varies between the two sites; if not, steer the discussion in this direction. Share out.
50
Areas of Strength/Need/Discrepancies Possible Questions
» Suspension Rate for African Americans and English Learners: What are the specific reasons that African American and EL students are being suspended, and how do those reasons compare with All Students? How does staff perceive this issue?
» English Learner Progress: What is causing the lack of progress toward English proficiency? Does staff have the necessary capacity?
» School site discrepancies (Suspension Rate and EL indicator): Why is Washington making progress on these indicators (with an increase in low-income students) while Lincoln is not? What differences exist in practices?
» Mathematics Assessment: What is causing the rise in mathematics achievement? Why is the African American student group not achieving at the same level as All Students?
Read the areas identified, then give participants time to read the questions and consider other questions they came up with: suspension rate (discrepancy), EL progress (need), school site discrepancies on the Suspension Rate and EL indicators, and progress in mathematics.
51
Elements to Consider Why is the System Producing its Current Results?
Culture | Structures and Systems | Resources | Stakeholders | Environment
When looking at root causes, there are a number of elements to consider. A systems perspective is needed: we must consider the system as a whole and ask why it is producing the results it is, which helps identify where the root causes may lie. When choosing a strategy to respond to concerns, make sure you are addressing the causal factors. Elements to consider and address in the action plan:
» Culture: beliefs and norms of the organization; "how we do things around here."
» Structures and Systems: hierarchy, processes, and protocols; how work gets done (formal and informal).
» Resources: people, finances, and technology; how resources are allocated and capacity is built.
» Stakeholders: internal and external (staff, board, unions, parents, students, community); how stakeholders are involved, and what strengths and needs exist among them.
» Environment: political and policy context (regulations, contracts, funding, politics); how to best function within environmental factors.
It is critical to spend enough time identifying the problem and engaging your stakeholders in discussions around root causes and possible solutions. Consider your systems and how they might be contributing to the results (positive or negative). Be strategic with the actions included in the plan: what research or evidence supports those actions? In continuous improvement, we can't wait until the end of the year to consider progress; we should consider how objectives will be measured not only annually but throughout the year so we can adjust as necessary (leading indicators vs. lagging). Ultimately, the LCAP should reflect what is being learned and tell a story of improvement.
52
Why is the System Producing its Current Results?
» Culture: staff resistant to change; EL instruction not a priority; differences in attendance rates
» Structures and Systems: lack of instructional leadership from the District Office; site autonomy
» Resources: staff experience; investment in PD for math; TOSA to support EL instruction
» Stakeholders: parent attendance at meetings; advocacy groups
» Environment: economic struggles of families
As you can see, each of these elements helps to frame possible causal factors that need to be addressed to improve the district's results and create a systems perspective focused on continuous improvement. The next step in the process is to begin thinking about how to respond to the identified issues and what resources exist to help with that.
53
SBE – January 2017 EL Student Group in the Academic Indicator
SBE Item 02, Attachment 1, Addendum