1
Network for Evaluation Assessment Accountability and Research
N.E.A2.R.
2
Welcome to N.E.A2.R. Vision: ESC – Region 19 provides key guidance and support in all things related to Evaluation, Assessment, Accountability, and Research.
3
N.E.A2.R. We will meet twice a semester and once in the summer.
September November February April July
4
N.E.A2.R. Agenda
Domain 2 and 3 Updates
ATAC feedback and input
MOUs and Data Sharing
Assessment Updates
Questions and Needs
5
The School Progress Domain
HB 22 A-F Accountability. Good afternoon, everyone. Thank you for joining us for the second in our series of three ToT TETNs titled "The Implementation of House Bill 22: Collaborating to Build a Better Accountability System." The purpose of this series of TETNs is to provide greater detail on the development of the new accountability system and get your feedback, then have you use this same presentation to provide the same training to your districts so we can get their feedback as well. In this episode, we will explore the School Progress domain. We'll go into detail on the Closing the Gaps domain at the TETN in October.
6
School Progress: Growth
Student Achievement Closing The Gaps School Progress The School Progress domain is the second of the three domains established by House Bill 22
7
School Progress: Two Aspects to Progress
Part A: Student Growth Part B: Relative Performance When we think about School Progress, we think about the impact on students: how much are they growing? House Bill 22 establishes two ways to evaluate this. Student Growth: how much each individual student has grown academically over last year (longitudinal student growth). While this is really good information to have, this metric has some limitations: because there are no STAAR tests until third grade, we can't determine level of growth using this metric until fourth grade, and high school has only a few STAAR tests, so it's not a very effective measure there either. Relative Performance is a different way of looking at growth by comparing the performance of similar campuses against each other.
8
School Progress: Two Aspects to Progress
Part A: Student Growth Part B: Relative Performance We’ll start our look at the School Progress domain by exploring Part A: Student Growth
9
STAAR: Test Inclusion Methodology
Includes all tests (STAAR with and without accommodations and STAAR Alternate 2) combined
Combines reading and mathematics
Uses STAAR Progress Measure
Includes ELs (except in their first year in US schools)
Uses same STAAR Progress Measure for ELs and non-ELs
First, what are we including in Part A: what tests do we have to make accurate conclusions about student growth? Part A includes all types of STAAR tests: those without accommodations, those with accommodations, and STAAR Alternate 2. The subjects included are reading and mathematics, and we combine the tests; we don't report them separately. Part A takes into account the STAAR Progress Measure (the same one that we've been using for several years); it's the same progress measure for ELs and non-ELs, and we no longer use the ELL progress measure. This is new this year. Because the first STAAR tests are given in third grade, we can't assess growth using the STAAR Progress Measure until fourth grade. In high school, there are limitations to measuring growth with STAAR: it can only possibly be done for 9th graders who take Algebra I, and then only for 9th and 10th graders taking English I or English II. Because high schools have very few STAAR tests, growth can't be measured reliably there. At this point, only Part B: Relative Performance will be analyzed in high school.
10
Student Growth: Measuring Advancement
1 point awarded for meeting or exceeding expected growth
0.5 points awarded for maintaining proficiency but failing to meet expected growth
0 points awarded for falling to a lower STAAR performance level
Here are a few examples of how Part A works. The design of the new growth model is to reward students who achieve expected growth (1 point) and also reward students who stay at the same performance level year over year even if they don't meet growth expectations. A student who was at Approaches Grade Level last year and achieves Meets Grade Level this year earns one point. A student who was at Approaches Grade Level last year, stays at Approaches Grade Level this year, but meets the STAAR Progress Measure expectation also earns one point. A student who was at Approaches Grade Level last year, stays at Approaches Grade Level this year, but does not meet the STAAR Progress Measure expectation earns half a point for maintaining proficiency. And a student who moves to a lower performance level between years earns no points. We no longer award two points for exceeding STAAR Progress Measure expectations. This is very different from what we've done in recent years.
11
Student Growth: Percentage of Students Gaining
Previous Year × Current Year matrix: Does Not Meet Grade Level, Approaches, Meets, Masters. Each test can earn one point, half a point, or no points depending on proficiency level and whether the STAAR Progress Measure was met. This matrix shows the 20 different possible outcomes for each test.
12
Student Growth: Percentage of Students Gaining
No Points
Does Not Meet to Does Not Meet (without meeting growth expectations)
Approaches to Does Not Meet (without meeting growth expectations)
Meets to Does Not Meet
Meets to Approaches
Masters to Does Not Meet
Masters to Approaches
Masters to Meets
There are seven different ways to not earn any points. No points are awarded for not growing and not maintaining proficiency.
13
Student Growth: Percentage of Students Gaining
Half Point
Does Not Meet to Approaches (without meeting growth expectations)
Approaches to Approaches (without meeting growth expectations)
There are two different ways to earn half a point. Half a point is awarded for maintaining proficiency even if the STAAR Progress Measure expectation is not met.
14
Student Growth: Percentage of Students Gaining
One Point
Does Not Meet to Approaches (meeting/exceeding growth expectations)
Approaches to Approaches (meeting/exceeding growth expectations)
Does Not Meet to Meets
Does Not Meet to Masters
Approaches to Meets
Approaches to Masters
Meets to Meets
Meets to Masters
Masters to Masters
Does Not Meet to Does Not Meet (meeting/exceeding growth expectations)
Approaches to Does Not Meet (meeting/exceeding growth expectations)
There are eleven different ways to earn one point. One point is awarded for meeting or exceeding STAAR Progress Measure expectations or for achieving a higher proficiency level than last year.
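Taken together, these three slides enumerate all 20 cells of the matrix. As a quick sketch (our own restatement in Python, not TEA's implementation), the point rules can be written as:

```python
# Part A point rules as enumerated on the slides above (a sketch, not TEA code).
LEVELS = ["Does Not Meet", "Approaches", "Meets", "Masters"]
MEETS = LEVELS.index("Meets")

def growth_points(prior: str, current: str, met_progress_measure: bool) -> float:
    """Points one test earns under the Part A growth model."""
    p, c = LEVELS.index(prior), LEVELS.index(current)
    if c >= MEETS and c >= p:
        return 1.0   # reached or held Meets/Masters
    if p >= MEETS:
        return 0.0   # fell from Meets/Masters to a lower level
    if met_progress_measure:
        return 1.0   # met/exceeded STAAR Progress Measure expectations
    if current == "Approaches":
        return 0.5   # maintained proficiency without expected growth
    return 0.0       # neither grew nor maintained proficiency
```

For example, Approaches to Meets earns one point regardless of the progress measure, while Does Not Meet to Approaches without expected growth earns half a point.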
15
Student Growth: Sample Calculation
One hundred students, each with reading and mathematics results for last year and this year. Denominator = 200 STAAR Progress Measures.
This is an example of how we calculate the score for Part A. Let's imagine the following scenario: a campus with one hundred students, each of whom took both reading and mathematics STAAR tests last year and this year. That would give us 200 STAAR Progress Measures: that's the denominator.
16
Student Growth: Sample Calculation
No Points
Does Not Meet to Does Not Meet (without meeting growth expectations)
Approaches to Does Not Meet (without meeting growth expectations)
Masters to Meets
Count of Tests: 20 + 15 + 14 = 49
Twenty tests remain at the Does Not Meet proficiency level and don't meet the STAAR Progress Measure expectations. Fifteen tests went from Approaches last year to Does Not Meet this year. And 14 tests went from Masters Grade Level last year to Meets Grade Level this year. That's 49 tests that don't earn any points.
17
Student Growth: Sample Calculation
Half Point
Does Not Meet to Approaches (without meeting growth expectations)
Approaches to Approaches (without meeting growth expectations)
Count of Tests: 7 + 10 = 17
Seven tests went from Does Not Meet last year to Approaches this year without meeting the STAAR Progress Measure expectations. And 10 tests stayed at Approaches Grade Level and did not meet STAAR Progress Measure expectations. That's 17 tests that earned half a point.
18
Student Growth: Sample Calculation
One Point
Does Not Meet to Does Not Meet (meeting/exceeding growth expectations)
Approaches to Does Not Meet (meeting/exceeding growth expectations)*
Approaches to Approaches (meeting/exceeding growth expectations)
*Very rare but statistically possible
Count of Tests: 23 + 7 + 22 = 52
Twenty-three tests stayed at the Does Not Meet proficiency level but met the STAAR Progress Measure expectations. Seven tests fell from Approaches Grade Level last year to Does Not Meet this year but met the STAAR Progress Measure expectations. (This is rare but possible when a student skips a grade.) Twenty-two tests stayed at Approaches Grade Level but met the STAAR Progress Measure expectations.
19
Student Growth: Sample Calculation
One Point
Meets to Meets
Meets to Masters
Masters to Masters
Count of Tests: 33 + 32 + 17 = 82
Thirty-three tests stayed at Meets Grade Level. Thirty-two tests went from Meets Grade Level last year to Masters Grade Level this year. And 17 tests stayed at Masters Grade Level.
20
Student Growth: Sample Calculation
49 results that earned no points, 17 results that earned half a point, and 134 results that earned one point:
(49 × 0) + (17 × 0.5) + (52 × 1) + (82 × 1) = 142.5; 142.5 ÷ 200 ≈ 71%
That's a total of 134 tests that earn one point. Here's how we add it up: 49 tests earn no points, plus 17 tests that earned half a point, plus 134 tests that earned one point, equals 142.5 points. Divide that by the total number of STAAR Progress Measures (the total number of possible points) to get the score for Part A: 71. In this case, we loosely conclude that 71% of students have gained a year academically. Technically, however, this is the percentage of tests taken, with some adjustment for maintaining proficiency.
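The arithmetic above can be double-checked in a few lines (values taken from the sample scenario; rounding the scaled score to a whole number is our assumption):

```python
# Verifying the sample Part A calculation: 200 progress measures;
# 49 tests earn no points, 17 earn half a point, 134 (52 + 82) earn one point.
no_pts, half_pts, one_pt = 49, 17, 52 + 82
denominator = 200

weighted = no_pts * 0 + half_pts * 0.5 + one_pt * 1   # weighted points earned
part_a = round(weighted / denominator * 100)          # scale to 0-100 and round

print(weighted)   # 142.5
print(part_a)     # 71
```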
21
School Progress Domain: Feedback Opportunities
New approach to growth
Additional ways to measure growth in high school
Percentage of students who need to grow to constitute Excellent performance
Minimally acceptable performance
Part A Scores: Frequency by Campus Type (based on modeling data from 2017 accountability): Elementary (4,219), Middle School (1,653), K–12 (334), District (1,203). [Quantile table: Part A scores range from a maximum of 100 down to minimums in the 20s–40s; at the 90th percentile the scores are 82 for elementary, 78 for middle school, 80 for K–12, and 77 for districts.]
So here is the specific feedback that we are looking for. The new approach to growth: is it the right direction? Are tweaks needed? What other ways exist to measure growth in high school? And lastly, what percentage of students needs to grow in order to show excellent performance? What about minimally acceptable performance? In the example that we used, 71 percent of students grew. The table on this slide shows how campuses and districts would have performed in 2017 had Part A been used in accountability. The highlighted row shows that at the 90th percentile, 82 percent of students in elementary schools showed growth, 78 percent in middle schools, 80 percent for K–12 campuses, and 77 percent for districts.
22
Common Questions: School Progress Domain, Part A
Q: Is there no additional credit for meeting or exceeding growth at the Meets and Masters levels? A: Students at Meets or Masters are given the same one point as students who show growth at Does Not Meet and Approaches. Q: Slide 14 shows an example of a student who falls from Approaches Grade Level one year to Does Not Meet the next year and still meets STAAR Progress Measure expectations. Can this really happen? A: It's very rare, but, statistically, it's possible when a student skips a grade. Our modeling with 2017 data produced ten such instances in the entire state. Q: Why are high schools only scored on relative performance? Is there no growth measure for high school? A: The relatively few STAAR Progress Measures for high school make them an unreliable measure of a high school's progress with students. But the STAAR Progress Measure scores will be available on TAPR. Here are a few common questions that we get about Part A of this domain.
23
School Progress: Two Aspects to Progress
Part A: Student Growth Part B: Relative Performance Part B of this domain looks at relative performance.
24
Relative Performance: Measuring School Progress
Vertical axis: Student Achievement domain score for all students (includes STAAR, CCMR, and graduation rates for districts and campuses that have that data). Horizontal axis: percentage of economically disadvantaged students.
The second part of the School Progress domain is relative performance: comparing the Student Achievement domain score against the percentage of students who are economically disadvantaged, then comparing that performance against like campus types. The left axis is the score in the Student Achievement domain. For elementary schools and middle schools, STAAR is the only component; for high schools and districts, it's STAAR, CCMR, and graduation rates. Along the bottom is the percentage of students who are considered economically disadvantaged. Each dot in this example represents an elementary campus; where each one falls is determined by its score in the Student Achievement domain and its percentage of students who are considered economically disadvantaged.
25
Relative Performance: Measuring School Progress (Average Line)
So how can we compare the performance of the green-dot campus and the red-dot campus? It looks like the green-dot campus is outperforming the red-dot campus in terms of proficiency, but is that really true in terms of the impact it has on students? If we draw a line of best fit, it turns out that each campus is precisely average for all elementary campuses in Texas.
26
Relative Performance: Measuring School Progress (Average Line)
A campus with fewer economically disadvantaged students on average has higher levels of student achievement; a campus with more economically disadvantaged students tends to have lower levels of student achievement.
We know that, generally, a campus with fewer economically disadvantaged students has higher levels of student achievement. And, generally, a campus with more economically disadvantaged students tends to have lower levels of student achievement. But poverty is not destiny.
27
Relative Performance: Measuring School Progress
Look at the orange dot. This campus clearly is outstanding in School Progress. The last thing we want to do is set up an accountability system that has low expectations for economically disadvantaged students. With the approach of Part B, we are able to identify the campuses that are really excelling at progress so we can learn from them; the schools at the top right of this plane are "A" schools. The green and red dots are not "A" schools in terms of School Progress, even though the green dot has a high level of student achievement.
28
Relative Performance: Measuring School Progress (Grade Bands A–F)
From that average line, we draw bands that determine grades. We're still working to determine where the cut points should be. The slope of the lines will all be the same; grades will be determined by how far above or below the average line a district or campus falls. This is an area where we would like feedback. When we run this for accountability, we're planning several different scatterplots: one for elementary schools, one for middle schools, one for high schools/K–12, one for AEAs, and one for districts. We've looked at whether to use other characteristics in addition to the percentage of economically disadvantaged students (English learners, special education, etc.), but research has shown that the one characteristic that matters more than any other is the percentage of economically disadvantaged students. We're going to fix a distribution for the 2017–18 school year using data from 2016–17 accountability, then we'll hold those cut points for, hopefully, five years.
29
Common Questions: School Progress Domain
Q: Does the Student Achievement domain score (y-axis in relative performance) include CCMR and graduation rates? A: Yes, for schools that have that data. Q: House Bill 22 specifically says that the method used to evaluate performance should provide for the mathematical possibility that all districts and campuses receive an A, but this looks like a forced distribution that guarantees a set percentage of schools will get Ds and Fs. A: Once the cut points are set using 2016–17 accountability data, they will stay fixed for five years. That way, any district or campus will be able to earn an A. Here are a few common questions that we get about Part B. The second one is especially important; we're hearing that there is a lot of misunderstanding about this in the field.
30
Relative Performance: Measuring School Progress
Scatter plot of each district and campus (by campus type) comparing the Student Achievement domain score against the percentage of students who are economically disadvantaged, with a trendline showing the average relationship
Cut points for each grade based on bands below and above the average line
Separate cut points for Elementary Schools, Middle Schools, High Schools/K–12, AEAs, and Districts
Cut points based on slope-intercept form, set using 2016–17 performance, intended to stay fixed for five years, and known before ratings release
Here's a little more detailed explanation of how Part B will work. We build a scatter plot of each district and campus that compares the score in the Student Achievement domain against the percentage of students who are considered economically disadvantaged, draw a line of best fit, and set sliding cut scores based on distance above or below the line, separately for elementary schools, middle schools, high schools/K–12, AEAs, and districts. The cut points will be set using 2016–17 accountability data and are intended to stay fixed for five years. The cut points will be known before ratings release.
31
Relative Performance: Sample Calculation
𝑦 = 𝑚𝑥 + 𝑏
𝑦 is the predicted Student Achievement domain score. 𝑥 is the percentage of students who are economically disadvantaged. 𝑚 is the slope of the trendline. 𝑏 is the distance from the trendline (what decides the grade); it is based on average variance from the trendline.
Sample Middle School: 94.4% economically disadvantaged (𝑥); slope 𝑚 = –0.15666; predicted Student Achievement domain score (𝑦) = 31; actual Student Achievement domain score: 25; score in relative performance: D.
The cut points will be based on the slope-intercept form, where 𝑦 is the predicted Student Achievement domain score, 𝑥 is the percentage of students who are economically disadvantaged, 𝑚 is the slope of the trendline, and 𝑏 is the distance from the trendline (what decides the grade), based on average variance from the trendline. We will establish letter grade lookup tables using five different 𝑦 = 𝑚𝑥 + 𝑏 calculations: elementary, middle, high school/K–12, AEA, and districts. Here's a sample calculation: using it, we compare the difference between a campus's predicted Student Achievement domain score and its actual Student Achievement domain score to determine the grade. We're not intending for districts and campuses to use this formula to try to determine their score for Part B. We're going to produce lookup tables based on the percentage of students considered economically disadvantaged.
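As an illustration, here is the slide's example worked numerically. The slope and the predicted score of 31 come from the slide; the intercept value below is hypothetical, back-computed so the numbers agree:

```python
# Hypothetical reconstruction of the slide's y = mx + b example.
m = -0.15666          # slope of the trendline (from the slide)
x = 94.4              # percentage economically disadvantaged (from the slide)
b = 45.79             # intercept: assumed, back-computed to match y = 31

y_predicted = m * x + b          # predicted Student Achievement domain score
actual = 25                      # actual Student Achievement domain score (slide)

gap = actual - y_predicted       # distance below the trendline drives the grade
print(round(y_predicted))        # 31
print(round(gap))                # -6
```

The campus scores about six points below what similar campuses achieve, which on this slide corresponds to a D in relative performance.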
32
School Progress Domain: Feedback Opportunities
New approach to growth
Additional ways to measure growth in high school
Percentage of students who need to grow to constitute Excellent performance; minimally acceptable performance
Combining the two parts: best of, weighted average, or average
For Part B, what are the right cut points for Excellent performance and Unacceptable performance?
This slide shows all of the feedback that we're asking for related to this domain. The first four bullets are for Part A and are the same as on the earlier feedback slide. The specific feedback that we're looking for related to Part B is the cut points: what is the right cut point for excellent performance? For unacceptable performance?
33
Questions and Feedback
Survey: https://www.surveymonkey.com/r/5RBLDFM
Email: feedbackAF@tea.texas.gov
Resources: (512)
34
The Closing the Gaps Domain
HB 22 A-F Accountability
35
Closing the Gaps: Ensuring Educational Equity
Student Achievement School Progress Closing The Gaps
36
Closing the Gaps: Ensuring Educational Equity
All Students
Continuously Enrolled and Mobile
English Learners (ELs)
Economically Disadvantaged
Race/Ethnicity
Special Education
37
Closing the Gaps: Ensuring Educational Equity
Student Groups: All Students; African American; Hispanic; White; American Indian; Asian; Pacific Islander; Two or More Races; Economically Disadvantaged; Current and Former Special Education; Current and Monitored English Learners; Continuously Enrolled/Non-Continuously Enrolled
Indicators: Academic Achievement in Reading, Mathematics, Writing, Science, and Social Studies; Growth in Reading and Mathematics (Elementary and Middle Schools); Graduation Rates; English Learner Language Proficiency Status; College, Career, and Military Readiness; Performance At or Above Meets Grade Level in Reading and Mathematics
38
Closing the Gaps: Student Groups
Current and Former Special Education
Defined by HB 22
Formerly receiving special education services: the student was reported in PEIMS the preceding year as enrolled at the campus and participating in a special education program, and the student is reported (PEIMS and STAAR answer documents) as enrolled at the campus in the current year and not participating in a special education program.
Current modeling shows that this affects approximately 110 districts and six campuses when the minimum-size criterion of 25 is applied.
Feedback Opportunity: For how many years in the past should we check for participation in special education?
39
Closing the Gaps: Student Groups
Continuously Enrolled and Non-Continuously Enrolled Not defined by HB 22 Districts Grades 4–12: Enrolled at a district in the fall snapshot in the current school year and each of the three previous years Grade 3: Enrolled at a district in the fall snapshot in the current school year and each of the previous two years Campuses Grades 4–12: Enrolled at a campus in the fall snapshot in the current school year and in the same district in each of the three previous years Grade 3: Enrolled at a campus in the fall snapshot in the current school year and in the same district each of the previous two years Feedback Opportunity Should we use an alternate definition? If so, what?
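The definition above can be sketched as a small check (the function and its inputs are hypothetical; the real determination uses PEIMS fall-snapshot records):

```python
# Hypothetical check of the continuous-enrollment definition above.
# `fall_snapshots` maps year -> True if the student appears in that year's
# fall snapshot (current year at the campus/district, prior years in the
# same district).
def continuously_enrolled(grade, current_year, fall_snapshots):
    """Grades 4-12 look back three years; grade 3 looks back two years."""
    prior_years_required = 2 if grade == 3 else 3
    years = [current_year - i for i in range(prior_years_required + 1)]
    return all(fall_snapshots.get(y, False) for y in years)
```

A 10th grader present in the 2014–2017 snapshots is continuously enrolled; a student missing any one of those years is not.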
40
Closing the Gaps: Continuously Enrolled in District
Example: enrolled in the district for 7th grade (2014), 8th grade (2015), 9th grade (2016), and 10th grade (2017): Continuously Enrolled
41
Closing the Gaps: Continuously Enrolled in District
Example: a four-year window (2014–2017) with enrollment shown for only three years (8th, 9th, and 10th grades): Non-Continuously Enrolled
42
Closing the Gaps: Continuously Enrolled in District
Example: enrolled for 1st grade (2015), 2nd grade (2016), and 3rd grade (2017): Continuously Enrolled
43
Closing the Gaps: Continuously Enrolled in District
Example: a three-year window (2015–2017) with enrollment shown only for 1st grade and 3rd grade: Non-Continuously Enrolled
48
Closing the Gaps: Student Groups
Current and Monitored ELs
Allowed by ESSA
Current ELs and ELs through their fourth year of monitoring
Feedback Opportunities: Should we monitor for four years? Only two? Should we report current and monitored ELs separately?
49
Closing the Gaps: Indicators
Academic Achievement STAAR performance (percentage at or above Approaches Grade Level) Targets by subject area English Language Arts/Reading Mathematics Writing Science Social Studies Targets stable for five years Safe Harbor/Required Improvement applied
50
Closing the Gaps: Indicators
Growth Elementary and Middle Schools English Language Arts/Reading (School Progress domain) Mathematics (School Progress domain) Graduation Rates High Schools, K–12, Districts Federal graduation rates (without exclusions) Targets Stable for five years Safe Harbor/Required Improvement applied
51
Closing the Gaps: Indicators
English Language Proficiency Status
TELPAS Progress Rate
Current ELs
Feedback Opportunity: Should we wait on TELPAS, given changes in the test this year? This would involve different standards within a five-year window.
52
Closing the Gaps: Indicators
School Quality or Student Success High Schools, K–12, and Districts College, Career, and Military Readiness (Student Achievement domain) Targets stable for five years Safe Harbor/Required Improvement applied Elementary and Middle Schools STAAR Grade 3–8 Performance Reading (percentage at or above Meets Grade Level) Mathematics (percentage at or above Meets Grade Level)
53
Closing the Gaps: Ensuring Educational Equity (Overall Grade)
The overall grade is based on the percentage of student groups that meet the achievement target.
Feedback Opportunity: The percentage of student groups doesn't account for the degree of challenge in any group. Should we attempt a more complicated formula? And should we weight a given cell type more than others?
54
Closing the Gaps: Aligning Accountability Systems
55
Closing the Gaps: Sample Status Report
56
Closing the Gaps: Sample Status Report
Student groups (columns): All Students, African American, Hispanic, White
Academic Achievement: STAAR Performance Status (percentage at or above Approaches Grade Level), Target 80.0%
Indicators: Reading (target met: Y), Mathematics, Writing, Science, Social Studies
57
Closing the Gaps: Alignment with ESSA
Student groups (columns): All Students, African American, Hispanic, White
Growth (Elementary & Middle Schools) / Graduation Rates (High Schools & K–12)
STAAR Growth Status (Elementary and Middle Schools), Target 70.0%: Reading (target met: Y), Mathematics
Federal Graduation Status (Target: see reason codes) (High Schools and K–12): Graduation Target Met, Reason Code a
+ Graduation uses ELL (Ever HS) rate
Federal Graduation Rate Reason Codes: a = graduation rate goal of 90%; b = four-year graduation rate target of ##%; c = safe harbor target of a 10% difference from the prior year rate and the goal; d = five-year graduation rate target of ##%
58
Closing the Gaps: Alignment with ESSA
English Learner Language Proficiency (ELP), Current ELs
TELPAS Progress Rate, Target 42.0%: TELPAS Progress Rate (target met: Y)
59
Closing the Gaps: Progress of ELs
EL Progress reflects an English learner's progress toward achieving English language proficiency. The data source is TELPAS results, and the accountability subset rule is applied. A student is considered to have made EL Progress if he or she advances by at least one level on the composite rating from the prior year to the current year, or if his or her result is Advanced High. If the prior-year composite rating is not available, the second or third prior year is used. The minimum size is 25. Small-numbers analysis is applied if there are fewer than 25 current EL students.
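The rule above can be sketched as follows (a hypothetical helper, not TEA code; the rating names are the TELPAS composite levels, Beginning through Advanced High):

```python
# A sketch of the EL Progress rule: progress is made if the TELPAS composite
# rating advances at least one level from the prior year, or the current-year
# rating is Advanced High.
RATINGS = ["Beginning", "Intermediate", "Advanced", "Advanced High"]

def made_el_progress(prior: str, current: str) -> bool:
    if current == "Advanced High":
        return True                    # already at the top rating
    return RATINGS.index(current) > RATINGS.index(prior)
```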
60
School Quality or Student Success
Closing the Gaps: Alignment with ESSA All African Students American Hispanic White School Quality or Student Success College, Career, and Military Readiness Performance Status (High Schools and K–12) Target 40.0% College, Career, and Military Readiness Y STAAR Grade 3–8 Reading and Mathematics Performance (at or above Meets Grade Level Standard) (Elementary and Middle Schools) 45.0% Reading Mathematics
61
Closing the Gaps: Safe Harbor Provision
To avoid unintended consequences and recognize improvement over time. Available for all indicators, for districts and campuses that do not meet the target on an indicator. Districts and campuses that miss a target will have no negative consequences if they make sufficient progress over the previous year. The progress must be enough that, if continued at that rate, a district or campus would meet an interim or long-term goal in a specified amount of time.
62
Closing the Gaps: Safe Harbor Calculation
Variables Last year’s result This year’s result Goal (interim or long term) Years to meet goal Example One Scenario Performance on mathematics STAAR by students in special education Last year’s score (45) This year’s score (53) Goal (interim) (80) Years to meet goal (5) Example One Calculation Last year’s result missed the target by 35 points (80 – 45 = 35) Because the years to meet goal is 5, this campus must improve its score for this indicator by 7 points each year (35 5 = 7). This year’s score is 8 points better than last year’s (53 – 45 = 8) Safe harbor is invoked. There are no negative consequences of missing that target for this indicator.
63
Closing the Gaps: Safe Harbor Calculation
Example Two Scenario: performance on mathematics STAAR by students in special education
- Last year's score: 60
- This year's score: 61
- Goal (long term): 90
- Years to meet goal: 15

Example Two Calculation:
- Last year's result missed the target by 30 points (90 − 60 = 30).
- Because the years to meet goal is 15, this campus must improve its score on this indicator by 2 points each year (30 ÷ 15 = 2).
- This year's score is 1 point better than last year's (61 − 60 = 1).
- Safe harbor is not invoked: there are negative consequences of missing the target for this indicator.

Feedback Opportunity: Given safe harbor rules, should we apply the same performance expectation to all student groups?
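The safe harbor arithmetic in the two examples can be expressed as a single check: safe harbor applies when this year's gain meets or exceeds the annual gain needed to close the gap to the goal in the allotted years. The function below is a sketch of that calculation; its name and interface are assumptions for illustration.

```python
def safe_harbor_invoked(last_year, this_year, goal, years_to_goal):
    """Return True if safe harbor applies: the year-over-year gain is
    at least the per-year improvement needed to reach the goal."""
    gap = goal - last_year                # points short of the goal
    required_gain = gap / years_to_goal   # points needed per year
    actual_gain = this_year - last_year
    return actual_gain >= required_gain

# Example One: gap of 35 over 5 years needs 7 points/year; gained 8.
assert safe_harbor_invoked(45, 53, 80, 5) is True
# Example Two: gap of 30 over 15 years needs 2 points/year; gained 1.
assert safe_harbor_invoked(60, 61, 90, 15) is False
```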
64
Closing the Gaps: Data Modeling
Percentage of Elementary Schools Meeting Achievement Target

Without Safe Harbor
Group      Frequency  Percent
00–20%     2009       46.29
21–40%     720        16.59
41–60%     549        12.65
61–80%     479        11.04
81–100%    583        13.43

With Safe Harbor (Five-Year Target)
Group      Frequency  Percent
00–20%     887        20.44
21–40%     993        22.88
41–60%     909        20.94
61–80%     784        18.06
81–100%    767        17.67

With Safe Harbor (Fifteen-Year Target)
Group      Frequency  Percent
00–20%     691        15.92
21–40%     970        22.35
41–60%     995        22.93
61–80%     878        20.23
81–100%    806        18.57
65
Closing the Gaps: Data Modeling
Percentage of Middle Schools Meeting Achievement Target

Without Safe Harbor
Group      Frequency  Percent
00–20%     903        54.63
21–40%     248        15.00
41–60%     225        13.61
61–80%     154        9.32
81–100%    123        7.44

With Safe Harbor (Five-Year Target)
Group      Frequency  Percent
00–20%     249        15.06
21–40%     387        23.41
41–60%     434        26.26
61–80%     334        20.21
81–100%

With Safe Harbor (Fifteen-Year Target)
Group      Frequency  Percent
00–20%     130        7.86
21–40%     290        17.54
41–60%     505        30.55
61–80%     417        25.23
81–100%    311        18.81
66
Closing the Gaps: Data Modeling
Percentage of High Schools Meeting Achievement Target

Without Safe Harbor
Group      Frequency  Percent
00–20%     169        13.29
21–40%     288        22.64
41–60%     369        29.01
61–80%     242        19.03
81–100%    204        16.04

With Safe Harbor (Five-Year Target)
Group      Frequency  Percent
00–20%     34         2.67
21–40%     137        10.77
41–60%     351        27.59
61–80%     444        34.91
81–100%    306        24.06

With Safe Harbor (Fifteen-Year Target)
Group      Frequency  Percent
00–20%     24         1.89
21–40%     102        8.02
41–60%     315        24.76
61–80%     486        38.21
81–100%    345        27.12
67
Closing the Gaps: Alignment with ESSA
Identification of Schools: Comprehensive Support and Improvement
- Lowest-performing five percent of campuses based on overall A–F grade
- High schools with a graduation rate below 67 percent
- Certain targeted schools that do not improve within a specified time

Beginning in summer 2018, based on 2017–18 data; updated at least every three years thereafter.

Feedback Opportunity: Should we identify these schools every year or every three years?
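The three identification criteria above amount to a simple "any of" check. The following sketch illustrates that logic; the function name, inputs, and overall structure are assumptions for demonstration, not the agency's actual process.

```python
def needs_comprehensive_support(percentile_rank, graduation_rate=None,
                                targeted_without_improvement=False):
    """Return True if a campus meets any Comprehensive Support and
    Improvement criterion: lowest-performing five percent overall,
    a high school graduation rate below 67 percent, or a targeted
    school that did not improve within the specified time."""
    if percentile_rank < 5:
        return True
    # graduation_rate is only supplied for high schools.
    if graduation_rate is not None and graduation_rate < 67:
        return True
    return targeted_without_improvement
```

For instance, a high school at the 50th percentile with a 60 percent graduation rate would still be identified under the graduation-rate criterion.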
68
Closing the Gaps: Sample Status Report
69
Targeted Campus Determination
Closing the Gaps: Alignment with ESSA

Identification of Schools: Targeted Support and Improvement
- Three consecutive years of missing a target in the same student group on the same indicator
- Summer 2019 based on 2017, 2018, and 2019 data

Targeted Campus Determination (sample report layout)
Student groups: All Students, African American, Hispanic, White

Statuses tracked (consecutive years missing the target):
- Multi-Year Performance Status (Reading, Mathematics)
- Multi-Year Growth Status
- Multi-Year Graduation Status
- Multi-Year English Learner Language Proficiency Status
- Multi-Year Student Success Status: STAAR Grade 3–8 Reading and Mathematics Performance (at or above Meets Grade Level Standard) (Elementary and Middle Schools); College, Career, and Military Readiness
70
Closing the Gaps Domain: Common Questions
Q: Must every student group meet each of the indicators?
A: Campuses and districts will be evaluated for each student group and associated indicator that has data and meets minimum-size criteria.

Q: Must a district or campus meet every one of the indicators for which it has data in order to earn an A?
A: Not necessarily. Our current plan is to determine grade cut points based on the percentage of indicators met.

Q: If looking at students who formerly received special education services as a student group affects so few districts and campuses, why is it being included in accountability?
A: Looking at that specific student group is required by House Bill 22.

Q: Why does the accountability system now include former ELs in their third and fourth year of monitoring?
A: The Every Student Succeeds Act (ESSA) allows it.
71
Questions and Feedback

Resources: (512)
71