Connecticut’s Growth Model for the Smarter Balanced Summative Assessments: English Language Arts (ELA) and Mathematics, October 2016. Hello. Thank you for listening to this narrated presentation about Connecticut’s growth model. This model applies to the Smarter Balanced Summative Assessments in English Language Arts (ELA) and Mathematics for students entering grades 4 through 8. The results from this growth model are a key component of Connecticut’s Next Generation Accountability System for districts and schools.

Agenda
- What is growth? How is it different from achievement?
- What is Connecticut’s approach to measuring growth?
- What factors are considered when establishing ambitious yet achievable targets?
- How and when will growth be incorporated into the Next Generation Accountability System for districts and schools?
Here are the topics we will cover in this presentation: What is growth? How is it different from achievement? What is Connecticut’s approach to measuring growth? What factors are considered when establishing ambitious yet achievable individual student growth targets? And how and when will growth be incorporated into the Next Generation Accountability System for districts and schools?

What is growth? How is it different from achievement?
- Achievement or Proficiency: A one-time snapshot measurement of a student’s academic performance.
- Growth: Change in achievement score for the same student between two or more points in time.
To understand the differences between achievement and growth, let’s start with a simple definition of achievement. Achievement (also called proficiency or status) is a one-time snapshot measurement of a student’s academic performance in a subject area like ELA or Mathematics. Growth, on the other hand, is the change in that achievement score for the same student between two or more points in time.

Three Ways to Understand Change in Performance

Achievement Change
- What is it? How does it work? Compares student achievement across years (e.g., the achievement of grade 4 students in 2014-15 is compared to the achievement of grade 4 students in 2015-16).
- Who is compared? Different students across different years.
- What is measured? Proficiency rate (e.g., percent at or above Level 3) and/or average scale scores.
- What does it offer? The starting point for understanding change.

“Rough Cohort” Change
- What is it? How does it work? Compares the achievement of a group of students from one grade in year 1 to a group of students in the next higher grade in year 2 (e.g., grade 3 in 2014-15 to grade 4 in 2015-16).
- Who is compared? Mostly the same students, though there can be some mismatches due to student mobility, entry, and exit.
- What is measured? Proficiency rate (e.g., percent at or above Level 3) and/or average scale scores.
- What does it offer? A “rough estimate” of growth.

Matched Student Cohort Growth
- What is it? How does it work? Compares the achievement of the same student from one grade in year 1 to the next higher grade in year 2 (e.g., a student in grade 3 in 2014-15 to grade 4 in 2015-16).
- Who is compared? The same students from one year to the next; no mismatches.
- What is measured? The amount of growth to standard achieved by each student and by groups of students.
- What does it offer? The gold standard for growth and for understanding curricular and instructional effectiveness.

Let’s look at three ways to understand change in performance. The first, achievement change, simply compares student achievement for the same grade across years. For example, a superintendent may say that the proficiency rate of students in grade 4 in our district has increased from 50% in one year to 53% in the next year, an improvement of 3 percentage points. While that is technically accurate, this approach is actually comparing the performance of two different groups of fourth graders. The difference in performance between the two groups may simply be because the groups are different to begin with; maybe the higher-performing group of fourth graders just started off higher. The second approach is something we refer to as the “rough cohort” approach. In this approach, for example, a superintendent may compare the proficiency rate of this year’s fourth graders to that of last year’s third graders. If your district experiences little student mobility and almost all students are promoted from one grade to the next each year, most of the students will be the same across years. However, if your district experiences high student mobility, a greater percentage of students across the two years will be different. The last approach presented here, matched student cohort growth, compares the achievement of the same student from one grade in year 1 to the next higher grade in year 2. This is generally considered the gold standard for growth because there are no mismatched students; only those students who are matched across years are included in the calculation. The matched approach allows us to quantify the amount of growth achieved by the same students from near the end of one grade to the end of the next grade, a good measure of curricular and instructional effectiveness.
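To make the matched student cohort idea concrete, here is a minimal sketch in Python, assuming two hypothetical score files (scores_2015.csv and scores_2016.csv) that each contain a stable student identifier and a Smarter Balanced vertical scale score; the file names, column names, and use of pandas are illustrative assumptions, not part of Connecticut’s actual reporting system.

```python
import pandas as pd

# Hypothetical inputs: one row per student per year, with a stable student ID
# and the Smarter Balanced vertical scale score (column names are assumptions).
year1 = pd.read_csv("scores_2015.csv")  # columns: student_id, grade, scale_score
year2 = pd.read_csv("scores_2016.csv")  # columns: student_id, grade, scale_score

# Matched student cohort: keep only students who have a score in BOTH years.
# The inner join on student_id drops students who entered or exited, which is
# what distinguishes this from the "rough cohort" comparison described above.
matched = year1.merge(year2, on="student_id", suffixes=("_y1", "_y2"), how="inner")

# Growth for each matched student is the change in vertical scale score.
matched["growth"] = matched["scale_score_y2"] - matched["scale_score_y1"]

print(f"{len(matched)} matched students; mean growth = {matched['growth'].mean():.1f}")
```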

What is Connecticut’s approach to measuring growth?
- Similar to the approach used with the CMT growth model
- Criterion referenced
- Uses the Smarter Balanced vertical scale that spans grades/years
- Preserves the achievement level concept for interpretability
- Provides ambitious yet achievable individual student targets
- Expects all students to grow, including those performing in Levels 3 and 4
- Can be aggregated for group-level results
- Reviewed by the Connecticut Technical Advisory Committee
So, what is Connecticut’s approach to measuring growth? Connecticut’s growth model uses the matched student cohort growth approach. This approach is very similar to the one used in the past with the Connecticut Mastery Test, or CMT. The approach is criterion referenced in that the amount of growth made by a student from one year to the next is evaluated against a fixed standard, or criterion, and not against how other students grew. This criterion is based on the Smarter Balanced vertical scale for ELA and Mathematics, a scale that spans grades 3 through 8; the ELA and Math scales range from around 2100 to 2800. The growth model preserves the concept of the Smarter Balanced achievement levels to support interpretation. It provides targets, and an approach to evaluating growth against those targets, that are ambitious yet achievable for each and every student. It expects all students to grow, including those in the highest achievement levels of 3 and 4. Individual student growth can be aggregated for group-level results. This model was reviewed multiple times by Connecticut’s Technical Advisory Committee, a group of psychometric experts from around the country.

What factors are considered when establishing ambitious yet achievable targets?
- Empirical: What is the actual growth achieved by students performing at different segments of the vertical scale?
- Measurement Error: Does the growth expectation exceed the pooled average measurement error from both year 1 and year 2 assessments?
- Time: Are students on a path to higher levels of achievement in the future?
Now let’s look at the factors that are considered when establishing these ambitious yet achievable growth targets. Three main factors were considered. The first is the actual amount of growth achieved by students from 2014-15 to 2015-16. This growth was evaluated for students who started at different levels of performance on the vertical scale. Such a segmented analysis is done because we know that students at different levels of achievement on the vertical scale often exhibit different amounts of growth; we wanted to ensure that the established targets are achievable for all students. The second factor we considered is measurement error. Psychometric theory tells us that a test score is an estimate of a student’s achievement and comes with a certain amount of measurement error. When calculating growth, we are comparing scores from two tests, each of which has some error. Therefore, when establishing growth targets, we considered whether the growth expectations exceed the combined average standard error from both tests. The third factor we considered was whether the targets put students on a path to higher levels of achievement in future years. The targets that were ultimately established were achieved by approximately 40 percent of the matched students.
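The presentation does not specify exactly how the year 1 and year 2 measurement errors are pooled. As a minimal sketch, assuming the common psychometric convention of combining independent standard errors of measurement in quadrature (an assumption on our part, not necessarily the CSDE’s exact rule), the check might look like this:

```python
import math

def growth_standard_error(sem_year1: float, sem_year2: float) -> float:
    """Standard error of a growth (difference) score, assuming independent errors.

    The sqrt(SEM1^2 + SEM2^2) rule is a common psychometric convention and is
    used here only as an illustration of the "pooled error" idea on the slide.
    """
    return math.sqrt(sem_year1 ** 2 + sem_year2 ** 2)

# Illustrative numbers only: if each test had an SEM of about 25 scale score
# points, the growth score's standard error would be roughly 35 points, so a
# growth target well above 35 points is distinguishable from measurement noise.
print(round(growth_standard_error(25, 25), 1))  # -> 35.4
```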

ELA Achievement Level Ranges and Growth Targets (by grade in Year 1)

Columns, from lowest to highest: Level 1: Not Met (1-Low, 2-High); Level 2: Approaching (3-Low, 4-High); Level 3: Met (5-Low, 6-High); Level 4: Exceeded (7-Low, 8-High)

Grade 3 Range: 2114-2330 | 2331-2366 | 2367-2399 | 2400-2431 | 2432-2460 | 2461-2489 | 2490-2522 | 2523+
Grade 3 Target: 82, 71, 70, 69, 68, 64, 60, 45/maintain
Grade 4 Range: 2131-2378 | 2379-2415 | 2416-2444 | 2445-2472 | 2473-2502 | 2503-2532 | 2533-2568 | 2569+
Grade 4 Target: 58, 55, 49, 34/maintain
Grade 5 Range: 2201-2405 | 2406-2441 | 2442-2471 | 2472-2501 | 2502-2541 | 2542-2581 | 2582-2619 | 2620+
Grade 5 Target: 56, 48, 43, 39, 30, 16/maintain
Grade 6 Range: 2210-2417 | 2418-2456 | 2457-2493 | 2494-2530 | 2531-2574 | 2575-2617 | 2618-2656 | 2657+
Grade 6 Target: 73, 53, 47, 44, 38, 33, 21/maintain
Grade 7 Range: 2258-2438 | 2439-2478 | 2479-2515 | 2516-2551 | 2552-2600 | 2601-2648 | 2649-2687 | 2688+
Grade 7 Target: 50, 40, 31, 20, 12/maintain
Grade 8 Range: 2288-2446 | 2447-2486 | 2487-2526 | 2527-2566 | 2567-2617 | 2618-2667 | 2668-2703 | 2709+

This is the annual growth target table for English language arts. Notice the four Smarter Balanced achievement levels; each level is divided into two categories, low and high, yielding eight total categories for each grade. The scale score range for each category within each grade is listed along with the requisite amount of scale score growth that is expected. Here is how you determine the growth target for a student. Let’s say a 3rd grader has a Smarter Balanced ELA vertical scale score of 2350 in the first year. This places the student in the High Level 1 category in grade 3. By the end of grade 4, this student is expected to grow 71 points from 2350, or achieve a vertical scale score of 2421. Let’s take another example: a 6th grader with a Smarter Balanced ELA score of 2460. This places the student in the Low Level 2 category in grade 6. By the end of grade 7, this student is expected to grow 53 points from 2460, or achieve a vertical scale score of 2513.
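As a concrete illustration of the lookup described in the narration, here is a minimal sketch in Python that encodes only the grade 3 ELA row (the one row whose eight targets are fully listed above); the function and data structure are illustrative, not CSDE code, and the top category’s “45/maintain” is simplified to 45.

```python
# Grade 3 ELA row of the target table: (lower bound of the score range, target).
GRADE3_ELA_TARGETS = [
    (2114, 82), (2331, 71), (2367, 70), (2400, 69),
    (2432, 68), (2461, 64), (2490, 60), (2523, 45),
]

def ela_growth_target_grade3(year1_score: int) -> int:
    """Return the expected scale score growth for a grade 3 ELA year-1 score."""
    target = None
    for lower_bound, points in GRADE3_ELA_TARGETS:
        if year1_score >= lower_bound:
            target = points  # keep the target of the highest range reached
    if target is None:
        raise ValueError(f"{year1_score} is below the grade 3 ELA scale range")
    return target

# The worked example from the narration: a grade 3 student scoring 2350 falls
# in High Level 1, so the target is 71 points, a year-2 goal of 2350 + 71 = 2421.
score = 2350
print(ela_growth_target_grade3(score), score + ela_growth_target_grade3(score))  # -> 71 2421
```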

Math Achievement Level Ranges and Growth Targets (by grade in Year 1)

Columns, from lowest to highest: Level 1: Not Met (1-Low, 2-High); Level 2: Approaching (3-Low, 4-High); Level 3: Met (5-Low, 6-High); Level 4: Exceeded (7-Low, 8-High)

Grade 3 Range: 2189-2351 | 2352-2380 | 2381-2408 | 2409-2435 | 2436-2468 | 2469-2500 | 2501-2526 | 2527+
Grade 3 Target: 77, 61, 59, 60, 57, 56, 47/maintain
Grade 4 Range: 2204-2381 | 2382-2410 | 2411-2447 | 2448-2484 | 2485-2516 | 2517-2548 | 2549-2574 | 2575+
Grade 4 Target: 51, 38, 40, 44, 46, 47, 43, 37/maintain
Grade 5 Range: 2219-2419 | 2420-2454 | 2455-2491 | 2492-2527 | 2528-2553 | 2554-2578 | 2579-2605 | 2606+
Grade 5 Target: 45, 42, 41, 44/maintain
Grade 6 Range: 2235-2434 | 2435-2472 | 2473-2512 | 2513-2551 | 2552-2580 | 2581-2609 | 2610-2639 | 2640+
Grade 6 Target: 49, 36, 31/maintain
Grade 7 Range: 2250-2438 | 2439-2483 | 2484-2525 | 2526-2566 | 2567-2600 | 2601-2634 | 2635-2664 | 2665+
Grade 7 Target: 58, 35, 31, 37, 35/maintain
Grade 8 Range: 2265-2455 | 2457-2503 | 2504-2544 | 2545-2585 | 2586-2619 | 2620-2652 | 2653-2685 | 2686+

And this is the annual growth target table for Mathematics; it should be read in the same manner as the ELA table. It is expected that these growth targets will remain in place for at least three years.

Now let’s look at an example of how the individual student level targets play out in the aggregate. Let’s look at four students who have the exact same target of 60 points. Student 1 grew 42 points from one year to the next. This student did not meet the target but achieved seventy percent of the target (42 out of 60 points). Student 2 grew exactly 60 points. This student met the target, and achieved 100 percent of the target. Student 3 grew 66 points. This student met the target and actually achieved 110 percent of the target. Student 4 grew 36 points. This student did not meet the target but achieved 60 percent of the target. Overall, the growth rate was 50 percent because 2 out of 4 students met their target. The average percentage of target achieved was 85%; that is the average of the individual student percentages of target achieved.
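Here is a minimal sketch of how the two aggregate metrics described in this example (and defined on the next slide) can be computed for the four hypothetical students; the 110 percent ceiling on the percentage of target achieved is an assumption based on the next slide’s note that students get credit for growth up to 10 percent beyond the target.

```python
# Each tuple is (actual scale score growth, growth target) for one student.
students = [(42, 60), (60, 60), (66, 60), (36, 60)]

# Growth rate: the share of students who met or exceeded their target (binary).
growth_rate = sum(growth >= target for growth, target in students) / len(students)

# Percentage of target achieved: each student's growth as a share of the target,
# credited up to 110% (an assumed cap; see the lead-in note), then averaged.
pct_achieved = [min(growth / target, 1.10) * 100 for growth, target in students]
avg_pct_achieved = sum(pct_achieved) / len(pct_achieved)

print(f"Growth rate: {growth_rate:.0%}")                                   # -> 50%
print(f"Average percentage of target achieved: {avg_pct_achieved:.0f}%")   # -> 85%
```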

Two Aggregate Outcome Metrics

Growth Rate
- Measure? Percentage of students meeting their respective growth target.
- Precision? Binary (yes/no), less precise.
- Continuous? No. Students nearly meeting the target will be deemed not meeting the target.
- Interpretability? Simple to understand.
- Uses? Reporting only.

Percentage of Target Achieved
- Measure? Average percentage of the growth target achieved across all students.
- Precision? Based on scale scores, more precise.
- Continuous? Yes. Students get “credit” for any growth up to and beyond the target.
- Interpretability? More nuanced.
- Uses? Reporting and district/school accountability.

So, the two aggregate statistics that will be reported are the growth rate and the percentage of target achieved. The growth rate is the percentage of students meeting their respective growth target, while the percentage of target achieved is the average percentage of the growth target achieved by all students in the group. The growth rate is a binary measure. It asks a yes/no question: did the student meet her target? The percentage of target achieved, on the other hand, asks a different question: how much of the target did the student achieve? The growth rate is not a continuous measure. Students nearly meeting the target will be deemed not to have met the target, even if they missed it by just 1 scale score point! On the contrary, the percentage of target achieved is a continuous measure. Students get credit for any growth up to and even 10 percent beyond the target. The growth rate is simpler to understand, while the percentage of target achieved is a bit more nuanced. We will report both measures but will include the more precise, average percentage of target achieved in the district and school accountability model.

How and when will growth be incorporated into the Next Generation Accountability System?
- Growth (Indicator 2) will be added to the system starting with the 2015-16 results.
- As with achievement, Growth (Indicator 2) points are awarded for the All Students and High Needs groups.
- The points for Achievement (Indicator 1) will be halved for any school with Growth results.
- Growth will carry slightly more weight in the model than Achievement.
- In light of the discontinuance of the ELA Performance Task in February 2016, the rescored 2014-15 ELA scores that were based on the Computer-Adaptive Test (CAT) only will be used as the ELA baseline for an apples-to-apples comparison.
Now let’s look at how and when these growth results will be incorporated into the Next Generation Accountability System for districts and schools. Growth is Indicator 2 of the system. It will be added starting with the 2015-16 results. As is the case with the achievement indicator, growth points are awarded for the All Students and High Needs groups. For any district or school with growth results, the points for the achievement indicators will be cut in half. As a result, growth will carry slightly more weight than achievement in the accountability system. In February 2016, the ELA Performance Task was eliminated from the ELA assessment. Therefore, the ELA results from 2014-15 were rescored after excluding the ELA Performance Task and using only the computer-adaptive test portion of the assessment. This enables an apples-to-apples comparison. For growth calculations, the rescored ELA results will be used.

What about other factors like poverty, language ability, or disability?
- The CSDE is not using a value-added approach to adjust targets or evaluate growth relative to some preconceived expectation, based on student characteristics, of what a student can achieve or how much he/she can grow.
- The CSDE is not setting different targets for different students. All students in a given prior achievement range will have the same growth expectation.
You may be wondering: what about factors like poverty, English language ability, or student disability? To be clear, Connecticut’s model is not a value-added approach where targets are adjusted or growth results are evaluated against some preconceived expectation of student achievement based on student characteristics. Connecticut’s model does not set different targets for different students based on student characteristics. All students in a given prior achievement range have the same growth expectation.

Not All Growth Models Are Value-Added
- The terms “growth model” and “value-added” are often used interchangeably. But value-added is only one of several types of models that measure student growth. It is also the only model designed to determine which aspect of schooling (e.g., school, teacher, education program) is responsible for a student’s growth. (Center for Public Education)
- “Value-added models are focused on the effects of teachers and leaders… on student score gains. They address whether students grew more or less than expected.” (O’Malley, McClarty, Magda, and Burling, 2011)
Some people use the terms growth model and value-added model interchangeably. Connecticut’s approach is indeed a growth model, but it is not a value-added model. There is no mysterious statistical calculation done to precisely quantify the effects of teachers, leaders, schools, or districts on student growth. Quite the contrary: under Connecticut’s model, the calculations are transparent. Anyone with authorized access to student test scores from year 1 and year 2 can determine whether those students achieved their targets and how much of each target they achieved.

Summary
- Criterion referenced: does not depend on how others do
- Continuous: all growth counts; no golden bands
- Familiar: similar to the approach used with the CMT
- Transparent: easily replicable; no “black-box” adjustments
- Collaborative: transparency allows for conversation/reflection
- Fair: excludes “partial-year” students
- Achievable: based on actual growth of Connecticut students
- Ambitious: encourages growth above target
To summarize, Connecticut’s model is: Criterion referenced: it does not depend on how others do. Continuous: all growth counts; there are no more golden bands. Unlike in the past, there is no incentive in this system to focus on getting a small group of students over some magical proficiency bar; instead, the message here is that all growth achieved by all students counts. Familiar: it uses an approach similar to that used with the CMT. Transparent: local districts and schools can replicate the results; there are no “black-box” adjustments to the growth results. Collaborative: the transparency allows for conversation and reflection among educators. Fair: it excludes “partial-year” students; only those students who were enrolled in the same district or school on October 1 and at the time of testing are included in the calculations. Achievable: it is based on the actual growth achieved by Connecticut students. And lastly, Ambitious: the model encourages growth above target. That brings us to the end of this narrated presentation about Connecticut’s growth model for the Smarter Balanced ELA and Mathematics summative assessments. We hope this has been informative. Thank you.