Comparing Growth in Student Performance
David Stern, UC Berkeley Career Academy Support Network
Presentation to the Educating for Careers / California Partnership Academies Conference, Sacramento, March 4,

What I’ll explain:
- Why “value added” is the most valid way to compare academy students’ progress with that of other students in the same school and grade
- How to compute value added
- An example of application to career academies

??? What questions do you have at the start?

What is “value added”?
- Starts with matched data for individual students at 2 or more points in time
- Uses students’ characteristics and previous performance to predict most recent performance
- Positive value added means a student’s actual performance is better than predicted
- If academy students on average perform better than predicted, the academy has positive value added
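To make the computation concrete, here is a minimal sketch in Python (not from the presentation; all scores and group labels are invented for illustration). It predicts each student’s current score from a prior score with a simple regression, then averages the actual-minus-predicted residuals by group:

```python
import numpy as np

# Hypothetical matched scores for six students at two points in time;
# the first three are academy students, the last three are not.
prior   = np.array([310.0, 325.0, 340.0, 355.0, 370.0, 385.0])
current = np.array([322.0, 330.0, 356.0, 350.0, 372.0, 391.0])
in_academy = np.array([True, True, True, False, False, False])

# Fit a simple regression of current score on prior score.
slope, intercept = np.polyfit(prior, current, deg=1)
predicted = intercept + slope * prior

# Value added for a student: actual performance minus predicted performance.
residual = current - predicted

# Value added for a group: the mean residual of its students.
print("Academy mean residual:    ", residual[in_academy].mean())
print("Non-academy mean residual:", residual[~in_academy].mean())
```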

Correlation between Academic Performance Index and % low-income students in California school districts: .93

Value added is a better measure than:
- Comparing the average performance of 2 groups of students without controlling for their previous performance, because one group may have been high performers to start with
- Comparing this year’s 11th graders (for example) with last year’s 11th graders, because these are different groups of students!

Creates better incentives:
- Reduces the incentive for an academy to recruit or select students who are already performing well
- Recognizes academies for improving the performance of students no matter how they performed in the past
- Provides a valid basis on which to compare student progress, and then ask why

What NOT to do:
- DON’T attach automatic rewards or punishments to estimates of value added; use them as evidence for further inquiry
- DON’T rely only on test scores; analyze a range of student outcomes: e.g., attendance, credits, GPA, discipline, etc.
- DON’T use just 2 points in time; analyze multiple years if possible, and do the analysis every year

Recent reports:
- National Academies of Science: “Getting Value out of Value Added”
- Economic Policy Institute: “Problems with the Use of Student Test Scores to Evaluate Teachers” (m6iij8k.pdf)

How it’s done:
- Need matched data for each student at 2 or more points in time
- Accurately identify academy and non-academy students in each time period
- Use a statistical regression model to predict most recent performance, based on students’ characteristics and previous performance
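As a sketch of such a regression model (hypothetical data and variable names; a real analysis would use the district’s matched student records), using Python’s statsmodels:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical matched records: most recent score, prior score, and
# two student characteristics (all values are invented).
df = pd.DataFrame({
    "score_2010":     [352, 340, 365, 330, 371, 348, 355, 362],
    "score_2009":     [345, 338, 350, 335, 360, 342, 349, 358],
    "low_income":     [1, 1, 0, 1, 0, 0, 1, 0],
    "parent_college": [0, 0, 1, 0, 1, 1, 0, 1],
})

# Predict the most recent score from prior performance and characteristics.
model = smf.ols("score_2010 ~ score_2009 + low_income + parent_college",
                data=df).fit()

# Each student's value added is the gap between actual and predicted score.
df["predicted"] = model.fittedvalues
df["value_added"] = df["score_2010"] - df["predicted"]
print(df[["score_2010", "predicted", "value_added"]])
```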

Example: comparing teachers
- Each point on the graph shows one student’s English Language Arts test score in spring 2003 (horizontal axis) and spring 2004 (vertical axis) for an actual high school
- The regression line shows the predicted score in 2004, given the score in 2003
- Students who had teacher #30 generally scored higher than predicted in 2004; this teacher had positive value added

Scatterplot of 2003 and 2004 English Language Arts scores at one high school

Scatterplot of 2003 and 2004 scores, with regression line. Dots above the line represent students who scored higher than predicted in 2004. Dots below the line represent students who scored lower than predicted.

Most students with teacher 30 scored higher in 2004 than predicted by their 2003 score. (Figure callouts: “This student’s 2004 score was higher than predicted.” “This student’s 2004 score was lower than predicted.”)
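The figure itself is not reproduced here, but a plot of this kind can be sketched as follows (simulated data; teacher 30’s students are given an artificial boost so the pattern the slide describes is visible):

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)

# Simulated stand-ins for the school's 2003 and 2004 ELA scores.
n = 200
score_2003 = rng.normal(350, 25, size=n)
score_2004 = 0.8 * score_2003 + 70 + rng.normal(0, 12, size=n)
teacher_30 = rng.random(n) < 0.15    # flag roughly 15% as teacher 30's class
score_2004[teacher_30] += 10         # artificial positive value added

# Regression line: predicted 2004 score given the 2003 score.
slope, intercept = np.polyfit(score_2003, score_2004, deg=1)
xs = np.linspace(score_2003.min(), score_2003.max(), 100)

plt.scatter(score_2003[~teacher_30], score_2004[~teacher_30], s=10,
            label="Other students")
plt.scatter(score_2003[teacher_30], score_2004[teacher_30], s=10,
            label="Teacher 30's students")
plt.plot(xs, intercept + slope * xs, color="black",
         label="Predicted 2004 score")
plt.xlabel("2003 ELA score")
plt.ylabel("2004 ELA score")
plt.legend()
plt.show()
```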

Example using academies: a high school with 4 career academies and 4 other programs. Programs 2, 4, 5, and 8 are the career academies.

Parents’ education differs across programs

Student ethnicity also differs

Students in programs 4, 5, and 8 are less likely to have college-educated parents and less likely to be white. Comparisons of student performance should take such differences into account.

Grade 11 enrollments. The analysis focused on students in grade 11 who were present in at least 75% of classes.

Mean GPA during junior year

Mean 11th grade test scores, spring

Mean 8th grade test scores for juniors

Juniors in programs 4 and 5 had lower grades and test scores. But comparing 11th grade test scores is misleading, because students who entered programs 4 and 5 in high school were already scoring lower at the end of 8th grade. A more valid comparison would focus on CHANGE in performance during high school.

Numbers of students by change in English language arts performance level. Only program 8 had more students whose performance level went up than students whose performance level went down. Performance levels: far below basic, below basic, basic, proficient, advanced.
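A tabulation like the one on this slide can be produced with a simple crosstab (hypothetical data; the real analysis used matched performance levels from the state test):

```python
import pandas as pd

# Hypothetical records: each row is one student, with a program number and
# the change in ELA performance level (e.g., basic -> proficient is +1).
df = pd.DataFrame({
    "program":      [1, 1, 2, 2, 4, 4, 8, 8, 8],
    "level_change": [0, -1, -1, 0, -1, 1, 1, 1, -1],
})

# Bucket each student's change as down, same, or up, then count by program.
df["direction"] = pd.cut(df["level_change"], bins=[-10, -1, 0, 10],
                         labels=["down", "same", "up"])
print(pd.crosstab(df["program"], df["direction"]))
```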

Change in GPA from grade 8 to grade 11. Programs 1, 3, and 8 had the students with the highest GPAs in 8th grade. GPA in 11th grade was lower than in 8th grade for students in these 3 programs.

Predicting 2010 test score based on 2009 score. Dots above the line represent students who scored higher than predicted in 2010. Dots below the line represent students who scored lower than predicted.

Predicting 11th grade GPA based on 8th grade GPA. Dots above the line represent students whose GPA was higher than predicted in 11th grade. Dots below the line represent students whose GPA was lower than predicted.

Regression analysis uses prior performance along with other student characteristics to estimate each student’s predicted performance in 2010. In this analysis, programs 2-8 are compared to program 1. A positive regression coefficient means that, on average, students in that program exceeded their predicted performance by more than students in program 1 did.
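A sketch of this kind of program comparison (hypothetical data; the program numbers, variable names, and covariates here are illustrative, and the actual analysis likely controlled for more characteristics):

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical student-level records for four of the programs.
df = pd.DataFrame({
    "score_2010": [352, 340, 365, 330, 371, 348, 355, 362, 344, 358, 349, 367],
    "score_2009": [345, 338, 350, 335, 360, 342, 349, 358, 340, 352, 347, 361],
    "low_income": [1, 1, 0, 1, 0, 0, 1, 0, 1, 0, 1, 0],
    "program":    [1, 1, 1, 2, 2, 2, 4, 4, 4, 8, 8, 8],
})

# C(program) creates dummy variables with program 1 (the lowest value) as the
# reference category, so each program's coefficient is its value added
# relative to program 1, after controlling for prior score and income.
model = smf.ols("score_2010 ~ score_2009 + low_income + C(program)",
                data=df).fit()
print(model.params)   # e.g., a positive C(program)[T.8] favors program 8
print(model.pvalues)  # small p-values mark statistically significant gaps
```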

Value added results for test scores. Only program 8 had positive value added compared to program 1. The only statistically significant differences from program 1 were for programs 2 and 4, both negative: in these two programs, students scored significantly lower than predicted.

Value added results for GPA. Programs 3, 6, and 8 were significantly different from program 1. Average GPA was lower than predicted in these three programs.

Questions for this school:
- Why did juniors’ GPA fall below prediction in programs 3, 6, and 8?
- Why did juniors’ test scores in English language arts fall below prediction in programs 2 and 4?
- It is important to see whether these patterns persist for more than one year.

Conclusion
- Academy National Standards of Practice: “It is important to gather data that reflects whether students are showing improvement and to report these accurately and fairly to maintain the academy’s integrity.”
- Measuring value added will keep academies in the forefront of evidence-based practice.

??? What questions do you have now?