MPS High School Evaluation
Council of the Great City Schools Annual Fall Conference
October 2010
Deb Lindsey, Milwaukee Public Schools
Bradley Carl, Wisconsin Center for Education Research

High School Evaluation: Purpose, Design, Methodology
Evaluation designed as comparisons between high school types (Small vs. Large, Charter vs. Non-Charter, etc.)
– Not designed as a comparison/ranking of individual high schools
– Not all schools included in some comparisons
– Small/Large cutoff: 400 students
– "Selectivity" defined somewhat narrowly: admissions requirement (4 schools)

High School Evaluation: Purpose, Design, Methodology
2 context measures (enrollment & demographics) + 11 outcome metrics (completion rate, test scores, attendance, suspensions, etc.)
Time period generally through , corresponding to start of HS Redesign process
Descriptive data + inferential (regression) analysis to account for differences in students served
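The slides do not give the actual model specification, but a descriptive-plus-regression-adjusted comparison of this kind typically looks like the minimal sketch below. The pandas/statsmodels usage and every column name (outcome, school_type, prior_score, sped, ell, school_id) are illustrative assumptions, not the MPS evaluation's actual code.

```python
# Minimal sketch of a regression-adjusted comparison across school types.
# All column names are hypothetical stand-ins for student-level data.
import pandas as pd
import statsmodels.formula.api as smf

def school_type_comparison(df: pd.DataFrame):
    # Descriptive (unadjusted) comparison: raw outcome means by school type.
    descriptive = df.groupby("school_type")["outcome"].mean()

    # Inferential (adjusted) comparison: OLS controlling for student
    # characteristics, with standard errors clustered by school.
    model = smf.ols(
        "outcome ~ C(school_type) + prior_score + sped + ell",
        data=df,
    ).fit(cov_type="cluster", cov_kwds={"groups": df["school_id"]})

    # Coefficients on the school-type indicators are the adjusted gaps.
    adjusted = model.params.filter(like="school_type")
    return descriptive, adjusted
```

Comparing the descriptive means with the adjusted coefficients is what lets the evaluation separate school-type effects from differences in the students each type serves.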

High School Evaluation: Key Findings
– Enrollment: distinct and purposeful shift toward smaller high schools
– Corresponding increases in enrollment (numerical and "market share") for several subsets of small high schools: small charters, small newly-created charters, etc.
– Demographics: no evidence of any school types consistently under-serving student subgroups of interest (SpEd, ELL, etc.)

High School Evaluation: Key Findings
WKCE Test Performance:
– Stagnant rates of non-cohort proficiency on Grade 10 tests (% Proficient + Advanced); small increases in Grade 10 mean scale scores
– Small schools appear to serve lower-performing students overall
– Same-student gains (Grade 8-10; Fall 2005-Fall 2007 and Fall 2006-Fall 2008) for non-mobile students with matched tests:
  Most remain in the same proficiency level, but more students drop 1+ categories than move up
  No school type produces consistently superior gains in both Reading and Math for both growth cohorts; greater variation within school types than between
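A same-student gain analysis of this kind can be sketched as below: match each student's Grade 8 and Grade 10 records, compute the scale-score gain, and cross-tabulate proficiency levels to see who moved up or down. The long data layout and all column names are assumptions for illustration only.

```python
# Sketch: same-student WKCE gains and proficiency-level transitions.
# Assumes long-format test records with hypothetical columns:
#   student_id, grade, scale_score, prof_level
import pandas as pd

LEVELS = ["Minimal", "Basic", "Proficient", "Advanced"]

def matched_gains(tests: pd.DataFrame) -> pd.DataFrame:
    g8 = tests[tests["grade"] == 8].set_index("student_id")
    g10 = tests[tests["grade"] == 10].set_index("student_id")
    # Inner join keeps only students with matched tests in both grades.
    both = g8.join(g10, how="inner", lsuffix="_g8", rsuffix="_g10")
    both["gain"] = both["scale_score_g10"] - both["scale_score_g8"]
    return both

def level_transitions(both: pd.DataFrame) -> pd.DataFrame:
    # Rows = Grade 8 level, columns = Grade 10 level; the diagonal counts
    # students who stayed put, off-diagonal cells those who moved.
    g8 = pd.Categorical(both["prof_level_g8"], categories=LEVELS, ordered=True)
    g10 = pd.Categorical(both["prof_level_g10"], categories=LEVELS, ordered=True)
    return pd.crosstab(g8, g10)
```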

High School Evaluation: Key Findings
Mobility: higher rates of within-year mobility for Small sites, both descriptively (unadjusted) and inferentially (regression-adjusted):
– Based on month-to-month changes in school of enrollment during and
– Regression controls for student/school demographics + prior (grade 8) attendance & mobility
– Variation in mobility again higher within school types than across school types
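The month-to-month mobility measure described above can be sketched as follows; one enrollment row per student per month and the column names are hypothetical assumptions.

```python
# Sketch: within-year mobility as the count of month-to-month changes
# in school of enrollment, per student.
import pandas as pd

def within_year_moves(enroll: pd.DataFrame) -> pd.Series:
    enroll = enroll.sort_values(["student_id", "month"])

    def count_changes(schools: pd.Series) -> int:
        # Compare each month's school to the previous month's; skip the
        # first month, which has no predecessor.
        return int((schools != schools.shift()).iloc[1:].sum())

    return enroll.groupby("student_id")["school_id"].apply(count_changes)

# A within-year "mover" is any student with at least one change:
# mover = within_year_moves(enroll) >= 1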

High School Evaluation: Key Findings
Within-Year (September-May) Grade 9 Reading & Math Benchmark Gains:
– Similar pattern to WKCE: no school type produces consistently superior gains, either descriptively or regression-adjusted; lower prior achievement in Small sites
– Again, greater variation within than between types
– Low participation rates may bias results

High School Evaluation: Key Findings
Retention rates for first-time 9th graders: Small schools have higher rates descriptively ( through ), but lower regression-adjusted rate for
– Controls used in regression: student and school-level demographics & prior retention history (past 5 years)
– Again, substantial within-type variance
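Because retention is a binary outcome (retained vs. promoted), the adjusted comparison could plausibly use a logit specification, as in the sketch below. The slides do not say which model was actually estimated, and all column names are hypothetical.

```python
# Sketch: regression-adjusted Grade 9 retention comparison via logit.
import statsmodels.formula.api as smf

def adjusted_retention_gap(df):
    model = smf.logit(
        "retained ~ C(school_type) + sped + ell + prior_retentions",
        data=df,
    ).fit(disp=False)
    # Negative school-type coefficients indicate lower adjusted odds of
    # retention for that type, as the slide reports for Small sites.
    return model.params.filter(like="school_type")
```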

High School Evaluation: Key Findings
Attendance: 75-80% for grades 9-12 across school types; no significant change for most school types over past 5 years
– Descriptive data: higher attendance rates (grades 9-12) for Large sites
– Regression-adjusted data: higher for first-time 9th graders in Small sites after controlling for demographics + prior (grade 8) attendance history
– Again, large variance within school types

High School Evaluation: Key Findings
GPA (overall and core subject):
– Low GPA for all school types ( ); marginal (if any) improvement
– GPA (both types) lower in Small sites; likely reflects lower ability levels upon entering high school

High School Evaluation: Key Findings
High School Completion Rate:
– Insufficient data to make meaningful comparisons (new schools/types + small student counts + high grade 9 retention rates)
– "Total Quality Credits" (during Year 1 of H.S. and overall) used as proxy for progression through high school; higher TQC attainment in Large sites, but (again) substantial within-type variance

Conclusions
Conclusion 1: very limited evidence of systemic improvement in the high school outcomes studied
Conclusion 2: some evidence that Small high schools overall serve different student populations (lower prior achievement, etc.)
Conclusion 3: some high school types fare better on some outcomes, but very limited evidence of consistently superior outcomes for any school type across all years/metrics
– Outcome variance generally greater within school types than across
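The recurring "greater variation within school types than across" finding amounts to a variance decomposition: total outcome variance splits into a between-type and a within-type component, and the within piece dominates. A minimal sketch, with hypothetical column names:

```python
# Sketch: decomposing outcome variance into between-type and
# within-type components.
import pandas as pd

def variance_decomposition(df: pd.DataFrame, outcome: str) -> dict:
    grand_mean = df[outcome].mean()
    groups = df.groupby("school_type")[outcome]
    n = len(df)
    between = (groups.size() * (groups.mean() - grand_mean) ** 2).sum() / n
    within = (groups.size() * groups.var(ddof=0)).sum() / n
    # between + within equals the total (population) variance of outcome.
    return {"between": between, "within": within}
```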

Implications for HS Reform
Change focus to a quality control/performance management approach that encourages diversity of offerings + accountability for results
– Select most valued/relevant outcomes, establish expectations, monitor results
– MPS already taking steps in this direction: evaluating charters, EdStat, data warehouse reports, etc.