Statutory Requirements

Quantitative Measures (35% of Total TLE): The State Board voted to use a Value-Added Model to measure student academic growth for teachers and leaders in grades and subjects for which multiple years of standardized test data exist.

5 Myths about Value-Added Models. MYTH: Value-added isn’t fair to teachers who work in high-need schools where students tend to lag far behind academically. FACT: Value-added controls for students’ past academic performance and demographic factors. Value-added looks at progress over the course of a school year, not at a single test score on a single day.

5 Myths about Value-Added Models. MYTH: Value-added scores are too unpredictable from year to year to be trusted. FACT: Value-added scores are about as stable as batting averages in baseball and many other widely accepted performance measures. Teachers who earn high value-added scores early in their careers rarely go on to earn low scores later, and vice versa.

5 Myths about Value-Added Models. MYTH: There is no research behind value-added. FACT: Value-added has been researched for nearly three decades by leading academics and economists. Value-added models have been used by school districts since the 1990s.

5 Myths about Value-Added Models. MYTH: Using value-added means that teachers will be evaluated based solely on standardized test scores. FACT: Oklahoma teacher and leader evaluations form a system; the total evaluation takes many factors into account in determining the final evaluation score:
– 50% Qualitative Measures
– 35% Value-Added Score
– 15% Other Academic Measures
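The weighting above is simple arithmetic, which a small sketch can make concrete. The component scores below are invented for illustration; the slides do not specify a rating scale.

```python
# Illustrative sketch only: combining evaluation components with the weights
# stated above. The component scores here are invented, not real rubric values.
WEIGHTS = {"qualitative": 0.50, "value_added": 0.35, "other_academic": 0.15}

def total_evaluation(scores):
    """Weighted sum of component scores (all assumed to be on the same scale)."""
    return sum(WEIGHTS[name] * scores[name] for name in WEIGHTS)

example = {"qualitative": 4.0, "value_added": 3.0, "other_academic": 5.0}
print(total_evaluation(example))  # 0.50*4.0 + 0.35*3.0 + 0.15*5.0, i.e. about 3.8
```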

5 Myths about Value-Added Models. MYTH: Value-added is useless because it is imperfect; it has a margin of error. FACT: No measure of teacher performance is perfect. A value-added model, however, provides crucial information on how well teachers are doing at their most important job: helping students learn.

Educator Evaluation: Value-Added Model Selection Process Foundation for Excellence in Education Christy Hovanetz, Ph.D. Senior Policy Fellow April 4, 2012

Agenda Purpose and definition of value-added Process for selecting a value-added model

Purpose for Using Growth Models: Schools have students who enter with different levels of proficiency and different characteristics. Simply comparing status scores across schools is therefore not reasonable, because status scores reflect the students who entered the school, not the impact of the school.

Purpose for Using Growth Models: Growth models are designed to mitigate the influence of differences among the entering students. In other words, growth models try to “level the playing field” so that teachers do not have advantages or disadvantages simply as a result of the students in their class.

Steps to Developing a Statewide Value-Added Model:
1. Identify Initial Models
2. Select Models for Comparison
3. Determine Variables and Business Rules for Data Processing
4. Evaluate Selected Models
5. Compare Results and Make Model Recommendation
6. Report Results
7. Use Results for Educator Evaluation

Different Methods for Measuring Growth
Status Methods: Simply compute averages or percent proficient using a single year of test score data. Comparisons can be made from one year to the next, but these are based on different groups of students.
Simple Growth Models: Measure the change in a student’s performance from test to test (e.g., gain from grade 3 to grade 4).
Value-Added Models: A statistical model estimates the portion of the student’s gain that is attributable to the school or teacher.
Student Growth Percentiles: The quantile model estimates “growth percentiles” among students who started at a similar level. Performance is judged entirely relative to that of other students, not against a learning criterion.
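The first two methods can be sketched in a few lines. This is an illustration only; the cut score, student IDs, and scores below are all invented.

```python
# Illustrative sketch of the status and simple-growth methods described above.
PROFICIENT_CUT = 300  # assumed proficiency cut score, not from any real test

def percent_proficient(scores):
    """Status method: percent of scores at or above the cut in a single year."""
    scores = list(scores)
    return 100.0 * sum(s >= PROFICIENT_CUT for s in scores) / len(scores)

def simple_gains(prior, current):
    """Simple growth: per-student change from one test to the next."""
    return {sid: current[sid] - prior[sid] for sid in current if sid in prior}

grade3 = {"stu1": 280, "stu2": 310, "stu3": 305}
grade4 = {"stu1": 295, "stu2": 330, "stu3": 300}
print(round(percent_proficient(grade3.values()), 1))  # status: 66.7
print(simple_gains(grade3, grade4))                   # growth for the same students
```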

Status Models: Using one year of data, comparisons can be made between 2010 third graders and 2011 third graders to determine whether there was “growth” in the percent of proficient third graders.
[Table: Subject | Grade | 2010 percent proficient | 2011 percent proficient | “Growth”/Improvement, with rows for Reading; the values did not survive in the transcript]

Growth Models (simplified “generic” example; source: CCSSO Policymaker’s Guide to Growth Models): At least two scores for each student are necessary, measuring performance after a specified period of time (i.e., in one school) from Year x to Year x+1. A starting point (which may be more than one year earlier) is important in a growth model.

Value-Added Models: All value-added models are growth models. A value-added model must use at least two test scores for each student. A statistical model estimates the portion of the student’s gain that is attributable to the classroom teacher, leveling the playing field by accounting for differences in the proficiency and characteristics of the students assigned to teachers.
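A minimal sketch of that idea, not any state’s actual model: fit a line predicting this year’s score from last year’s, then average each teacher’s residuals (actual minus predicted). Real models use many more predictors and statistical adjustments; the data below is invented.

```python
# Hedged sketch: one-predictor value-added estimate via regression residuals.
def fit_line(xs, ys):
    """Ordinary least-squares line: returns (intercept, slope)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    return my - slope * mx, slope

def value_added(records):
    """records: (prior_score, current_score, teacher) -> mean residual per teacher."""
    a, b = fit_line([r[0] for r in records], [r[1] for r in records])
    resid = {}
    for prior, current, teacher in records:
        resid.setdefault(teacher, []).append(current - (a + b * prior))
    return {t: sum(v) / len(v) for t, v in resid.items()}

records = [(300, 320, "A"), (310, 332, "A"), (300, 310, "B"), (310, 318, "B")]
print(value_added(records))  # teacher A's students beat prediction, B's fall short
```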

Value-Added Models (simplified “generic” example; source: CCSSO Policymaker’s Guide to Growth Models): The starting point (which may be more than one year earlier) is important in a value-added model. From Year x to Year x+1, the model compares actual performance after a specified period of time with expected performance; value added is actual growth minus expected growth.

Student Growth Percentiles
[Table: median growth for all students at the 40th, 55th, 60th, 90th, and one further percentile; the values did not survive in the transcript]
Growth targets are determined based on the performance of other schools in the state. Growth expectations are set annually and shift based on statewide performance.
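The core of a growth percentile can be sketched as ranking a student’s current score among peers who started at the same level. Real SGP models use quantile regression over multiple prior years; the scores below are invented.

```python
# Hedged sketch: a growth percentile as rank among same-start peers.
def growth_percentile(score, peer_scores):
    """Percentile of `score` among current-year scores of students who
    shared the same starting (prior-year) score; ties count half."""
    below = sum(p < score for p in peer_scores)
    ties = sum(p == score for p in peer_scores)
    return 100.0 * (below + 0.5 * ties) / len(peer_scores)

# Five students all started at the same prior score; their current scores:
peers = [400, 410, 420, 430, 440]
print(growth_percentile(430, peers))  # 70.0: beat 3 of 5 peers, tied 1
```

Note how this matches the slide’s caveat: the result depends entirely on how other students grew, not on any fixed learning criterion.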

Advantages of a value-added model: Teachers teach classes of students who enter with different levels of proficiency and possibly different student characteristics. Value-added models level the playing field by accounting for differences in the proficiency and characteristics of the students assigned to teachers.

Advantages of a value-added model Value-added models are designed to mitigate the influence of differences among the entering classes; teachers do not have advantages or disadvantages simply as a result of the students who attend a school and are assigned to a class

Contact Information Christy Hovanetz, Ph.D. Senior Policy Fellow Foundation for Excellence in Education P.O. Box Tallahassee, FL Phone: Website:

Value-Added: Deepening Your Understanding © 2012, Battelle for Kids

Who is Battelle for Kids? A not-for-profit, educational-improvement organization that believes in:
– The Right People: having highly effective educators throughout the system to maximize student opportunities.
– The Right Measures: measuring educator and employee effectiveness.
– The Right Practices: researching and supporting effective educational practices.
– The Right Messages: engaging stakeholders for strategic improvement and managing change.

Technical: Value-Added Self-Assessment. Rate yourself on a scale from 1 (Roadie) through 5 (Back-up Singer) to 10 (Rock Star).

The Power of Two

Opportunity: To provide a clearer understanding of where your strengths and challenges are, allowing you to create more focused improvement efforts. Examples: Your district’s 6th-grade math performance is low, but your schools that are configured K-6 or K-8 show positive value-added results. Your previously high-achieving 7th-grade reading students are not meeting growth predictions.

Opportunity: To evaluate where your curriculum or programs are more or less effective. Examples: An entire grade level in your district demonstrated low value-added in math. Students in your gifted magnet school demonstrated less academic growth than similar students in traditional schools.

Opportunity: To improve your placement of teachers and students. Examples: Your value-added results indicate that four of your least effective math teachers are in the same middle school. A 4th-grade teacher in your building has very high value-added effects in math but very poor value-added effects in reading.

An opportunity: To maximize the impact of your best teachers and principals. Examples: You identify highly effective teachers and share what makes them successful with others. You identify new ways to use highly effective teachers to reach more students, serve high-needs students, or lead the development of other teachers.

An opportunity: To target professional development at the needs of a teacher or group of teachers. Examples: Develop protocols that lead teachers through data inquiry, root-cause analysis, and goal setting. Align professional development to your instructional framework and promote collaboration across the organization.

Deepening Your Understanding: Value-Added Predictions; Control Variables.

Value-Added: Actual minus Predicted. A result can be above prediction, at prediction, or below prediction.

Prediction Predictions are more like magic than science.

Predictor Tests: Pre/Prior/Post. In subjects that are contiguously tested, such as reading and math, it is easy to understand how:
 A prior reading test can be predictive of reading (e.g., 3rd-grade reading predicts 4th-grade reading).
 A prior math test can be predictive of math (e.g., 4th-grade math predicts 5th-grade math).

Prediction – You Try! What about 5th-grade science? As a science teacher in 5th grade, what might you predict about the following students’ performance in your class? Select from: Strong, Average, Struggle. How certain are you? Would more information be helpful?
[Table: reading scores for Valerie, Ann, and Matt, with the science row marked “?”; the score values did not survive in the transcript]

Prediction – Try Again! Let’s expand the information: Select from: Strong, Average, Struggle. Did your answer change for any student? How confident are you?
[Table: reading and math scores for Valerie, Ann, and Matt, with the science row marked “?”; the score values did not survive in the transcript]

Predictors for Non-Contiguous Subjects: Predictor tests must have a strong relationship to the test being analyzed.
[Correlation matrix among SCI-04, SOC-04, MATH-04, READ-04, SCI-05, SOC-05, MATH-05, and READ-05, with 1s on the diagonal; the off-diagonal values did not survive in the transcript]
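The strength of that relationship is typically checked with a correlation coefficient. A sketch with invented scores, showing how one might compare a candidate predictor (e.g., a prior reading test) against the test being analyzed:

```python
# Hedged sketch: Pearson correlation between a candidate predictor test and
# the test being analyzed. All score data below is invented.
def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length score lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

read_04 = [300, 310, 320, 330]  # hypothetical 4th-grade reading scores
sci_05 = [402, 415, 424, 439]   # hypothetical 5th-grade science scores
print(round(pearson(read_04, sci_05), 3))  # close to 1: a strong predictor
```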

Let’s Look at SIMPLE Prediction: A math score of 1425 would be “advanced.” Basic prediction assumes that a student who scored 1425 in Year 1 will score similarly to all other students who scored 1425. If all students who scored 1425 in Year 1 score 1400 in Year 2, the model predicts this student will perform similarly, that is, score lower than the previous year. If the student scores above the prediction, we have value added.
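That simple “lookup” prediction can be sketched directly: a new student’s predicted Year 2 score is the average Year 2 score of past students who had the same Year 1 score. The history below is invented to mirror the slide’s example.

```python
# Hedged sketch of the simple prediction described above.
from collections import defaultdict

def simple_predictions(history):
    """history: (year1_score, year2_score) pairs -> {year1_score: predicted_year2}."""
    buckets = defaultdict(list)
    for y1, y2 in history:
        buckets[y1].append(y2)
    return {y1: sum(v) / len(v) for y1, v in buckets.items()}

history = [(1425, 1400), (1425, 1410), (1425, 1390)]
pred = simple_predictions(history)
print(pred[1425])  # 1400.0: scoring above this, even while below 1425, is value added
```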

The Perfect Students… The top of the scale is a score of 1500. Consider students who scored 1475 in Year 1. What are the chances of these students scoring above 1500 in Year 2? Impossible. At 1500? Possible, but improbable. Below 1500? Possible and probable.

Control Variables: Control variables help make the predictions more accurate and relevant to a student, but they aren’t always perfect.
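What a control variable contributes can be sketched with an invented grouping: measure the mean prediction error within each group, then fold that offset into the prediction so group membership no longer biases the residuals.

```python
# Hedged sketch: a control variable as a per-group offset. The group labels,
# scores, and predictions below are invented for illustration.
def group_offsets(records):
    """records: (group, actual, predicted) -> mean (actual - predicted) per group."""
    errors = {}
    for group, actual, predicted in records:
        errors.setdefault(group, []).append(actual - predicted)
    return {g: sum(v) / len(v) for g, v in errors.items()}

records = [("groupA", 395, 400), ("groupA", 397, 400),
           ("groupB", 404, 400), ("groupB", 404, 400)]
offsets = group_offsets(records)
print(offsets)  # predictions run high for groupA, low for groupB
# Adjusted prediction = predicted + offsets[group]; within-group residuals then
# average to zero, so the control variable has removed the group-level bias.
```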

Checking for Understanding

Example Reporting

Value-Added Reporting (READING): For each subject, the report shows the number of students analyzed (weighted), the value-added estimate with a color code, a confidence interval, the 3-year average value-added results, the district average, and the past academic year’s value-added results.

Grade-Level Results (READING): In the previous slide, this school’s overall ELA was 3.1, at the district average or predicted performance. Is the grade-level performance in ELA consistent with the overall result? What could be contributing to our strengths? Our problems? Develop a hypothesis.

Teacher Effectiveness (READING): Hypothesize and report out a potential problem related to:
1. Planning and Preparation: standards-based activities; student assessment for learning.
3. Instruction: questioning and discussion techniques; engaging students in learning; assessment in instruction to advance student learning.
(Report levels shown: Level 3 and above, Level 2, Level 1.)

Checking for Understanding

Todd Hellman