
EE Coaches August 2015

Welcome! Introductions –Who you are –Where you are from –An EE success from last year

Warm-up Read the two-page 'The Change Process' document. Silently and on your own, reflect: –How have you seen these stages play out with EE? –What stage would you gauge your school at today? –How can you support forward movement?

80% of teachers disagree with the statement that the WI Educator Effectiveness System gives them the tools to improve their practice.

Why are you called an Educator Effectiveness Coach and not an EE Expert or Guru or Smarty Pants?

Page 1

Beginning of Year Working collaboratively with their evaluator or a peer, educators draw upon the SLO and Outcome Summary Process Guide (see page 2) to develop a minimum of one SLO. The development of the SLO must now include a review of teacher and principal value-added, as well as graduation rates or schoolwide reading value-added (as appropriate to the role of the educator). Educators continue to document the goal within the appropriate online data management system (e.g., Teachscape or MyLearningPlan). Collaborative, learning-focused conversations are required as part of the process, and educators have flexibility in whom they collaborate with during Supporting Years. In Summary Years, however, educators must conduct this process with their evaluators.

Page 2

What is different from last year?

Page 3

TEACHERS Teacher Value-Added and Schoolwide Reading: When developing SLOs, teachers must review value-added data individually, as well as with teacher teams at the grade level and across the content area (e.g., schoolwide reading value-added), to identify trends (i.e., strengths and areas for growth) across time. These trends can inform SLOs or professional practice goals, based on areas of need. Working in teams with other teachers could inform the development of a team SLO that aligns to a School Learning Objective identified by the principal. Value-added trends may also illuminate strategies that have worked well, based on areas of strength, and can support ongoing instructional efforts. Working in teams with other teachers also provides an opportunity to share best practices and successful strategies that support school improvement plans and/or goals. Let's walk through this…

Graduation Rate: When developing SLOs, high school teachers must review graduation rate data across time to identify positive or negative trends regarding the matriculation of their school's students. During this review, teachers should reflect on how their practice has supported the trends within the graduation rate data. Teachers should also examine the data in vertical and horizontal teams to identify school (and district) practices that positively or negatively impact graduation rates. This analysis can inform the development of SLOs, as well as professional practice goals, to support the improvement of graduation rates of the educator's students. This review can also illuminate the success of various college and career ready strategies implemented by teachers and across the school, which can then be modified or duplicated.

Educators are not required to develop a goal based on these data, or to develop a goal with the intention of improving these data, unless the data indicate that is necessary. As always, the purpose of the Educator Effectiveness System is to provide information that is meaningful and supports each individual educator's growth in their unique roles and contexts. By reviewing multiple data points, including those listed above, the educator has access to a more comprehensive view of their practice and a greater ability to identify areas of strength and need—both of which can inform the development of goals, as well as instructional/leadership strategies that can support progress towards goals. Note: Because DPI provides these data to districts with a lag (i.e., in the following year), educators should only use the data to review trends across time when developing an SLO. Educators should not use the data to score SLOs.

Turn and Talk: What do you know about value-added data?

There are two general ways to look at student assessment data: –Attainment model: a "point in time" measure of student proficiency that compares the measured proficiency rate with a predefined proficiency goal. –Growth model: measures the average gain in student scores from one year to the next, and so accounts for the prior knowledge of students.
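The contrast between the two models can be sketched numerically. This is an illustrative Python sketch with made-up scale scores and a hypothetical proficiency cut score, not actual WKCE data:

```python
# Hypothetical scale scores for the same five students in two consecutive years.
last_year = [2380, 2400, 2425, 2390, 2410]
this_year = [2410, 2422, 2450, 2421, 2432]
proficiency_cut = 2420  # assumed proficiency threshold, for illustration only

# Attainment model: point-in-time share of students at or above the cut.
attainment = sum(s >= proficiency_cut for s in this_year) / len(this_year)

# Growth model: average gain from one year to the next,
# which accounts for where each student started.
growth = sum(b - a for a, b in zip(last_year, this_year)) / len(this_year)

print(f"Attainment: {attainment:.0%} proficient")           # Attainment: 80% proficient
print(f"Growth: {growth:.1f} scale-score points on average") # Growth: 26.0 points
```

Note that the attainment figure says nothing about where students started, while the growth figure says nothing about whether students reached the proficiency goal—which is why the slides treat them as complementary views.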

What is Value-Added? It is a type of growth model that measures the contribution of schooling to student performance on the WKCE in reading and in mathematics. It uses statistical techniques to separate the impact of schooling from other factors that may influence growth, and it focuses on how much students improve on the WKCE (or our new assessment) from one year to the next, as measured in scale score points.

Teachers –Last Year: Educator Practice (based on Danielson or Stronge); Student Outcomes: 5% Value-Added, 95% SLO –This Year: Educator Practice (based on Danielson or Stronge); Student Outcomes: SLO only

Principals –Last Year: Educator Practice (based on WI Principal Rubric or Stronge); Student Outcomes: 45% Principal Value-Added, 50% SLO –This Year: Educator Practice (based on WI Principal Rubric or Stronge); Student Outcomes: SLO only

A Clearer Data Picture Many data pieces give us a fuller picture: STAR, WKCE or Badger, AIMSweb, ACT, WorkKeys, Classroom Assessments, AP, Surveys, Aspire, Observation Data, PALS, VA Data. Why would we care about Value-Added data?

VA allows for fairer growth comparisons to be made (in contrast to pure achievement). Example: School A has 90% proficiency with 6% of students on free and reduced lunch; School B has 86% proficiency with 90% of students on free and reduced lunch.

VA allows for fairer growth comparisons to be made (in contrast to pure growth). We know that in Wisconsin, certain groups of students do not grow (or achieve) at the same rate as others. This can be due to the achievement level of a child (the lowest-performing students can grow the most). It can also be related to demographics such as –Special Ed status –ELL –Race/ethnicity –Economically Disadvantaged –etc.

Hi! I'm a 4th grade boy. I got a scale score of 2418 on my WKCE in reading this year! And these are all the other boys in WI who had the exact same scale score as me.

Now I'm in 5th grade and just got a scale score of 2449 on my reading WKCE! I grew 31 points. All of the other boys took the test again, too. Their average growth was 25 points.

So we would say that my teachers in 4th grade had a higher Value-Added than would be expected: I grew 31 points, while the average growth of comparable students was 25 points.
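The arithmetic behind this example can be written out directly, using the numbers from the slides (2418 → 2449, against a 25-point average gain for the comparison group):

```python
# Numbers from the slide's example; the comparison group is boys who had
# the same 4th-grade reading scale score.
my_growth = 2449 - 2418           # this student grew 31 points
peer_average_growth = 25          # comparable students grew 25 points on average

# Growth beyond what comparable students achieved is the part
# attributed to schooling.
value_added = my_growth - peer_average_growth

print(my_growth, value_added)     # 31 points of growth, 6 points above expectation
```

The key idea is that the student is compared only against students who started from the same place, so the 6-point difference reflects something other than prior achievement.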

Using the same process, VA controls for factors outside the school's influence: –Race/Ethnicity –Gender –Section 504 –Economic Status –Disability (by type) –Prior Year Score (reading and math) –English Proficiency (by category level) –Mobility
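The "same process" can be illustrated with a toy regression. This is a minimal sketch of the general idea (regress current scores on prior scores and factors outside the school's influence, then average the unexplained growth by school), not the actual DPI/VARC model specification; all data and effect sizes here are simulated:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 400
prior = rng.normal(2400, 30, n)        # prior-year scale scores (simulated)
econ = rng.integers(0, 2, n)           # economic-status indicator (simulated)
school = rng.integers(0, 2, n)         # 0 = School A, 1 = School B
true_effect = np.where(school == 1, 6.0, 0.0)  # School B truly adds 6 points

# Simulated current scores: prior knowledge + demographics + school + noise.
current = 25 + 1.0 * prior - 3.0 * econ + true_effect + rng.normal(0, 5, n)

# Predict expected scores from factors outside the school's influence only.
X = np.column_stack([np.ones(n), prior, econ])
coef, *_ = np.linalg.lstsq(X, current, rcond=None)
residuals = current - X @ coef

# Value-added estimate per school = average unexplained growth.
va_a = residuals[school == 0].mean()
va_b = residuals[school == 1].mean()
print(round(va_b - va_a, 1))  # close to the true 6-point difference
```

Because prior score and economic status are controlled for, the remaining gap between the two schools' average residuals recovers (approximately) the schooling effect rather than differences in who enrolls.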

How do they decide what to control for? Check 1: Is this factor outside the school or teacher’s influence? Check 2: Do we have reliable data? Check 3: If not, can we pick up the effect by proxy? Check 4: Does it increase the predictive power of the model?

Checking for Understanding What would you tell a 5th grade teacher who said they wanted to include the following in the Value-Added model for their results? A. 5th grade reading curriculum B. Their students' attendance during 5th grade C. Education level of the parents D. Student motivation Check 1: Is this factor outside the school or teacher's influence? Check 2: Do we have reliable data? Check 3: If not, can we pick up the effect by proxy? Check 4: Does it increase the predictive power of the model?

Reporting Value-Added In the latest generation of Value-Added reports, estimates are color coded based on statistical significance. This represents how confident we are about the effect of schools and teachers on student academic growth. Green and Blue results are areas of relative strength. Student growth is above average. Gray results are on track. In these areas, there was not enough data available to differentiate this result from average. Yellow and Red results are areas of relative weakness. Student growth is below average.

Value-Added is displayed on a 1-5 scale for reporting purposes. About 95% of estimates will fall between 1 and 5 on the scale. Most results will be clustered around 3.0, which represents meeting predicted growth for your students. Since predictions are based on the actual performance of students in your state, 3.0 also represents the state average growth for students similar to yours. Numbers lower than 3.0 represent growth that did not meet prediction. Students are still learning, but at a rate slower than predicted. Numbers higher than 3.0 represent growth that beat prediction. Students are learning at a rate faster than predicted.
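As a hypothetical illustration (the exact VARC scaling formula is not given on the slide), one way such a 1-5 scale can work is to center standardized estimates at 3.0, so that roughly ±2 standard deviations spans 1 to 5—matching the slide's "about 95% of estimates fall between 1 and 5":

```python
def to_report_scale(estimate_in_sds):
    """Map a value-added estimate, expressed in standard deviations from
    predicted growth, onto the 1-5 reporting scale (3.0 = predicted growth).
    This mapping is an assumption for illustration, not the official formula."""
    return 3.0 + estimate_in_sds

print(to_report_scale(0.0))    # average growth        -> 3.0
print(to_report_scale(1.5))    # above prediction      -> 4.5
print(to_report_scale(-2.0))   # well below prediction -> 1.0
```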

Grade 4 READING, 95% Confidence Interval (30 students): Value-Added estimates are provided with a confidence interval. Based on the data available for these thirty 4th Grade Reading students, we are 95% confident that the true Value-Added lies between the endpoints of this confidence interval (between 3.2 and 4.4 in this example), with the most likely estimate at the midpoint.
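The interval on this slide is consistent with a standard normal-approximation confidence interval. A sketch, assuming a point estimate of 3.8 and a standard error of about 0.3 (both inferred from the 3.2-4.4 endpoints; neither is stated in the report):

```python
estimate = 3.8      # midpoint of the reported interval (assumed)
std_error = 0.3     # implied by the interval width (assumed)

# 95% CI = estimate +/- 1.96 standard errors
low = estimate - 1.96 * std_error
high = estimate + 1.96 * std_error

print(f"95% CI: {low:.1f} to {high:.1f}")  # 95% CI: 3.2 to 4.4
```

With more students, the standard error shrinks and the interval tightens around the estimate, which is why the slides emphasize how many students sit behind each result.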

Confidence Intervals [Chart: Grade 4 Reading and Math Value-Added estimates shown with their confidence intervals] Color coding is based on the location of the confidence interval. The more student data available for analysis, the more confident we can be that growth trends were caused by the teacher or school (rather than random events).

LET'S LOOK AT THE VALUE-ADDED REPORTS. These are housed in the School Access File Exchange (SAFE).

We begin with some caveats! VA is one data source among many that provides a different perspective on student growth. VA should never be the sole data source used to identify effective or ineffective schooling! Taking VA out of the Student Outcome score allows each educator to decide how (or whether) this data informs the SLO process.

Page 1 Introduction to VA Color Coding

Page 2 With a partner: How is this school growing students in reading and in math? How is this school growing students across grades?

Share your thinking

Pages 3 & 4 With a partner: How is this school growing subgroups in reading? In math?

Share your thinking

Page 5 Introduction to VA Scatter Plots

With a partner: What does this data tell you? How does this school compare to others in the state?

Grade Level VA & Achievement Plots With a partner: What does this data tell you? How might a grade level use this data to inform the focus of their SLO? Pages 6 & 7

Share your thinking

How do/don’t these reports add to our total data picture?

What are your take-aways about VA Data?

Peer Review is required for those in Supporting Years!

Feedback requested…

Time for Technology MLP and Teachscape updates…

Our Next Meeting… December 3rd and April 7th

Closing… What are you taking from today?