The VAM Roadshow Note: This presentation will be provided electronically.


Current Legislative Session The current Legislative Session may make changes to SB736 and approve the DOE-provided rules required by SB736. Decisions made will impact some of the information in this PowerPoint. Could possibly be affected: explanations of cut points (slide 12) and FAQ items (slides 21, 26, 28-31). Will most likely not be affected: definitions/explanations of VAM, direct, indirect, and characteristics of VAM (slides 4-11, 13-15), and FAQ information (slides 17-20, 21-22, 25, 27).

What will you learn? Review of the Value Added Model (characteristics, standard error, confidence interval, rating information) How to access student VAM information (only for those with direct VAM) General FAQs and Info you need to know – Rubric changes – Assessments being used or not used – Plan for adding assessments (K-3) – VAM is not CAG friendly, but it can be useful information What if an evaluation is below Effective?

What is VAM? It is different from School Grades and the way we've always used FCAT for things such as learning gains. It is a covariate adjustment model: it uses student-level prior test scores and other measured characteristics to predict student achievement. It compares a student's success to that of other students "like" them to create a predicted score; comparisons are created based on Value Added Characteristics. The student's predicted score is then compared to their actual score. Currently VAM is used for FCAT and Algebra I.
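The predicted-versus-actual idea above can be sketched in a few lines of Python. This is an illustrative toy only, not FLDOE's actual model: the real covariate adjustment model uses many more characteristics and weights estimated from statewide data, whereas the weights and scores below are invented for the example.

```python
# Toy sketch of the covariate-adjustment idea: predict a student's score
# from prior-year scores, then compare the actual score to the prediction.
# The intercept and weights here are made-up illustration values, NOT the
# coefficients of Florida's actual VAM.

def predict_score(prior1, prior2, intercept=50.0, w1=0.6, w2=0.3):
    """Predicted score from two prior-year scores (hypothetical weights)."""
    return intercept + w1 * prior1 + w2 * prior2

def value_added(actual, prior1, prior2):
    """Positive when the student beat the prediction, negative otherwise."""
    return actual - predict_score(prior1, prior2)

# A student who scored 300 after prior-year scores of 280 and 270:
print(round(value_added(300, 280, 270), 1))  # → 1.0 (slightly beat prediction)
```

The teacher-level VAM score is then built by aggregating these student-level differences across a teacher's roster, which is why "which students counted into your VAM" (discussed later) matters.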

Mount FCAT School grade: Where are students on the mountain? Value Added: Where are students on the mountain in relation to other similar students?

Direct Versus Indirect Direct state assessment: an FCAT course where students took FCAT. Indirect: not an FCAT course, but there are students who took FCAT (School FCAT-Reading VAM, District FCAT-Reading VAM).

What characteristics are used to determine student comparisons and a student’s predicted score?

Value Added Characteristics The characteristics with the greatest ability to predict a score are prior-year achievement and achievement from two prior years. Other predicting characteristics follow.

Other characteristics: Number of students in Class 2; Traumatic Brain Injured; Enrolled in 6 or more courses; Other Health Impaired; Gifted Student; Autism Spectrum Disorder; Deaf or Hard of Hearing; Emotional/Behavioral Disability; Enrolled in 3 or more courses; Enrolled in 5 or more class periods; Enrolled in 5 or more courses; Enrolled in 4 or more class periods; Enrolled in 4 or more courses; Enrolled in 3 or more class periods.

Other characteristics: Homogeneity of Class 6 Prior Year Test Scores; Dual-Sensory Impaired; Visually Impaired; Number of students in Class 4; Homogeneity of Class 4 Prior Year Test Scores; Number of students in Class 3; Homogeneity of Class 5 Prior Year Test Scores; Number of students in Class 6; Missing Mobility Data; Enrolled in 6 or more class periods.

Value Added Information Continued VAM scores are not singular (example: "your score is a 1"). VAM scores are actually ranges: an initial range, a standard error, and a confidence interval.
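The range idea can be shown with a small sketch. The point estimate, standard error, and 95% z-multiplier below are hypothetical illustration numbers; the real model's interval construction may differ.

```python
# Minimal sketch of why a VAM score is a range rather than a point:
# the point estimate carries a standard error, which widens into a
# confidence interval. All numbers here are hypothetical examples.

def confidence_interval(vam_score, standard_error, z=1.96):
    """Return (low, high) for an approximate 95% interval (z = 1.96)."""
    margin = z * standard_error
    return (vam_score - margin, vam_score + margin)

low, high = confidence_interval(0.5, 0.4)
print(f"range: {low:.2f} to {high:.2f}")
# If the range crosses zero, the data can't distinguish the students'
# growth from their predicted growth in either direction.
print("crosses zero:", low < 0 < high)
```

This is why the later slide asks whether "any part of your range went into the negative": a wide interval around a modest positive score can still include zero.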

VAM Information [Rating scale graphic: Unsatisfactory, Needs Improvement, Effective, Highly Effective]

Accessing Direct FCAT VAM Information Do you know your High Score? Low score? Do you know your school’s high score and low score? Did any part of your range go into the negative?

Scale Score versus Predicted Score

Accessing the Direct FCAT VAM Information Go to the AIMS Companion site and log in using your usual sign-in credentials. Go to My Appraisals and select the + sign to the far left of the brown box. If you have a direct VAM, you have a section titled something like "VAM Mathematics/Reading." Select the + sign. Down at the bottom there is a link titled "Student Details." You will only have this link if you have a DIRECT VAM (you teach an FCAT course and your students took FCAT).

FAQs and other things YOU need to know!

Important Rubric Adjustment During 10/11 the TASC decided to have different expectations for category 1, 2, 3, and 4 teachers. This was approved by the TASC, the Board, and DOE. "When you know better, you do better." Scenario: – Category 4 teachers with an HE Instructional Practice, an HE IPDP, and an E in Student Growth were rated Effective (2.5 points). A Category 3 or 2 teacher would have had an HE rating with the same number of points. Solution: – For the 12/13 school year, all categories of teachers will be rated the same. – The point range for HE for everyone is 2.40 and above.

Will we be able to use Discovery Education Scores? At this time there is no plan to use Discovery Education scores for Student Growth Discovery Education is used to progress monitor student learning and predict student achievement on FCAT Cannot mix Progress Monitoring and Evaluative Data Varying administration at schools impacts data reliability

When will all courses be given direct assessments? Senate Bill 736 states that by a set school year, all courses offered by the district must have a direct assessment to assess student growth. Where do these direct assessments come from? – State Assessments: FCAT, End-of-Course Assessments, FAA – District-Approved Assessments: Brigance, IB, AICE, Dual Enrollment, AP – Approximately 50% of courses remain uncovered: the DOE-provided Item Bank – Margaret Gamble is working with DOE and CFAC (Central Florida Assessment Coalition) to provide item writers – Concerns: the Item Bank is due to be complete in the first year that performance pay is mandated by law. Who will maintain it, who will pull assessments, who will assess fidelity, who will input? – Resolution: We will maintain current Student Growth measurements and add state assessments and corresponding VAM as we have done with Algebra I. Hope: the Legislature will give us more time.

K-3 Specifically Issues we are grappling with- – Must include Student Growth measure (SB736) – DEA: Cannot be used (Progress Monitoring) – FAIR: Currently not aligned to Common Core and will not be aligned to Common Core until 2014 (Would it be evaluative or progress monitoring) – SAT 10: Not Aligned to Common Core and expensive – Iowa Assessment: Aligned to Common Core and VERY expensive (70K per year) – Item Bank: Not available until 14/15 – Current solution: Maintain School Level VAM decision – The Oversight Committee plans to revisit this very soon.

VAM isn’t CAG Friendly We do not get a list that neatly describes all of the variables and how a teacher’s students do against all the students in the state. We aren’t given a neat matrix that says, “Teacher A, your ESE students did this, your 4th period did this, etc.” We only know whether your students met their predicted score or not, and which students counted into your VAM. We are asking DOE if they can provide the kinds of data people want to know so that we can use it to get better, but right now we are not getting that information.

So how can we use VAM information?

It is useful! For the 1st time ever, like students are being compared to like students.

A more complete picture of student learning

Bay District Schools (*excludes schools with no school grade) Value Added versus Achievement:
– High Growth, High Achievement: 14
– High Growth, Low Achievement: 8
– Low Growth, High Achievement: 3
– Low Growth, Low Achievement: 4
High Achievement = A-B school grade; Low Achievement = C-D school grade. High Growth = Highly Effective or Effective VAM; Low Growth = Needs Improvement VAM.
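The 2x2 grid above can be expressed as a small helper. The quadrant labels follow the slide's definitions; the grade letters and VAM rating strings are assumptions about how the inputs would be encoded, and a rating below Needs Improvement also falls into "Low Growth" here.

```python
# Hypothetical helper mirroring the slide's grid: school grade (A-B vs C-D)
# on one axis, VAM rating on the other. Input encodings are assumptions.

def quadrant(school_grade, vam_rating):
    """Place a school into one of the four growth/achievement cells."""
    high_ach = school_grade in ("A", "B")          # A-B = High Achievement
    high_growth = vam_rating in ("Highly Effective", "Effective")
    ach = "High Achievement" if high_ach else "Low Achievement"
    growth = "High Growth" if high_growth else "Low Growth"
    return f"{growth}, {ach}"

print(quadrant("A", "Effective"))          # High Growth, High Achievement
print(quadrant("C", "Needs Improvement"))  # Low Growth, Low Achievement
```

Crossing the two axes this way is what makes the grid more informative than either measure alone: a C school can still land in the high-growth row, and an A school in the low-growth row.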

What if my overall evaluation falls below effective? Breathe. Don’t panic. Your first few steps should involve checking all of your data in AIMS. Sometimes there’s a computer glitch or something was overlooked, etc. THE BEST TIME TO DO THIS IS BEFORE YOU LEAVE FOR SUMMER BREAK. Go through your instructional practice rating, know what it is, and know that it is accurate before you go!!! What do Option B people check? (IPDP only)

Okay, it’s accurate and below effective. Now what happens? This will affect your pay. According to Florida Statute, only teachers with an overall summative rating of effective or higher are eligible for either a step increase or the longevity bonus. You will receive a letter from me explaining the change in your pay (or lack of change as the case may be).

And then? We are required to remediate teachers with overall ratings of needs improvement and unsatisfactory. You will receive a self-assessment that you and your principal will complete together, indicating mutually agreed-upon areas of need. The professional development team and I will then review those assessments and match needs with existing professional development (at our expense).

What does a plan look like? Each plan is organized by the domains and contains the identified needs and the PD required to address those needs. Plans include a combination of Beacon online classes and TDY days (paid for by the district) with the services of Staff Training Specialists who work with teachers on an individual basis. Where needs are common, we hold after-school workshops to work with small focus groups.

This is embarrassing. Who will find out? Obviously your principal will know, as will the Director of HR. The Staff Training Specialists assigned to work with you will also “know” but must keep the information confidential as part of their job descriptions. STSs work with teachers across the district, so no one will know why you are working with one other than that you have chosen to do so. Note: It is not public record for that school year. However, according to SB 736, the 11/12 evaluation is public record at the end of the following year (one year). What can be provided is still up for debate and we are working on clarification (just the summative rating, OR the summative pieces: IPDP, Instructional, and Student Growth).

Want more information? VAM presentation – Project 8 link – Go to Florida Value Added and Student Growth. DOE VAM information – addedModel.pdf – White-Paper.doc

Contact Information
Sharon Michalik – Executive Director for Human Resources
Dawn Capes – Instructional Specialist for Race to the Top
Leon Faircloth – Project Manager for LIIS