Measured Progress ©2012 Student Growth in the Non-Tested Subjects and Grades: Options for Teacher Evaluators Elena Diaz-Bilello, Center for Assessment.


Student Growth in the Non-Tested Subjects and Grades: Options for Teacher Evaluators
Elena Diaz-Bilello, Center for Assessment; Michael Cohen, Denver Public Schools; Ruth Chung Wei, Stanford University; Scott Marion, Center for Assessment; Stuart Kahl, Measured Progress
NCSA, New Orleans, June 25, 2014

Alternative Assessment Strategies for Evaluating Teaching Effectiveness (AASETE)
Stuart Kahl, Measured Progress, Inc.
NCSA, New Orleans

The AASETE objective was to design a research-based system for using performance assessments, along with other instruments, to measure student academic growth, which in turn could be combined with other measures of teaching effectiveness to evaluate teachers in non-tested subjects and grades.

The Research
- More than 12,000 students in three states
- Approximately 250 teachers
- More than 600 classrooms
- 18 subject/grade-level/state combinations
- Pre- and post-testing
- Prior achievement data (test scores and grades)
- Comparison of growth models

Major Findings
- Moderate to high correlations among results for different growth models applied to data from the same (or equated) tests
- Quite variable correlations among indicators based on the same model but different end-of-year tests (state vs. AASETE), or based on the same model applied to different test components (multiple-choice vs. performance)

Findings (continued)
- Only slightly higher correlations among indicators with sophisticated scaling of student scores as opposed to raw (or linearly transformed) student scores
- Moderate to high correlations among indicators based on simple growth or simple prediction models and those based on more sophisticated models

Features of the AASETE-Recommended Approach
- Common end-of-course (or interim) assessments across teachers, schools, and districts
- Multiple assessment components, including performance
- Less sophisticated analyses, easily run with commonly used software packages such as Excel

Features (continued)
- A simple prediction model: subtract each student's predicted score on a common end-of-course measure from the actual score on that measure, then aggregate (average) the differences at the teacher level
- Human judgment in deciding whether the student growth for a particular teacher is adequate, given the unique characteristics of the teacher's students, other contextual factors of the teacher's situation, and the teacher's previous growth indicators

A System for Using Student Academic Growth in the Evaluation of Teaching Effectiveness in the Non-Tested Subjects and Grades
A Guide for Education Policy Makers and Evaluators of Teachers
Measured Progress, Inc., May 2014
This document was prepared by Measured Progress, Inc. with funding from the Bill and Melinda Gates Foundation, Grant No. OPP. The content of the publication does not necessarily reflect the views of the Foundation.

Table of Contents
Preface ... 3
Acknowledgments ... 5

Contents (continued)
Student Academic Growth and Teacher Evaluations ... 6
The Problem in the Non-Tested Subjects and Grades ... 6
What Are the Options for the Non-Tested Subjects and Grades? ... 7
Test-Based Value-Added/Growth Indicators ... 7
Student Learning Objectives (SLOs) ... 8
Comparisons of Approaches ... 9
Human Judgment and Multiple Measures ... 12
Interpreting Normative Data ... 12
Weighing Multiple Measures ... 13

Contents (continued)
The Recommended "Simple Regression" Approach ... 15
Common Assessments ... 15
Why Not Simple Pre-Post Growth? ... 16
Simple Prediction/Regression ... 16
Numbers of Students and Teachers ... 18
Outcome or End-of-Course Measures ... 19
More on Performance Components ... 21

Contents (continued)
Predictor Variables ... 22
Predictors in General ... 22
Predictors to Use ... 23
Associated Analyses and Checks ... 23
Some Final Words on the Generation, Interpretation, and Use of Value-Added/Growth Statistics ... 25
More on the Proposed Method ... 25
How Much Work Is It? ... 26

Contents (continued)
Appendix A: Overview and Recommendations of AASETE Study ... 28
Appendix B: Instructions for Performing Regression-Based Growth Analysis at the Teacher Level Using Excel ... 31


How Much Work Is It?
- End-of-course testing already happening
- Data management systems in place with data
- Capability to access and use data
- One new Excel analysis to learn
The challenges:
- Policy
- Human judgment
- District assessment program

Measured Progress, P.O. Box 1217, Dover, NH | Web: measuredprogress.org | Office:
It's all about student learning. Period.
Thank you!