
Exploring Data Use & School Performance in an Urban School District Kyo Yamashiro, Joan L. Herman, & Kilchan Choi UCLA Graduate School of Education & Information Studies National Center for Research on Evaluation, Standards, and Student Testing (CRESST) CRESST Conference UCLA September 8, 2005

Context & Background
- Large urban school district in the Pacific Northwest
- Value-added assessment system implemented in the district
- Need for more information on schools' use of data (value-added and other)

Data Use & Evidence-based Practice
1) Data use is at the heart of test-based reforms (NCLB) and continuous improvement efforts
2) Little evidence of the effects of data use on performance
3) Some evidence shows limited access and capacity of schools to use data

Study Components
CRESST is conducting a multi-year, multi-faceted study of data use:
- Transformation Plan Review: content analysis of school improvement plans
- Interviews, surveys, and observations from site visits to case study schools
- Analysis of district achievement and survey data
- Observations of school presentations about progress

Sampling
- Latent variable, multilevel analyses used to estimate gains (student-level, longitudinal ITBS data in reading & math)
- Gains based on growth from 3rd to 5th grade for 2 cohorts in each school: 3rd graders in 1998 and 3rd graders in 2001
- Within each cohort, 3 performance subgroups (average, low, high)

Sampling (cont'd)
13 schools met the following criteria:
- Greater than district-average % of low-SES students
- Starting point below the district average
"Beat the Odds" sample (7 schools):
- Higher than average gains
- Relatively more consistent across:
  - 2 cohorts ('98 & '01)
  - reading and math
  - performance subgroups (high, average, low)
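The two-stage screen above can be sketched in code. Everything below is a hypothetical illustration: the school names, district averages, and values are invented, and the study's actual gain estimates came from latent-variable multilevel models rather than the raw means used here.

```python
# Hypothetical illustration of the two-stage school screen described above.
# Each record holds a school's low-SES share, 3rd-grade starting score, and
# mean 3rd-to-5th-grade gain. All numbers are made up.

district = {
    "pct_low_ses": 0.55,   # district-average share of low-SES students
    "start_score": 180.0,  # district-average starting point
    "mean_gain": 12.0,     # district-average gain
}

schools = [
    {"name": "A", "pct_low_ses": 0.70, "start_score": 172.0, "mean_gain": 15.5},
    {"name": "B", "pct_low_ses": 0.62, "start_score": 175.0, "mean_gain": 10.0},
    {"name": "C", "pct_low_ses": 0.40, "start_score": 190.0, "mean_gain": 16.0},
]

def meets_criteria(s):
    """Stage 1: more low-SES students than average, starting point below average."""
    return (s["pct_low_ses"] > district["pct_low_ses"]
            and s["start_score"] < district["start_score"])

def beats_the_odds(s):
    """Stage 2: among eligible schools, gains above the district average."""
    return meets_criteria(s) and s["mean_gain"] > district["mean_gain"]

eligible = [s["name"] for s in schools if meets_criteria(s)]   # ['A', 'B']
beat_odds = [s["name"] for s in schools if beats_the_odds(s)]  # ['A']
```

In the study itself, the consistency conditions (across both cohorts, both subjects, and all three performance subgroups) would be additional filters on top of the gain criterion sketched here.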

Sample
- Extremely diverse set of 13 small elementary schools
- African American student populations between %
- Asian American student populations between %
- White student populations between 5-59%
- Enrollment range: 134 to 533

Transformation Plan Review
TP Review Rubric (ratings of 1 to 3):
- Types of evidence or indicators used (breadth; depth; VA data; technical sophistication)
- Identification of goals/objectives or needs analysis
- Identification of solution strategies (specificity; based on theory/research/data)
- Analysis of progress
- Inclusion of stakeholders
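A minimal sketch of how a rubric like this might be applied programmatically. The dimension names and the 1-3 scale follow the slide; the validation logic, example ratings, and the averaging into an overall score are my assumptions, not the study's scoring procedure.

```python
# Hypothetical application of a 1-3 plan-review rubric. Dimension names
# mirror the slide; scores and the summary rule are assumptions.

RUBRIC_DIMENSIONS = [
    "evidence_types",        # breadth, depth, VA data, technical sophistication
    "goals_needs_analysis",  # identification of goals/objectives or needs
    "solution_strategies",   # specificity; based on theory/research/data
    "progress_analysis",
    "stakeholder_inclusion",
]

def score_plan(ratings):
    """Validate that every dimension got a 1-3 rating; return the mean score."""
    missing = [d for d in RUBRIC_DIMENSIONS if d not in ratings]
    if missing:
        raise ValueError(f"missing ratings for: {missing}")
    for dim, r in ratings.items():
        if r not in (1, 2, 3):
            raise ValueError(f"{dim}: rating must be 1, 2, or 3")
    return sum(ratings.values()) / len(ratings)

example = {
    "evidence_types": 3,
    "goals_needs_analysis": 2,
    "solution_strategies": 2,
    "progress_analysis": 1,
    "stakeholder_inclusion": 3,
}
overall = score_plan(example)  # 2.2
```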

Case Study Site Visits
2-day visits to 4 case study sites:
- Interviews/focus groups with:
  - Principal
  - Building Leadership Team (BLT)
  - Teachers (primary, upper)
- Teacher survey

Additional Achievement Analyses
- Latent Variable Multiple Cohort (LMC) design (with structural equation models)
- Estimating gains on the ITBS based on data across 5 cohorts (1998 to 2002)
- Gains for performance subgroups:
  - Average: students starting at the school's mean initial status
  - High: students starting 15 points above the school's average
  - Low: students starting 15 points below the school's average
- Patterns of growth differ from the 2-cohort analysis
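The subgroup definitions above (school mean, plus or minus 15 points) can be illustrated with a toy linear model. This is not the study's LMC/SEM estimation; it only shows how subgroup gains are read off a fitted relation between a student's initial status and expected gain. The intercept and slope below are made-up coefficients.

```python
# Illustrative only: the study fit latent-variable multilevel models; this
# sketch reads subgroup gains off a simple fitted line. Coefficients invented.

def expected_gain(initial_status, school_mean, intercept=12.0, slope=-0.25):
    """Expected 3rd-to-5th-grade gain as a function of starting point.

    A negative slope means students who start lower show larger expected
    gains; both coefficients here are hypothetical, not study estimates.
    """
    return intercept + slope * (initial_status - school_mean)

school_mean = 180.0
subgroups = {
    "average": school_mean,    # students starting at the school's mean
    "high": school_mean + 15,  # 15 points above the school's average
    "low": school_mean - 15,   # 15 points below the school's average
}
gains = {name: expected_gain(x, school_mean) for name, x in subgroups.items()}
# gains == {"average": 12.0, "high": 8.25, "low": 15.75}
```

Reporting three points on this line per school is what lets the analysis say whether a school's growth reached its low- and high-starting students, not just the average student.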

Results: Achievement
Differences between pre- and post-Transformation Plan reform:
- High/Average: 4 schools showed consistent growth across reading & math and across subgroups
- Low: 6 schools left some subgroups behind in math and/or reading
- Very Low: 3 schools showed no growth or negative gains

Results: Data Use
Data use is improving but still varied:
- Over 3 years, schools increased use of assessment results and other evidence
- Schools increasingly mentioned VA data
Data review process is inclusive when capacity exists:
- Principal often the conduit (filtering, interpreting)
- However, many schools developed collaborative processes for data review
Transformation planning process may become more centralized (less inclusive) in later years

Results: Data Use (cont'd)
Accessible and excessive data:
- Teachers use data for schoolwide reform and (to a lesser degree) instructional planning
- Teachers are overwhelmed by the amount of data
More capacity needed:
- Whether schools integrate data into instructional decisions tended to be person- or climate-driven
- Principals need help, too
More diagnostic, instructionally sensitive data needed:
- State testing data not seen as useful, valid, timely, or interpretable:
  - lack of continuity in tests from grade to grade
  - lack of diagnostic info (item analyses)
  - lack of individual growth info (pre-post)
- District assessments seen as more helpful to instruction

Results: Data Use & Achievement
Pre-Post Gains & Data Use Practices

Results: Data Use & Achievement (cont'd)
- Ratings overlap for 7 of 13 schools
- For the most discrepant case (Polk): high gains but low data use; a school in chaos, with new leadership
- For the remaining 5 moderate discrepancies, no case study data

Conclusions
- Less use of data for instructional planning is probably a function of:
  - type of data provided
  - leadership & climate
  - capacity
- Principals and teacher leaders need more help in interpreting and using data
- Data use and gains appear to have a moderate link for struggling schools; more case study information needed
- Need for more research on how to use value-added (gain) measures in an accountability setting