1
Multiple Ways to Measure Student Growth
A West Virginia Perspective: Tales from a State Transitioning to CCSS and SBAC
Juan M. D’Brot, Executive Director
Office of Assessment and Accountability, West Virginia Department of Education
2
Overview
– WV’s Context
– The Need for Growth Data
– Potential and Expected Uses of Growth
– Policy and Practice Implications
– (Early) Lessons Learned
3
Setting the Context: West Virginia…
– Is a highly centralized system
– Has newly articulated (and more rigorous):
  – Standards
  – Assessments
  – Cut Scores
– Uses externally benchmarked standards (NAEP) to define cuts
– Faces challenges around AYP in the face of new Annual Measurable Objectives (AMOs)
4
Setting the Context: The Order of Events
Revisions to the Standards and Assessment System
– Standards: (pre) SY 2008-2009
– Instruction: ongoing
– Assessment: SY 2008-2009
– Cut Scores: SY 2009-2010
– AMOs: SY 2010-2011…
7
[Chart annotation: ~ +6% per year]
10
So Why Growth? Why Now?
– WV is a Governing State in the SMARTER Balanced Assessment Consortium
  – Can’t we wait?
– A culture of accountability has shifted focus to school performance
– Student growth percentiles (SGPs) provide an opportunity to shift focus back to the student
  – A dichotomous distinction (both school and student) requires more granular articulation
  – Increasing the sensitivity of the accountability system
11
How do we communicate the value of continuous improvement, and not just AYP?
– Accountability isn’t going away
– The insensitivity of AYP
– AYP shouldn’t be the driver of the conversation
– AYP is the outcome; improvement is the process
– Sensitivity requires answering three questions (see the sketch below):
  1. Am I proficient?
  2. How much did I grow?
  3. Did I grow enough?
– Ultimately, we want to change conversations about student data
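A minimal illustrative sketch of how a status measure and a growth measure can answer these three questions. The function, cut score, and SGP values below are hypothetical placeholders, not actual WV or SBAC figures:

```python
# Illustrative only: combining a status measure and a growth measure
# to answer the three questions for one student. All numeric values
# are hypothetical placeholders, not actual WV/SBAC values.

def summarize_growth(scale_score, proficiency_cut, sgp, adequate_growth_sgp):
    """Answer the three questions from one student's assessment data.

    scale_score         -- current-year scale score
    proficiency_cut     -- scale score required to be labeled proficient
    sgp                 -- student growth percentile (1-99) relative to academic peers
    adequate_growth_sgp -- growth percentile needed to reach or maintain proficiency
    """
    return {
        "Am I proficient?": scale_score >= proficiency_cut,
        "How much did I grow?": f"SGP of {sgp}: higher growth than about {sgp}% of academic peers",
        "Did I grow enough?": sgp >= adequate_growth_sgp,
    }

# Hypothetical example
print(summarize_growth(scale_score=2410, proficiency_cut=2432, sgp=65, adequate_growth_sgp=55))
```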
12
But how? What do we need?
1. Infrastructure
2. Communication Plan
3. Technical Assistance
13
But how? What do we need?
1. Infrastructure
   – Only requires scale scores (SBAC); see the illustrative sketch below
   – Embed data into the student information system
2. Communication Plan
   – Marketing (not advertising!) blitz…timeline?
   – Disseminating data and results…timeline?
3. Technical Assistance
   – What these data mean
   – (INDIRECT) accountability implications
   – Potential for data to inform instructional, resource-allocation, and other data-driven decisions
   – Learning from states like Colorado
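As a rough sketch of why only scale scores are needed, the example below estimates growth percentiles by fitting quantile regressions of current-year scores on prior-year scores. This is a simplified, linear version of the methodology states such as Colorado use operationally (which typically fits B-spline quantile regressions over multiple prior years, e.g., via the R SGP package); the data and the sgp() helper here are hypothetical.

```python
# Simplified student growth percentile (SGP) sketch: quantile regression of
# current-year scale scores on prior-year scale scores. Illustrative only.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical longitudinal scale-score data (one row per student)
rng = np.random.default_rng(0)
prior = rng.normal(500, 50, 2000)
current = prior + rng.normal(20, 30, 2000)
df = pd.DataFrame({"prior": prior, "current": current})

# Fit one conditional-quantile model per percentile 1..99
quantiles = np.arange(0.01, 1.00, 0.01)
fits = {q: smf.quantreg("current ~ prior", df).fit(q=q) for q in quantiles}

def sgp(prior_score, current_score):
    """Highest growth percentile whose predicted score the student met or exceeded."""
    row = pd.DataFrame({"prior": [prior_score]})
    predicted = np.array([np.asarray(fits[q].predict(row))[0] for q in quantiles])
    met = np.where(current_score >= predicted)[0]
    return int(round(quantiles[met[-1]] * 100)) if met.size else 1

# Hypothetical student: grew more than roughly this percentage of similar-scoring peers
print(sgp(prior_score=480, current_score=520))
```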
14
Potential Uses in WV
– School Improvement
– Educator Effectiveness
– School Accountability
– Informing Instructional Decisions
– Program Evaluation and Research
– Informing Stakeholders
15
Expected Short-Term Uses: Policy vs. Practice
Policy
– Not for the accountability system…yet (had been considered)
– Embedding into the revised teacher evaluation system
– 1003(g) SIG and SFSF applicability
Practice
– To drive instructional, student-focused goal setting
– To provide increased transparency for parents, teachers, administrators, and students
16
Immediate Lessons Learned
– Concerted consideration from the ground up (3 administrations)
– Communication
– Buy-in
– Information dissemination
– Timeline revision: https://sites.google.com/a/wvde.k12.wv.us/oaar-file-cabinet/research (see “WV Growth Model”)
17
Questions?
18
Thank You
Comments? Criticism? Suggestions?
Juan D’Brot (jdbrot@access.k12.wv.us)
Executive Director, Office of Assessment and Accountability