Aligning Assessments to Monitor Growth in Math Achievement: A Validity Study
Jack B. Monpas-Huber, Ph.D., Director of Assessment & Student Information
Washington State Assessment Conference, December 2008

Core Themes
1. Technical quality of locally developed district assessments as an outcome measure for program evaluation
   - Growth modeling in HLM
2. Fidelity of local district instrumentation to state instrumentation
   - Sampling the same domain
   - Built from the same maps and item specs
   - Measuring the same thing?

Evaluating a Program with Growth Curves
Possible questions:
1. What does growth in math look like over the course of one year? Do the data match our expectations and theory of action?
2. Why do some students have steeper growth curves than others?
3. What causes growth? Does math coaching explain variation in the slopes of students' growth curves?
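The second question, why some students have steeper growth curves than others, comes down to estimating a growth slope for each student from that student's repeated measurements. A minimal sketch of the idea using ordinary least squares per student; the scores, slopes, and five-wave design here are simulated values for illustration, not figures from the study:

```python
import numpy as np

rng = np.random.default_rng(42)
occasions = np.arange(5)  # five assessment waves, assumed equally spaced

# Simulate three students with different true growth rates (scale points per wave)
true_slopes = [2.0, 4.0, 6.0]
fitted_slopes = []
for true_slope in true_slopes:
    # Trajectory: start near a scale score of 400, grow linearly, add measurement noise
    scores = 400 + true_slope * occasions + rng.normal(0, 1, size=occasions.size)
    slope, intercept = np.polyfit(occasions, scores, deg=1)
    fitted_slopes.append(slope)

# Students with steeper fitted slopes are growing faster. In a full HLM growth
# model, these individual slopes become outcomes that predictors such as math
# coaching can explain.
```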

Background of this Project
Issue #1: WASL math achievement seems to decline after 3rd grade
Working explanations:
1. A reflection of curriculum and instruction?
2. An artifact of WASL standard setting?

Background of this Project
Issue #2: District assessments ready for action

Written Curriculum (STANDARDS) → Taught Curriculum → Tested Curriculum (ASSESSMENT of Learning)

EVOLVING PURPOSES (AND VALIDATION) OF ASSESSMENTS
- Monitor learning across the system
- Inferences about fidelity to curriculum and instructional effectiveness
- Curriculum development, professional development
- Prediction of WASL performance

Design for Exploratory Study
What's happening to our students from Grade 5 to Grade 7?
- Longitudinal: follow the same students from Grade 5 to Grade 7
- Stratified: approximately 10 students per classroom
- Repeated measurements (10 waves): What can we learn from these data sets?
  – 2007 Grade 4 WASL
  – Fall SASL
  – Winter SASL
  – 5 end-of-unit district assessments
  – 2008 Grade 5 WASL

Research Questions
What can we validly infer from WASL-like district benchmark assessments about...
1. Fidelity to curriculum: Are teachers teaching the district curriculum?
2. Rigor of curriculum: Is it rigorous enough? Are we preparing our kids adequately?
3. Growth toward state standard: Are our district assessments showing us true achievement of state standards?

Results: Performance on District Assessments

Research Questions
Further questions about growth toward the state standard:
- Are our district assessments showing us true achievement of state standards?
- Are our district assessments aligned to WASL and measuring the same thing?
- How can I "link" scores from district assessments to WASL so that they all sit on the same scale?
- Could difficulty parameters for local items help teachers see the relative difficulty of different skills?
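The simplest answer to the linking question is single-group linear equating: map district-assessment scores onto the WASL scale by matching the mean and standard deviation of the two score distributions for the same students. A minimal sketch; the score values below are invented for illustration, not data from the study:

```python
import numpy as np

def linear_equate(x_scores, y_scores):
    """Return a function mapping form-X scores onto the form-Y scale by
    matching means and standard deviations (single-group linear equating)."""
    mx, sx = np.mean(x_scores), np.std(x_scores)
    my, sy = np.mean(y_scores), np.std(y_scores)
    return lambda x: my + (sy / sx) * (x - mx)

# Hypothetical scores for the same five students on both instruments
district = np.array([12, 15, 18, 20, 25], dtype=float)
wasl = np.array([380, 395, 405, 415, 440], dtype=float)

to_wasl_scale = linear_equate(district, wasl)
# A district score at the district mean (18) maps to the WASL mean (407)
```

Equipercentile equating replaces this linear map with one that matches the full cumulative score distributions, which matters when the two score distributions differ in shape.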

Linking District Assessments to WASL
Equating, scaling, and linking literature:
- Linear equating
- Equipercentile equating
- Item response theory (IRT)

The Rasch model (Winsteps) places item difficulties and person abilities on the same scale:
- "Common person" single-group design
- Concurrent calibration
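The appeal of the Rasch model here is that the probability of a correct response depends only on the difference between person ability θ and item difficulty b, so both land on one logit scale. A minimal numpy sketch that recovers a known item difficulty from simulated responses; all values are hypothetical, and a real calibration would use Winsteps or similar software rather than this toy estimator:

```python
import numpy as np

def rasch_prob(theta, b):
    """P(correct) under the Rasch model: logistic in (theta - b)."""
    return 1.0 / (1.0 + np.exp(-(theta - b)))

rng = np.random.default_rng(7)
theta = rng.normal(0.0, 1.0, size=2000)   # simulated person abilities (logits)
true_b = 0.7                              # known item difficulty (logits)
responses = rng.random(theta.size) < rasch_prob(theta, true_b)

# Newton-Raphson for the item difficulty, holding abilities fixed at their
# simulated values (a real joint calibration estimates both sets of parameters)
b_hat = 0.0
for _ in range(25):
    p = rasch_prob(theta, b_hat)
    gradient = np.sum(p - responses)       # d(log-likelihood)/db
    curvature = -np.sum(p * (1.0 - p))     # d2(log-likelihood)/db2
    b_hat -= gradient / curvature
# b_hat should now sit close to true_b on the shared logit scale
```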

Results: Tentative Growth Curves
[Chart: tentative growth curves for a random sample of 5 students across the Grade 4 WASL, Fall SASL, and Unit 2–4 assessments]

Two Challenges
Reliability
- A function of test length and redundancy of items
- District assessments are fairly short
- HLM raised doubts about the reliability of the growth curves
Dimensionality
- The Rasch model assumes one dimension
- Each district assessment measures somewhat different content
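The test-length point can be made concrete with the Spearman-Brown prophecy formula, which predicts reliability when a test is lengthened by a factor k with comparable items. A short illustration; the 0.60 starting reliability is an invented value, not a figure from the study:

```python
def spearman_brown(reliability, k):
    """Predicted reliability after lengthening a test by factor k
    with comparable items (Spearman-Brown prophecy formula)."""
    return (k * reliability) / (1.0 + (k - 1.0) * reliability)

# A short district assessment with reliability 0.60 (hypothetical):
# doubling its length would be expected to raise reliability to 0.75.
doubled = spearman_brown(0.60, 2)  # -> 0.75
```

The flip side is the practical constraint named above: fairly short district assessments sit low on this curve, which is what fed the HLM doubts about growth-curve reliability.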

Broader Issues for District Assessments
Locally developed WASL-like district benchmark assessments
- Advantages: local capacity building, content validity
- Challenges: time and expense of development, technical quality, scaling and equating
Outside instruments, e.g., Measures of Academic Progress (MAP)
- Advantages: technical quality, growth scale
- Challenge: content validity