How Do We Know When They’ve Learned It? Guidance for Development of Common Assessments
Jack B. Monpas-Huber, Ph.D., Director of Assessment and Student Information



Features of a quality common assessment

1. The purpose of the common assessment is clear. Formative or summative? Where in the learning process does it fall?
2. What the common assessment is intended to measure is clear. What are the power standards, or learning targets? Develop a test map.
3. The instrument gathers the right kind of data for the learning target. Knowledge, or process skill? Selected response, or performance assessment?
4. The instrument gathers the same kind of data in a consistent way. For a performance assessment, develop a common rubric and agree on its application. The rubric needs to guide the scores, not teacher autonomy or preference.
5. The instrument gathers enough data to provide sufficient evidence of learning, not chance. A rule of thumb: three tasks per target.
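Feature 4 above calls for a common rubric applied consistently across scorers. One simple way to check that consistency is an exact-agreement rate between two raters scoring the same set of performances. A minimal sketch, in which the teacher names and rubric scores are hypothetical:

```python
def exact_agreement(rater_a, rater_b):
    """Fraction of performances on which two raters assign the same rubric score."""
    if len(rater_a) != len(rater_b):
        raise ValueError("Raters must score the same set of performances")
    matches = sum(1 for a, b in zip(rater_a, rater_b) if a == b)
    return matches / len(rater_a)

# Hypothetical rubric scores (1-4) from two teachers scoring the same ten papers
teacher_1 = [3, 2, 4, 1, 3, 3, 2, 4, 2, 3]
teacher_2 = [3, 2, 3, 1, 3, 2, 2, 4, 2, 3]

print(exact_agreement(teacher_1, teacher_2))  # 0.8 -> identical scores on 8 of 10 papers
```

A low agreement rate is a signal that the rubric, not teacher preference, is not yet guiding the scores, and that the team needs to recalibrate on anchor papers.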

Developing common assessments: steps in the development process

1. Define power standards. What are the big ideas that we expect students to learn in this period of time?
2. Develop an assessment (test) map. How will we measure the power standards? What kinds of items are appropriate? How large or long should the assessment be? How many tasks do we need to adequately measure mastery?
3. Develop (or populate with existing) items, tasks, and scoring rubrics. What do you have already? Which standards do those materials measure?
4. Review and pilot. Are we really measuring what we say we’re measuring? Is anything confusing, ambiguously worded, or biased?
5. Set standards. What counts as proficiency?

Developing common assessments: the importance of the “test map”

1. It lays out a plan for assessing the expectations.
2. It connects expectations to instruments.
3. It ensures that the assessment covers what was taught and expected.
4. It builds consistency into assessment practice.

[Example shown on slide: a test map from 7th grade math.]
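As a sketch of what such a map might contain, a test map can be as simple as a table pairing each power standard with the item format and number of items planned to measure it. The standard codes and counts below are hypothetical, standing in for the 7th grade math example on the slide:

```python
# A hypothetical test map: each power standard is paired with
# the item format and the number of items/tasks planned to measure it.
test_map = [
    {"standard": "7.RP.A proportional relationships", "format": "selected response", "items": 4},
    {"standard": "7.EE.B multi-step problems",        "format": "selected response", "items": 4},
    {"standard": "7.G.B area, volume, surface area",  "format": "performance task",  "items": 3},
]

total_items = sum(row["items"] for row in test_map)
print(f"Total items/tasks planned: {total_items}")

# Flag any standard measured by fewer than three tasks (the rule of
# thumb for gathering enough evidence of learning, not chance).
for row in test_map:
    if row["items"] < 3:
        print(f"Warning: {row['standard']} has only {row['items']} items")
```

Laying the plan out this way makes gaps visible before item writing begins: a standard with no row, or too few tasks, is caught at the planning stage.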

Standard setting

Standard setting is a judgmental “process of establishing cut scores on examinations” (Cizek, p. 225). It is not a “search for a knowable boundary that exists a priori between categories, with the task of standard setting participants simply to discover it” (Cizek, p. 227). Standards must be set because decisions must be made on some basis.

For established common assessments, the recommendation is some variation of the “bookmark method”:
1. Items/tasks are (re)ordered by difficulty, based on difficulty data.
2. Judges place a bookmark where they believe the cutoff for proficiency should be.
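A minimal sketch of how a cut score might be derived once items are ordered by difficulty and each judge has placed a bookmark. Here the group cut is simply the median bookmark position, read as the number of ordered items a just-proficient student should answer correctly; the item IDs and bookmark placements are invented, and a full implementation would typically work from IRT difficulty estimates rather than raw ordering:

```python
from statistics import median

# Items reordered from easiest to hardest (e.g., by percent correct in a pilot)
ordered_items = ["item_12", "item_03", "item_07", "item_01", "item_09",
                 "item_05", "item_14", "item_02", "item_08", "item_11"]

# Hypothetical bookmark placements: each judge marks the position of the
# first item a just-proficient student would NOT be expected to answer.
bookmarks = [6, 7, 6, 5, 7]

# Take the median placement as the group's judgment; the items before the
# bookmark define the raw cut score (number correct needed for proficiency).
cut = int(median(bookmarks))
print(f"Raw cut score: {cut} of {len(ordered_items)} items")
```

Using the median rather than the mean keeps one outlying judge from dragging the cut score, which matters because the process is explicitly judgmental.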

Collecting good evidence of instructional effectiveness: the Split and Switch Design¹

A variation on the traditional pretest-posttest design:

1. Create two forms of a test, roughly equal in difficulty.
2. Split the class into two halves.
3. Half the class takes Form A as the pretest; the other half takes Form B as the pretest.
4. Teach.
5. Switch forms for the posttest.
6. Blind-score all Form A papers (pretests and posttests scrambled together), then all Form B papers (scrambled).
7. Calculate the gains on Form A and on Form B.

Note: Typically you would subtract the pretest score from the posttest score for each student and then average the gain scores. Not in this case! Instead, you subtract the Form A pretest mean from the Form A posttest mean, even though those means are based on different students. You can still make inferences about instruction because all of the students received the same instruction (treatment). Try it!

¹ Popham, J. (2001). The truth about testing. Alexandria, VA: Association for Supervision and Curriculum Development.
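The gain calculation in steps 6 and 7 can be sketched as follows. As the note emphasizes, each form’s pretest and posttest means come from different halves of the class, so the gain is a difference of group means, not an average of per-student gains. All scores below are made up for illustration:

```python
from statistics import mean

# Hypothetical blind-scored results for one class of ten students.
# Half 1 took Form A as the pretest; Half 2 took Form A as the posttest.
form_a_pre  = [12, 15, 10, 14, 11]   # Half 1, before instruction
form_a_post = [18, 20, 16, 19, 17]   # Half 2, after instruction

# Half 2 took Form B as the pretest; Half 1 took Form B as the posttest.
form_b_pre  = [13, 11, 14, 12, 15]   # Half 2, before instruction
form_b_post = [19, 17, 20, 18, 21]   # Half 1, after instruction

# Difference of group means: valid here because, although the pre and post
# means come from different students, all students got the same instruction.
gain_a = mean(form_a_post) - mean(form_a_pre)
gain_b = mean(form_b_post) - mean(form_b_pre)
print(f"Form A gain: {gain_a:.1f}, Form B gain: {gain_b:.1f}")
```

If both forms show a gain of similar size, that converging evidence strengthens the inference that instruction, rather than a quirk of one form, produced the improvement.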