Michigan Assessment Consortium Common Assessment Development Series Module 9B – Editing the Draft Test Items.


Michigan Assessment Consortium Common Assessment Development Series Module 9B – Editing the Draft Test Items

Developed and narrated by Edward Roeber, Professor, MQM, Michigan State University

Support

The Michigan Assessment Consortium professional development series in common assessment development is funded in part by the Michigan Association of Intermediate School Administrators, in cooperation with …

In this module, you will learn about

- Why item editing is needed
- Common errors item writers make
- How to edit the draft items submitted
- How to prepare the assessments for field testing/pilot testing

Why Is Item Editing Needed?

- Most item writers are not highly proficient at writing items.
- They make a variety of errors that must be corrected before an item can be used.
- Even good item writers can make mistakes or create poor items that they cannot see themselves, because they are so "wrapped up" in item development.

Common Errors Made by Writers

There are a number of errors you may see:

- The correct answer is the longest response
- The stem is unnecessarily wordy
- The stem is poorly worded or confusing
- Some incorrect answers (distractors) are not plausible
- Written-response items may not yield scorable responses
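Some of these flaws can be screened for mechanically before human review. Below is a minimal sketch, assuming a hypothetical item format (a dictionary with an `options` list and a `key` index), that flags multiple-choice items whose keyed answer is the longest option; the item records are invented for illustration.

```python
# Hypothetical sketch: scan draft multiple-choice items for one flaw listed
# above -- a keyed (correct) answer that is the longest option.
# The item dictionary format is an assumption for illustration only.

def flag_longest_key(items):
    """Return the ids of items whose correct option is strictly the longest."""
    flagged = []
    for item in items:
        options = item["options"]          # answer-choice strings
        key_text = options[item["key"]]    # the keyed correct answer
        distractors = [o for i, o in enumerate(options) if i != item["key"]]
        if all(len(key_text) > len(d) for d in distractors):
            flagged.append(item["id"])
    return flagged

draft_items = [
    {"id": "M-01", "key": 2,
     "options": ["12", "15", "18, because both factors are even", "21"]},
    {"id": "S-07", "key": 2,
     "options": ["photosynthesis", "respiration", "osmosis", "diffusion"]},
]

print(flag_longest_key(draft_items))  # ['M-01']
```

A screen like this only narrows the pile; an assessment specialist still has to judge wording, plausibility, and alignment by hand.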

Common Errors Made by Writers

Other errors you may see:

- Unnecessary or confusing stimulus materials
- Lack of alignment between the item and the intended standard or expectation
- Unclear scoring rubrics for written-response items
- One item may cue the correct response to another

How to Edit the Items

Editing typically involves three steps:

1. Assessment edits
2. Content review
3. Bias and sensitivity review

These might be done by one person or by different individuals. An advisory or review group might also help out.
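The three steps above form a sequential pipeline: an item must survive each review before moving to the next. A minimal sketch, under the assumption that each step can be represented as a pass/fail check and that failing items are simply dropped (a real workflow would usually revise before dropping):

```python
# Hypothetical sketch: track draft items through the three review steps
# named above. The step names come from the module; the status checks and
# the "drop on failure" rule are illustrative assumptions.

REVIEW_STEPS = ("assessment_edit", "content_review", "bias_sensitivity")

def run_reviews(item_ids, reviewers):
    """Apply each review step in order; an item failing any step is dropped."""
    surviving = list(item_ids)
    for step in REVIEW_STEPS:
        check = reviewers[step]                 # callable: item_id -> bool
        surviving = [i for i in surviving if check(i)]
    return surviving

reviewers = {
    "assessment_edit": lambda i: i != "item-3",   # item-3 judged unfixable
    "content_review": lambda i: True,             # all keys verified correct
    "bias_sensitivity": lambda i: i != "item-5",  # item-5 flagged as biased
}
print(run_reviews([f"item-{n}" for n in range(1, 6)], reviewers))
# ['item-1', 'item-2', 'item-4']
```

Ordering matters: there is no point having content experts or bias reviewers spend time on items the assessment edit has already discarded.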

Assessment Edits

- This step should be carried out by someone with experience in developing and editing items; it may be one person or more than one.
- The goal is to correct the issues raised in the earlier slides.
- Shorten, simplify, and straighten out the items, one at a time.
- Discard items that appear to be unfixable (e.g., when a complete rewrite would be needed to salvage the item).

Content Review

- Content review ensures that the final item pool contains no items whose keyed answers are incorrect or inaccurate, according to content experts.
- The review may be done by a panel of content-area experts or by a single specialist.
- It should occur after the assessment-specialist edits are complete but before the items are field tested.

Bias and Sensitivity Reviews

- This review is essential to ensure that the items contain no systematic bias and no material that might be deemed offensive to a subgroup of test takers.
- The challenge is to find interesting topics for the questions without using content that might be offensive.

Bias and Sensitivity Reviews

- Items that appear to be biased need to be revised or dropped.
- This is especially true of items that delve into sensitive areas, such as politics, religion, and so forth.
- Use reviewers who know how to spot bias and sensitive topics, not just members of minority groups.

How to Prepare the Items for Field Testing/Pilot Testing

Before the items are used operationally, they need to be tried out:

- Pilot tested with small samples of students
- Field tested with representative groups of students

Tryouts are essential for any set of items that will be used in high-stakes decisions about students, such as promotion or graduation (for ethical, policy, and legal reasons).

How to Prepare the Items for Field Testing/Pilot Testing

- Once the assessment edits, content review, and bias/sensitivity reviews are complete, the item pool should be updated so that all changes are reflected in the items.
- Ideally, the items will have been entered into an item bank by this time (or earlier).
- The pilot or field-test design will dictate how many forms are needed, how many items each contains, and how many different skills they measure.
- The item bank should be used to assemble the needed forms.
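The last two points above can be sketched in code. This is a minimal illustration under invented assumptions: a bank of item records tagged by skill, and a simple round-robin spread of each skill's items across forms. A real field-test design would also balance difficulty, item type, and position effects.

```python
# Hypothetical sketch: assemble pilot-test forms from an edited item bank.
# The bank format and the round-robin assignment are illustrative
# assumptions, not the module's prescribed method.

from collections import defaultdict

def assemble_forms(bank, n_forms, items_per_skill):
    """Spread each skill's items across forms in round-robin order."""
    by_skill = defaultdict(list)
    for item in bank:
        by_skill[item["skill"]].append(item)
    forms = [[] for _ in range(n_forms)]
    for skill, items in sorted(by_skill.items()):
        # cap the draw so every form gets items_per_skill items per skill
        for i, item in enumerate(items[: n_forms * items_per_skill]):
            forms[i % n_forms].append(item["id"])
    return forms

bank = [{"id": f"{skill}-{n}", "skill": skill}
        for skill in ("fractions", "decimals") for n in range(1, 5)]
for i, form in enumerate(assemble_forms(bank, 2, 2), start=1):
    print(f"Form {i}: {form}")
```

Spreading items this way lets every item be tried out while keeping each pilot form short enough for a single class period.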