Assessment Literacy Series -Module 3- Item Specifications

Similar presentations
Writing constructed response items

The Assessment Toolbox Linda Suskie Middle States Commission on Higher Education AB Tech February 2005.
ALS Design, Build, Review: Using PDE’s Online Tools to Implement the SLO Process SAS Portal:
Assessment Literacy Series
Gary D. Borich Effective Teaching Methods 6th Edition
Assessment Literacy Series
Benchmark Assessment Item Bank Test Chairpersons Orientation Meeting October 8, 2007 Miami-Dade County Public Schools Best Practices When Constructing.
Constructing Exam Questions Dan Thompson & Brandy Close OSU-CHS Educational Development-Clinical Education.
Assessment Literacy Series 1 -Module 4- Scoring Keys and Rubrics.
Assessment “We assess students not merely to evaluate them, but to improve the entire process of teaching and learning.” - Douglas B. Reeves, Making Standards.
Some Practical Steps to Test Construction
Jan/Feb 2008 CAPA Train-the-Trainer Workshop 1 CAPA Training CAPA Examiners.
Classroom Assessment FOUN 3100 Fall Assessment is an integral part of teaching.
Focusing on the FCAT/FCAT 2.0 Test-Taking Strategies Grades 3-5 Nancy E. Brito, Department of Assessment, PX47521.
Focusing on the FCAT/FCAT 2.0 Test-Taking Strategies Grades 9-11 Nancy E. Brito, Department of Assessment, PX47521.
Assessment Literacy Series
Assessment Cadre #3: “Assess How? Designing Assessments to Do What You Want”
The New England Common Assessment Program (NECAP) Alignment Study December 5, 2006.
Tuning Up your Common Assessments Michigan School Testing Conference February 21, 2012 Dr. Ed Roeber Kim Young Dr. Ellen Vorenkamp.
SCORING. INTRODUCTION & PURPOSE Define what SCORING means for the purpose of these modules Explain how and why you should use well-designed tools, such.
Building Effective Assessments. Agenda: brief overview of Assess2Know content development; assessment building pre-planning; cognitive factors; building.
Groton Elementary Agenda: Discuss assessments, modifications, and accommodations Review common accommodations for assessments Study of Test.
Focusing on the FCAT Test-Taking Strategies Grades 3-5 Nancy E. Brito, Department of Assessment, PX47521 Information.
Assessment Literacy Series 1 -Module 6- Quality Assurance & Form Reviews.
Choose Your Own Adventure. Introduction Use this as a guide when working on a Smarter Balanced question.
Prepare and Use Knowledge Assessments. Introduction Why do we give knowledge tests? What problems did you have with tests as a student? As.
Session 2 Traditional Assessments.
Teaching Today: An Introduction to Education 8th edition
English Language Arts Item Review Considerations Training Module.
Patterns of Square Numbers Module 1. A Question For You… You are helping your niece with her homework and she says, “I notice that every time I square.
Assessment Literacy Series 1 -Module 1- Design & Purpose Statement.
ASSESSING STUDENT ACHIEVEMENT Using Multiple Measures Prepared by Dean Gilbert, Science Consultant Los Angeles County Office of Education.
ALS Design, Build, Review: Using PDE’s Online Tools to Implement the SLO Process SAS Portal:
Target-Method Match Selecting The Right Assessment.
8 Strategies for the Multiple Choice Portion of the AP Literature and Composition Exam.
4-Day Agenda and Expectations Day 2 Building Formative Assessments linked to deconstructed content and skills.
Lecture by: Chris Ross Chapter 7: Teacher-Designed Strategies.
Assessment and Testing
Copyright © 2009 Intel Corporation. All rights reserved. Intel, the Intel logo, Intel Education Initiative, and the Intel Teach Program are trademarks.
Building Exams Dennis Duncan University of Georgia.
Classroom Assessment A Practical Guide for Educators by Craig A. Mertler Chapter 4 Overview of Assessment Techniques.
Focusing on the FCAT Test-Taking Strategies Grades 6-8 Nancy E. Brito, Department of Assessment, PX47521
Focusing on the FCAT Test-Taking Strategies Grades 9-11 Nancy E. Brito, Department of Assessment, PX47521
Module 7 1. What do we know about selected-response items? Well-constructed selected-response items can target: factual knowledge comprehension analysis.
Test Question Writing Instructor Development ANSF Nurse Training Program.
Language Testing How to make multiple choice test.
The Constructed Response Assessment The Journey Continues.
Using Multiple Measures ASSESSING STUDENT ACHIEVEMENT.
EVALUATION SUFFICIENCY Types of Test Items (Part I)
Reviewing, Revising and Writing Effective Social Studies Multiple-Choice Items 1 Writing and scoring test questions based on Ohio’s Academic Content Standards.
SBAC-Mathematics November 26, Outcomes Further understand DOK in the area of Mathematics Understand how the new SBAC assessments will measure student.
SELECTED.
The Common Core State Standards (CCSS) will be assessed in the spring of 2014 for Grades 3 through 5:
Assessment and the Institutional Environment Context Institutional Mission vision and values Intended learning and Educational Experiences Impact Educational.
Assessment and Score Reporting for SPRING 2014 will be aligned exclusively to the Common Core State Standards for Mathematics (CCSSM) (corestandards.org).
Good for: knowledge level content; evaluating student understanding of popular misconceptions; concepts with two logical responses.
Reviewing, Revising and Writing Mathematics Multiple-Choice Items 1 Writing and scoring test questions based on Ohio’s Academic Content Standards Reviewing,
Copyright © Springer Publishing Company, LLC. All Rights Reserved. DEVELOPING AND USING TESTS – Chapter 11 –
ALS Design, Build, Review: Using PDE’s Online Tools to Implement the SLO Process SAS Portal:
Assessment Literacy Series
PSSA Parent University
EDU 385 Session 8 Writing Selection items
Assessment Literacy Series
Constructing Exam Questions
In an ongoing…and legislatively mandated…effort to reduce testing time, MDE.
Summative Assessment Grade 6 April 2018 Develop Revise Pilot Analyze
Designing Your Extended Written Response Assessment
Presentation transcript:

Assessment Literacy Series -Module 3- Item Specifications

Objectives
Participants will:
1. Examine multiple choice and constructed response items/tasks.
2. Develop items/tasks that fit the previously created specifications and blueprint in terms of:
   - content accuracy;
   - item type;
   - cognitive load; and
   - sufficiency.

Helpful Tools
Participants may wish to reference the following:
Guides
- Handout #4 – DoK Chart
- Handout #5 – Item Examples
Other “Stuff”
- DoK Chart II
- “Smart Book”
- Textbooks, teacher manuals, and other supplemental materials

Outline of Module 3
Module 3: Item Specifications
- Multiple Choice Items
- Constructed Response Items/Tasks
  - Short Constructed Response
  - Extended Constructed Response
- Process Steps

Multiple Choice (MC) Guidelines
- MC items consist of a stem and answer options (Grades K-1 = 3 options; Grades 2-12 = 4 options).
- MC items contain only one correct answer.
- The other options (distractors) have the same structure and length as the answer.
- Distractors should be plausible (realistic).
- Balance the placement of the correct answer.
- Avoid answer options that provide clues to the answer.

MC Guidelines (cont.)
- Answer options should be in ascending or descending order when possible.
- Avoid “All of the above” and “None of the above.”
- Directions state what to do, where and how to respond, and point values.
- Refrain from adding directions when test-takers are repeating the same behavior from item to item.
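These MC guidelines can also double as an automated pre-review before items go to a human reviewer. Below is a minimal sketch in Python, assuming a simple item representation (a stem string, a list of option strings, a zero-based key index, and a grade label); the field names and the length-balance heuristic are illustrative assumptions, not part of this module or any PDE tool.

```python
# Illustrative pre-review check for a multiple-choice item, based on the
# MC guidelines above. Field names, the grade-band rule, and the length
# heuristic are assumptions made for this sketch.

def check_mc_item(stem, options, key_index, grade):
    """Return a list of guideline violations for one MC item."""
    problems = []

    # The stem should carry the question; flag empty stems.
    if not stem.strip():
        problems.append("empty stem")

    # Grades K-1 use 3 answer options; Grades 2-12 use 4.
    expected = 3 if grade in ("K", "1") else 4
    if len(options) != expected:
        problems.append(f"expected {expected} options for grade {grade}, found {len(options)}")

    # Exactly one correct answer, identified by its (zero-based) index.
    if not 0 <= key_index < len(options):
        problems.append("key index does not point at an answer option")
        return problems

    # Avoid "All of the above" and "None of the above".
    for opt in options:
        if opt.strip().lower() in ("all of the above", "none of the above"):
            problems.append(f'avoid the option "{opt}"')

    # Distractors should be similar in length to the correct answer
    # (rough proxy: no distractor more than twice the key's length).
    key_len = max(len(options[key_index]), 1)
    for i, opt in enumerate(options):
        if i != key_index and len(opt) > 2 * key_len:
            problems.append(f"option {i + 1} is much longer than the correct answer")

    return problems


# The Pluto item from the Think-Pair-Share slide later in this module is
# flagged for its "all of the above" option.
print(check_mc_item(
    "Some scientists believe that Pluto is __________.",
    ["an escaped moon of Neptune",
     "usually the most distant planet in the solar system",
     "the name of Mickey Mouse's dog",
     "all of the above"],
    key_index=0,
    grade="5",
))
```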

Constructed Response (CR) Guidelines
- Language is appropriate to the age and experience of the students.
- Student expectations are clear: Explain vs. Discuss; Describe vs. Comment.
- State the extent of the expected answer: “Give three reasons” vs. “Give some reasons.”
- Directions state what to do, where and how to respond, and point values.
- Refrain from adding directions when test-takers repeat the same behavior from item/task to item/task.

CR Guidelines (cont.)
Short Constructed Response (SCR) items/tasks:
- One step to solve
- Require a brief response (2-5 minutes)
- Worth up to 3 points
Extended Constructed Response (ECR) items/tasks:
- Multiple steps to solve
- Require 5-10 minutes to answer
- Worth 4 or more points
Note: ALL CR items/tasks require a well-developed scoring rubric.
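The SCR/ECR boundaries above are easy to encode when items are tracked in a spreadsheet or item bank. A small sketch, assuming a made-up record layout (the class and field names are not from the module):

```python
# Illustrative record for a constructed-response item/task, using the
# SCR/ECR boundaries listed above. Names are assumptions, not a PDE schema.
from dataclasses import dataclass

@dataclass
class CRItem:
    prompt: str
    steps_to_solve: int
    minutes_to_answer: int
    points: int

    def category(self):
        """Classify the item as SCR or ECR using the guideline thresholds."""
        if self.steps_to_solve == 1 and self.minutes_to_answer <= 5 and self.points <= 3:
            return "SCR"
        if self.steps_to_solve > 1 or self.points >= 4:
            return "ECR"
        return "review needed"  # falls between the two definitions

item = CRItem("Explain why the author repeats the phrase ...", 1, 4, 2)
print(item.category())  # SCR
```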

Helpful Hints
Scenarios and passages should be:
- relatively short;
- developmentally appropriate; and
- sensitive to readability.
Performance expectations must state exactly what is to be observed and how it will be measured.
Items/tasks are considered secure even in draft form.
Copyrights must be handled appropriately.
Images, graphs, and charts must be clear and of sufficient size for interpretation.
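The “sensitive to readability” hint can be spot-checked with a standard readability index before a passage is attached to an item. A rough sketch using the Flesch-Kincaid grade-level formula; the syllable counter is a crude vowel-group approximation, so treat the result as a screen, not a verdict:

```python
# Rough readability screen for a scenario or passage, using the
# Flesch-Kincaid grade-level formula. The syllable count is a crude
# vowel-group estimate, so results are approximate.
import re

def syllables(word):
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def fk_grade(text):
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    n_words = max(1, len(words))
    syllable_count = sum(syllables(w) for w in words)
    return 0.39 * (n_words / sentences) + 11.8 * (syllable_count / n_words) - 15.59

passage = "The people of Iceland work to keep their culture alive."
print(round(fk_grade(passage), 1))  # compare against the intended grade band
```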

Item Tag Codification
Each item/task requires a unique identifier.
- Tags contain information used to code and identify items/tasks.
- Item tags typically contain: item number, subject, grade, post test, item type, DoK level, and content standard number.
Example tag: 0008.MTH.GR4.POST.MC-LV2-4OA1

Item # | Subject | Grade | Post | Item Type | DoK | Standard ID
0008   | MTH     | GR4   | POST | MC        | LV2 | 4OA1
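Because the tag fields sit in fixed positions, the example identifier can be split back into its labeled parts mechanically. A minimal sketch, with delimiter rules inferred from the single example above (a real item bank may format tags differently):

```python
# Parse the example item tag into its labeled fields. The split rules are
# inferred from the one example shown and may not match every tag a real
# item bank produces.
def parse_item_tag(tag):
    item_no, subject, grade, post, rest = tag.split(".")
    item_type, dok, standard = rest.split("-")
    return {
        "Item #": item_no,
        "Subject": subject,
        "Grade": grade,
        "Post": post,
        "Item Type": item_type,
        "DoK": dok,
        "Standard ID": standard,
    }

print(parse_item_tag("0008.MTH.GR4.POST.MC-LV2-4OA1"))
# {'Item #': '0008', 'Subject': 'MTH', 'Grade': 'GR4', 'Post': 'POST',
#  'Item Type': 'MC', 'DoK': 'LV2', 'Standard ID': '4OA1'}
```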

Process Steps
1. Review specification tables, the blueprint, and the targeted standards.
2. Draft the item, passage, prompt, or scenario. Insert any necessary tables, graphs, or images.
3. For MC items, create the single correct answer and then identify distractors that reflect common errors/misinterpretations. For CR items/tasks, create a scoring rubric that articulates different levels of performance, including a sample response for each level.
4. Specify the item/task’s unique ID (e.g., item number.subject.grade.post.item type-DoK level-content standard number).
5. Repeat Steps 1-4 for the remainder of the items/tasks needed to complete the blueprint.
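Step 4 can be scripted once the tag components are known. A small sketch that assembles an ID in the same shape as the example tag from the previous slide; the zero-padding width and the PRE/POST toggle are assumptions based on that one example:

```python
# Assemble a unique item/task ID (Process Step 4) in the same shape as the
# example tag 0008.MTH.GR4.POST.MC-LV2-4OA1. The zero-padding width and the
# PRE/POST toggle are assumptions based on that single example.
def build_item_tag(item_no, subject, grade, item_type, dok_level, standard_id, post=True):
    window = "POST" if post else "PRE"   # "PRE" is an assumed counterpart
    return f"{item_no:04d}.{subject}.GR{grade}.{window}.{item_type}-LV{dok_level}-{standard_id}"

print(build_item_tag(8, "MTH", 4, "MC", 2, "4OA1"))
# 0008.MTH.GR4.POST.MC-LV2-4OA1
```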

QA Checklist
- Types, point values, and DoKs match the blueprint.
- There are sufficient items/tasks to sample the targeted content.
- The items/tasks are developmentally appropriate for the intended test-takers.
- The correct answers and/or expected responses are clearly identified.
- Each item/task is assigned a unique ID.
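Several checklist rows (matching types, point values, DoKs, and sufficiency) can be verified against the blueprint automatically. A sketch, assuming invented structures for the blueprint and the drafted items (neither reflects PDE’s actual formats):

```python
# Compare a drafted item set against blueprint requirements (illustrative
# only; the blueprint and item structures are assumed for this sketch).
from collections import Counter

def qa_against_blueprint(items, blueprint):
    """items: dicts with 'standard', 'item_type', 'dok', and 'points'.
    blueprint: {(standard, item_type, dok, points): required_count}.
    Returns human-readable gaps where coverage falls short."""
    have = Counter((i["standard"], i["item_type"], i["dok"], i["points"]) for i in items)
    gaps = []
    for spec, required in blueprint.items():
        if have[spec] < required:
            gaps.append(f"{spec}: need {required}, have {have[spec]}")
    return gaps

blueprint = {("4OA1", "MC", 2, 1): 3, ("4OA1", "ECR", 3, 4): 1}
items = [{"standard": "4OA1", "item_type": "MC", "dok": 2, "points": 1}] * 2
print(qa_against_blueprint(items, blueprint))
# ["('4OA1', 'MC', 2, 1): need 3, have 2", "('4OA1', 'ECR', 3, 4): need 1, have 0"]
```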

Think-Pair-Share
Based on the discussed item specifications, decide what is wrong with the following questions.

1. Some scientists believe that Pluto is __________.
   A. an escaped moon of Neptune
   B. usually the most distant planet in the solar system
   C. the name of Mickey Mouse’s dog
   D. all of the above

2. The people of Iceland __________.
   A. a country located just outside the Arctic Circle
   B. work to keep their culture alive
   C. claim to be descendants of the Aztecs
   D. the capital, Reykjavik, where arms talks have been held

Summary & Next Steps
Summary - Module 3: Item Specifications
- Developed performance measure items/tasks that match the applicable blueprints.
Next Steps - Module 4: Scoring Keys and Rubrics
- Given the items/tasks created, develop scoring keys and rubrics.