Administering, Analyzing, and Improving the Written Test

Assembling the Test
You have already:
Written objectives
Written test items
Now you have to:
Package the test
Reproduce the test

Packaging the Test
Group together all items of similar format
Arrange test items from easy to hard
Space the items for easy reading
Keep items and options on the same page
Position illustrations near descriptions

Packaging, cont.
Check your answer key
Determine how students will record answers
Provide space for name (and date)
Check the test directions
Proofread the test

Reproducing the Test
Know the machinery
Make extra copies (2-3)
Specify copying instructions (if giving the test to someone else to copy)
Avoid:
  Fine print
  Finely detailed maps or drawings
  Barely legible masters or originals
File the original test

Test Assembly Checklist (YES / NO)
 1. Are items of similar format grouped together? _____ _____
 2. Are items arranged from easy to hard levels of difficulty? _____ _____
 3. Are items properly spaced? _____ _____
 4. Are items and options on the same page? _____ _____
 5. Are diagrams, maps, and supporting material placed above the designated items and on the same page with those items? _____ _____
 6. Are answer positions random? _____ _____
 7. Have you decided whether an answer sheet will be used? _____ _____
 8. Are blanks for name (and date) included? _____ _____
 9. Have the directions been checked for clarity? _____ _____
10. Has the test been proofread for errors? _____ _____
11. Do items avoid racial and gender bias? _____ _____

Administering the Test
Maintain a positive attitude
Maximize achievement motivation
Equalize advantages
Avoid surprises
Clarify the rules
Rotate distribution

Administering, cont.
Remind students to check their copies
Monitor students
Minimize distractions
Give time warnings
Collect the tests uniformly

Scoring the Test
Prepare an answer key (ahead of time)
Check your answer key
Score blindly
Check machine-scored answer sheets
Check your scoring
Record the scores
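The bookkeeping in these steps is easy to automate. Below is a minimal sketch, assuming a hypothetical four-item multiple-choice test; the answer key, paper codes, and responses are invented for illustration and are not from the presentation.

```python
# Minimal scoring sketch for a hypothetical four-item multiple-choice test.

ANSWER_KEY = {1: "B", 2: "D", 3: "A", 4: "C"}  # prepared (and double-checked) ahead of time

def score_paper(responses: dict[int, str]) -> int:
    """Return the number of items answered correctly on one paper."""
    return sum(1 for item, answer in responses.items() if ANSWER_KEY.get(item) == answer)

# "Score blindly": papers are identified by an arbitrary code, not by name,
# so expectations about a particular student cannot influence the score.
papers = {
    "paper_017": {1: "B", 2: "D", 3: "C", 4: "C"},
    "paper_042": {1: "A", 2: "D", 3: "C", 4: "C"},
}
scores = {code: score_paper(p) for code, p in papers.items()}
print(scores)  # {'paper_017': 3, 'paper_042': 2}
```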

Analyzing the Test
Quantitative item analysis
  Key and distracters
  Difficulty index (p)
  Discrimination index (D): positive, negative, or zero
Qualitative item analysis

(p) and (D)
p = (number of students selecting the correct answer) / (total number of students taking the test)
D = (number correct in the upper group - number correct in the lower group) / (number of students in half the class)
If the class size is odd, divide by the size of the larger group.
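Both indices reduce to simple arithmetic. Here is a minimal sketch, assuming the class has already been split into an upper-scoring and a lower-scoring group of equal size; the function names are illustrative, not from the presentation.

```python
# Difficulty index (p) and discrimination index (D) as defined on the slide above.
# Assumes the class has been split into equal-sized upper and lower scoring groups.

def difficulty_index(correct: int, total: int) -> float:
    """p: proportion of all students who answered the item correctly."""
    return correct / total

def discrimination_index(upper_correct: int, lower_correct: int, group_size: int) -> float:
    """D: (correct in upper group - correct in lower group) / size of one group."""
    return (upper_correct - lower_correct) / group_size
```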

Example for item Y (class size = 28; * marks the keyed answer)

Options    A*   B   C   D
Upper       4   1   5   4
Lower       1   7   3   3

p = ?   D = ?

Example for item Z (class size = 30; * marks the keyed answer)

Options    A   B*   C   D
Upper      3    4   3   5
Lower      0   10   2   3

p = ?   D = ?
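One way to fill in the "p = ?  D = ?" rows, using the difficulty_index and discrimination_index helpers sketched after the formula slide:

```python
# Item Y: class of 28 splits into upper and lower groups of 14; A is the key.
p_y = difficulty_index(correct=4 + 1, total=28)                               # ~0.18
d_y = discrimination_index(upper_correct=4, lower_correct=1, group_size=14)   # ~0.21

# Item Z: class of 30 splits into upper and lower groups of 15; B is the key.
p_z = difficulty_index(correct=4 + 10, total=30)                              # ~0.47
d_z = discrimination_index(upper_correct=4, lower_correct=10, group_size=15)  # -0.40
```

Item Y is hard (p ≈ 0.18) but discriminates in the right direction (D ≈ 0.21). Item Z is moderately difficult (p ≈ 0.47) yet has a negative D: more low scorers than high scorers chose the key, which usually means the item or its key deserves the qualitative second look mentioned above.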

Pre- and Post-Test Results
If this is done, look at:
Percentage answering each item correctly on each test
Percentage of items answered in the expected direction for the entire test
Limitations:
Time
Contamination
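Both summaries reduce to counting, as in this minimal sketch; the per-item results below are invented for illustration (True means a student answered that item correctly).

```python
# Pre/post summaries for a hypothetical 3-item test taken by 4 students.

def percent_correct(results: list[bool]) -> float:
    """Percentage of students answering one item correctly."""
    return 100.0 * sum(results) / len(results)

pre_items  = [[False, False, True, False], [True, False, True, False], [True, True, True, False]]
post_items = [[True, True, True, False],  [True, True, True, True],   [True, False, True, False]]

pre_pct  = [percent_correct(item) for item in pre_items]   # [25.0, 50.0, 75.0]
post_pct = [percent_correct(item) for item in post_items]  # [75.0, 100.0, 50.0]

# Percentage of items answered in the expected direction (percent correct rose).
improved = sum(1 for before, after in zip(pre_pct, post_pct) if after > before)
print(100.0 * improved / len(pre_pct))  # ~66.7, since 2 of the 3 items improved
```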

Debriefing Guidelines
Before handing back the tests, discuss problem items
Listen to student reactions
Avoid on-the-spot decisions
Be equitable in any changes
Ask students to double-check
Ask students to identify problems

Identifying Problems
The test can be improved; you are human.
Focus on the test; it is not about you.
Sincerely examine whether each question is good or not. The discrimination index really helps here; do not just throw out a question.
Scores may change, but rank typically does not.
The objective of the debriefing is to make a better test!