Instructional Uses of Test Results

Similar presentations
1 Effective Feedback to the Instructor from Online Homework Michigan State University Mark Urban-Lurain Gerd Kortemeyer.
Advanced Learning Program Greenwich Public Schools Setting the Standard for Excellence in Public Education Grade Two Fall/Winter Score Interpretation Procedures.
Chapter 6 Process and Procedures of Testing
Instructional Uses of Test Results Interpreting Item-Level Reports.
Extended Assessments Elementary & Middle/High Reading Oregon Department of Education and Behavioral Research and Teaching January 2007.
Training for Test Examiners CMT Training for Test Examiners New for 2012 Test Security  New statistical analyses will be used with the 2012.
ANALYZING AND USING TEST ITEM DATA
Spring 2011 End-of-Course Mathematics Exams Proctor Training Jack B. Monpas-Huber, Ph.D. Director of Assessment & Student Information Shereen Henry, M.Ed.
Considering Internal Control
What is the TPA? Teacher candidates must show through a work sample that they have the knowledge, skills, and abilities required of a beginning teacher.
OPT 101 Presenters: Vicki Angel, Karen Greer Math Institute Trainers.
1 Standard Test Administration Testing Ethics Training PowerPoint Spring 2007 Utah State Office of Education.
Alternate Assessment Attainment Task Overview Part 2.3 Click here to download the Administration Guide Required for completion of this training 1Overview/Attainment.
Online Test Security Training. Agenda Welcome Communication and Support Policy and Key Terms Scheduling Monitoring Preventing Plagiarism Testing Students.
Test of Early Reading Ability-3 (TERA-3) By: Jenna Ferrara.
Copyright © 2014 Pearson Education, Inc. Publishing as Prentice Hall. Chapter
Do Now  You have given your beginning of the year diagnostic assessment. Your 30 students produce these results:  20 score below 50%  7 score between.
©2005 Prentice Hall Business Publishing, Auditing and Assurance Services 10/e, Arens/Elder/Beasley Internal Control and Control Risk Chapter 10.
LEAP 2025 Practice Test Webinar for Teachers
Understanding Your PSAT/NMSQT Results
Oral Administration Training
2012 Grade 3 Reading Student Portfolio
Measures of Academic Progress
It’s All About the Data, Folks!
Quarterly Meeting Focus
Instructional Uses of Test Results
Smarter Balanced Assessment Results
Update on Data Collection and Reporting
Iowa Teaching Standards & Criteria
PN Comprehensive Predictor Testing at the Joliet Junior College Academic Skills Center PowerPoint created by AIAS Director on February 3, 2015.
AIR Ways Training Module
Getting Ready for the PreACT
ITBS END OF YEAR ALTERNATE STANDARDIZED ASSESSMENT
Math-Curriculum Based Measurement (M-CBM)
SDBQ reports Student Reports - Performance at a Glance
Online Testing System Assessment Viewing Application (AVA)
Interpreting Science and Social Studies Assessment Results
Office of Education Improvement and Innovation
LearnSmart Achieve™ Adaptive Test Prep
Understanding Your PSAT/NMSQT® Results
2017 PAWS Test Administrator Training
Cognitive Abilities Test (CogAT)
Gail E. Tompkins California State University, Fresno
Spencer County Public Schools Responsible Use Policy for Technology and Related Devices Spencer County Public Schools has access to and use of the Internet.
Video 6: Practice Papers for
Understanding ITBS Scores
Grade 3 Midyear Promotion (GTMYP)
Analysing your pat data
What is NAPLAN? The National Assessment Program – Literacy and Numeracy (NAPLAN) is an assessment program for Australian students in Years 3, 5, 7 and.
Intensive Individualized Interventions and Supports
2011 Grade 3 Reading Student Portfolio
Understanding and Using Standardized Tests
TEAS Testing at the Joliet Junior College Academic Skills Center
P ! A L S Interpreting Student Data to
Developing Mathematical Thinking Institute (DMTI)
2013 Grade 3 Reading Student Portfolio
IEEE MEDIA INDEPENDENT HANDOVER DCN: sec
Presentation transcript:

Instructional Uses of Test Results: Interpreting Item-Level Reports

Iowa Assessments Provide Teachers with:
Important information to use in your instructional decision-making
Information that is intended to supplement your observations and classroom assessment data
Information to help you plan to accommodate classroom and individual student needs

Today’s Session:
Procedures for the review of test booklets
Class Item Response Record
Class Item Analysis

Review of Test Booklets
Provided for:
Review of item-level reports
Support of data analysis and interpretation
Supported by:
Assurances language for proper and ethical test administration and use (Directions for Administration)
Iowa Testing Programs Online Tools

Test items are Copyright © 2012 by The University of Iowa. All rights reserved. These tests contain operational questions that are to be used solely for report review and score interpretation purposes associated with the users of the Class Item Response Record or the Group Item Analysis. No test items may be disclosed or used for any other reason. By accepting delivery of or using these tests, the recipient acknowledges responsibility for maintaining test security that is required by professional standards and applicable state and local policies and regulations governing proper use of tests and for complying with federal copyright law, which prohibits unauthorized reproduction and use of copyrighted test materials. THESE TEST BOOKLETS MAY NOT BE RETAINED BY YOU AND MUST BE RETURNED TO IOWA TESTING PROGRAMS WITHIN THREE WEEKS OF RECEIPT. I have read, understood, and accept this message.

Class Item Response Record
Assists teachers in the diagnosis of learning difficulties within a test area
Used to identify students who are not completing a test
Used by teachers to identify areas where a large part of the class is having difficulty with the same concept, such as a common misconception or a scope and sequence issue

Class Item Response Record
In the body of the chart:
Numbers are Percent Correct
Letters are the incorrect response marked by that student
Blanks are correct responses
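To make that mapping concrete, here is a minimal Python sketch of how such a chart body could be derived from scored responses. The answer key, student names, and responses are invented for illustration only and are not taken from an actual Iowa Assessments report:

```python
# Minimal sketch of how the body of a Class Item Response Record could be
# assembled. The answer key, students, and responses below are hypothetical.

answer_key = {"Item 1": "A", "Item 2": "C", "Item 3": "B"}

responses = {
    "Student 1": {"Item 1": "A", "Item 2": "B", "Item 3": "B"},
    "Student 2": {"Item 1": "D", "Item 2": "C", "Item 3": "B"},
}

for student, marked in responses.items():
    cells = []
    correct = 0
    for item, key in answer_key.items():
        if marked[item] == key:
            cells.append(" ")           # blank = correct response
            correct += 1
        else:
            cells.append(marked[item])  # letter = incorrect option marked
    percent = round(100 * correct / len(answer_key))  # number = Percent Correct
    print(f"{student:<10} {percent:>3}   " + " ".join(cells))
```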

Average Percent Correct scores for the nation, system, and class can be used to compare:
Class to Nation
System to Nation
Class to System
Students to the whole Class
Average percent correct scores for the entire Reading test show that the class did as well as the system and the nation.
Average percent correct scores for the Reading subtests show that, in general, the class performed similarly to the system and slightly below the nation.

This report can be used to help detect strengths and weaknesses of:
Individual students
The class as a whole
These students answered:
All “Central ideas & their support” items incorrectly
The same on “Synthesizing/summarizing” items
Correctly on all “Connecting/extending ideas” items
This group of students may need more help grasping the first two concepts than the third.

Students either answered this item correctly or they chose incorrect option “C”. Further information about the item would allow the teacher to decide the best way to remedy this shared misunderstanding.

This student appears to be performing well on the Reading content in general. His incorrect answers to “Author’s Craft” items seem to be similar to other students’ incorrect responses. Perhaps these patterns indicate a misunderstanding shared among students. More information about the content covered by these items could help the teacher decide how to address such confusion.
This student appears to be struggling with the Reading content in general. However, he answered only one “Understand stated information” item incorrectly, and he answered all “Connecting/extending ideas” items correctly. This information is useful because it shows the areas in which he is more successful, allowing the teacher to decide how best to address the areas where he is struggling.
How is it possible for Estavan Perez to have a 90 in Key Ideas but to have answered all items correctly?

Class Item Analysis
Provides a deeper diagnostic review of performance
The items that are most difficult for a class can be identified
Analysis of the most difficult items may be beneficial in understanding why test performance was low
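As an illustration of identifying the most difficult items, the following minimal Python sketch ranks items by the percent of the class answering them correctly. The item labels and scored responses are hypothetical, not actual report data:

```python
# Minimal sketch of ranking items by class difficulty. Scored responses use
# 1 = correct, 0 = incorrect; items and scores are invented for illustration.

scored_responses = {
    "Item 1": [1, 1, 0, 1, 1],
    "Item 2": [0, 1, 0, 0, 1],
    "Item 3": [1, 0, 0, 0, 0],
}

# Class percent correct per item (lower = more difficult for this class).
difficulty = {
    item: round(100 * sum(marks) / len(marks))
    for item, marks in scored_responses.items()
}

# List the hardest items first; these are the first candidates for a closer look.
for item, percent in sorted(difficulty.items(), key=lambda pair: pair[1]):
    print(f"{item}: {percent}% of the class answered correctly")
```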

These graphs show the difference between the class average and the national average for each skill. A “+” or “–” indicates that the difference is larger than 20 points. The graphs are useful for seeing stronger and weaker skill areas at a glance. This report allows for comparisons among the class, building, system, and nation, which can be useful for identifying skills that need extra attention or improvement, as well as skill areas that are well understood.
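The flagging rule described above can be sketched in a few lines of Python. The skill names below appear in this report, but the class and national averages are hypothetical, and the actual report’s calculation may differ:

```python
# Minimal sketch of the "+/-" flag: mark a skill when the class average
# differs from the national average by more than 20 points.
# The averages below are invented for illustration.

skill_averages = {
    # skill: (class average % correct, national average % correct)
    "Understand stated information": (78, 55),
    "Central ideas and their support": (40, 63),
    "Connecting/extending ideas": (70, 66),
}

for skill, (class_avg, nation_avg) in skill_averages.items():
    diff = class_avg - nation_avg
    if diff > 20:
        flag = "+"   # class well above the nation
    elif diff < -20:
        flag = "-"   # class well below the nation
    else:
        flag = " "   # difference within 20 points: no flag
    print(f"{flag} {skill}: class {class_avg}%, nation {nation_avg}%, diff {diff:+d}")
```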

The graphs illustrate that this class performed similarly to or much better than the nation on “Understand stated information” items. On average, the class performed lower than the nation on items covering “Central ideas and their support”. However, in general, the class performed similarly to the building and the system. On items 9 and 12, the class did better than both the building and the system.