2013-2014 Test Irregularity Summary Report


Test Irregularity Summary Report

Agenda
Overview, Purpose, and Background
Test Irregularity Summary Report

Overview and Purpose
This presentation is designed to provide district staff with an overview and understanding of the Test Irregularity Summary Report.
Purpose: Through this overview, district staff will be able to understand:
(a) the components and data that comprise the summary report
(b) how the summary report can be used to plan and improve

Background: Test Security
The Department has annually provided districts with reports of testing irregularities and voids during testing and scoring windows. These reports have traditionally been specific to each assessment. In prior years, the Department has also worked to provide schools and districts greater detail as needed for planning and improvement purposes. This year, to better support LEAs in ensuring test security and integrity, the Department will provide annual summary reports in addition to in-window void reports. The Test Irregularity Summary should be used to inform test security training. Test Irregularity Summary Reports can be found in FTP folders.

Bulletin 118 requires LEAs to develop and adopt district test security policies that provide for:
The security of test materials, including storage
Training of test coordinators and administrators
Investigation of irregularities
Procedures for monitoring test sites

Agenda
Overview, Purpose, and Background
Test Irregularity Summary Report

Test Irregularity Summary Report: Purpose
The Test Irregularity Summary Report was designed to provide district leaders with the data and information necessary to improve test security and integrity. The components in the report include:
Voids resulting from plagiarism, reported incidents of test security violations, and administrative errors
Voids and flags revealed by erasure analysis
Retests administered due to administrative error
EOC test sessions reopened

Using Summary Reports
Leaders can use Summary Reports to:
Identify areas of security concern (e.g., plagiarism, erasure)
Identify areas of administrative concern (e.g., administrative errors, technology)
Such information is helpful in:
Evaluating current policies and procedures related to test security
Providing appropriate support and training in test administration procedures
Addressing technology issues as appropriate

Test Irregularity Summary Report: Sections
The Summary Report is divided into the following sections:
1. Test Scores Voided for Students and Schools
2. Erasure Voids/Flags for Schools
3. Administrative Error Retests
4. EOC Sessions Reopened

Section 1: Test Scores Voided
Student test scores are voided when violations of test administration policy occur. Student test scores may be voided due to:
(a) plagiarism,
(b) reported test security violations, or
(c) administrative errors that occur during the testing process.
The following tables are examples of summary data that provide the number of schools with voided test scores, the number of tests voided, and the number of voids by reason and by testing program.

Section 1: District Summary (Component: Description)
Number of Schools: Number of schools with voids
Number of Tests: Number of tests voided
Plagiarism Voids: Number of tests voided due to plagiarism identified in the scoring process
Reported Prohibited Behavior: Number of voids self-reported by the district due to test irregularities
Administrative Error Voids: Number of voids resulting from administrative errors (e.g., scheduling errors, accommodation misuse)
Number Voided by Testing Program: Number of voids broken down by testing program

Section 1: School Level Detail (Component: Description)
Number of Tests: Number of tests voided at each school
Reason for Void: Reason for the void (Admin Error, District, Plagiarism)
Test Program: Test program in which the void occurred
Grade: Grade level in which the void occurred
Subject: Subject area in which the void occurred

Section 2: Erasure Analysis
Erasure analysis is a data forensic technique designed to detect possible tampering with student answer documents through an examination of excessive wrong-to-right erasures. Statistical analyses were conducted to determine where the number of wrong-to-right erasures was improbable (i.e., a 1 in 10,000 chance or less). Students who had an improbable number of wrong-to-right erasures but did not meet the void criteria were flagged. By district and school, the tables below show the number of students identified as having wrong-to-right erasure counts significantly higher than the state average, and whose results were voided or flagged for school accountability.
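The report does not specify the statistical model behind the 1-in-10,000 threshold. As a minimal sketch only, one common approach in erasure analysis is a one-sided binomial tail test: given a student's total erasures and a baseline wrong-to-right rate (here a hypothetical 25% state rate), compute the probability of seeing at least the observed number of wrong-to-right erasures by chance. The function names and the `p_wr` baseline below are illustrative assumptions, not the Department's actual method.

```python
from math import comb

def wr_erasure_pvalue(n_erasures, n_wr, p_wr):
    """Probability of observing at least n_wr wrong-to-right erasures
    out of n_erasures, if each erasure is independently wrong-to-right
    with baseline probability p_wr (upper binomial tail)."""
    return sum(comb(n_erasures, k) * p_wr**k * (1 - p_wr)**(n_erasures - k)
               for k in range(n_wr, n_erasures + 1))

def flag_student(n_erasures, n_wr, p_wr=0.25, threshold=1e-4):
    """Flag a student whose wrong-to-right erasure count is improbable
    (a 1-in-10,000 chance or less) under the baseline rate.
    p_wr=0.25 is a hypothetical state-average rate, for illustration."""
    return wr_erasure_pvalue(n_erasures, n_wr, p_wr) <= threshold

# 18 wrong-to-right erasures out of 20 is wildly improbable at a 25%
# baseline and would be flagged; 6 out of 20 is unremarkable.
print(flag_student(20, 18))  # True
print(flag_student(20, 6))   # False
```

In practice such a screen would be run per answer document, and flagged counts would then be compared against the separate void criteria the report describes.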

Section 2: District Summary (Component: Description)
Number of Schools: Total number of schools with erasure analysis voids and flags
Number of Tests Voided/Flagged: Total number of tests voided or flagged based on erasure analysis
LEAP Voided: Total number of LEAP tests voided for excessive erasures
LEAP Flagged: Total number of LEAP tests flagged for excessive erasures, but not voided
iLEAP Voided: Total number of iLEAP tests voided for excessive erasures
iLEAP Flagged: Total number of iLEAP tests flagged for excessive erasures, but not voided

Section 2: School Level Detail (Column Under Subject Test: Column Value)
Number of Tests Voided: Total number of tests voided for excessive erasures
Number of Tests Flagged: Total number of tests flagged for excessive erasures, but not voided
Test Administration: Test administration in which the voids and/or flags occurred
Grade: Grade level at which the voids and/or flags occurred
Subject: Subject area in which the voids and/or flags occurred

Section 3: Administrative Error Retest
When tests are administered incorrectly, students' results are voided. However, for the high-stakes testing programs (LEAP, GEE, LAA 2, and EOC), students are afforded the opportunity to take an administrative error retest. Common administrative errors include administering the wrong test, errors in accommodation administration, and scheduling errors. The following tables outline the number of schools that administered administrative error retests and the total number of students who took administrative error retests, by testing program.

Section 3: District Summary (Column Under Subject Test: Column Value)
Number of Schools: Total number of schools with administrative error retests
Number of Tests: Total number of tests with administrative error retests
EOC: Number of EOC tests resulting in administrative error retests
LEAP: Number of LEAP tests resulting in administrative error retests

Section 3: School Level Detail (Column Under Subject Test: Column Value)
Number of Tests: Total number of tests with administrative error retests
Test Program: Test program with administrative error retests
Test Administration: Test administration in which the retests occurred
Grade: Grade level at which the retests occurred
Subject: Subject area in which the retests occurred

Section 4: EOC Sessions Reopened
End-of-Course (EOC) exams are administered online for high school subjects. During each administration, test administrators may occasionally need to reopen a test session to allow a student to complete the test. Reasons for reopening a session include:
technology issues
administration of accommodations
allowing additional time for completion of the assessment
When an EOC test session has to be reopened, the reason for reopening the session must be indicated in the EOC test system. The following tables report the total number and percent of students whose test sessions were reopened during the May 2014 administration, as well as the reasons reported for reopened sessions.

Section 4: Number and Percent of Sessions Reopened (Column Under Subject Test: Column Value)
Number of Sessions Opened: Number of test sessions opened, listed by state, district, and school
Number of Sessions Reopened: Number of test sessions reopened, listed by state, district, and school
Percent of Sessions Reopened: Percent of test sessions reopened, listed by state, district, and school

Section 4: Self-Reported Reasons for Sessions Reopened (Column Under Subject Test: Column Value)
Sessions Reopened: Number of test sessions reopened, listed by state, district, and school
Lost Internet Connection: Lost internet connection reported as the reason for reopening the session
Computer Crashed: Computer crash reported as the reason for reopening the session
Lost Power: Power loss reported as the reason for reopening the session
More Time: Additional time needed by the student reported as the reason for reopening the session
Accommodation: Accommodations reported as the reason for reopening the session
Emergency: Emergency situation reported as the reason for reopening the session
Illness: Student illness reported as the reason for reopening the session
Makeup Test: Makeup test reported as the reason for reopening the session
Other Reasons: Other reasons reported for reopening the session
Multiple Reasons: Multiple reasons reported for reopening the session

Next Steps for Districts
Use Summary Reports to:
Identify areas of security concern (e.g., plagiarism, erasure)
Identify areas of administrative concern (e.g., administrative errors, technology)
Such information is helpful in:
Evaluating current policies and procedures related to test security
Providing appropriate support and training in test administration procedures
Addressing technology issues as appropriate