Best Practices in Quality Test Administration

Similar presentations
Presented to: CCSSO 2012 National Conference on Student Assessment June 27, 2012 Quality Assurance in a Chaotic World An External Perspective.

MONITORING OF SUBGRANTEES
MSCG Training for Project Officers and Consultants: Project Officer and Consultant Roles in Supporting Successful Onsite Technical Assistance Visits.
Effective Contract Management Planning
Using Home Base/Schoolnet to Deliver Assessments CTE Summer Conference July 14, 2014.
Quality evaluation and improvement for Internal Audit
ENVIRONMENTAL DATA MANAGEMENT & SHALE GAS PROGRAMS INTERNATIONAL PETROLEUM ENVIRONMENTAL CONFERENCE NOVEMBER 14, 2013.
1a Job Descriptions for Personnel Involved in PAT Implementation Materials Developed by The IRIS Center, University of Maryland.
Developing Monitoring and Pre-Scoring Plans for Alternate/Alternative Assessments Virginia Department of Education Division of Student Assessment and School.
Aligning Academic Review and Performance Evaluation (AARPE)
COMPANY CONFIDENTIAL Page 1 Final Findings Briefing Client ABC Ltd CMMI (SW) – Ver 1.2 Staged Representation Conducted by: QAI India SM - CMMI is a service.
Tool for Assessing Statistical Capacity (TASC) The development of TASC was sponsored by United States Agency for International Development.
Software Quality Assurance Activities
NAEP Alliance NAEP Technology Based Assessments National Conference on Student Assessment June 22, :00-3:30.
Update on Progress in Preparations for ANA 2013 Meeting of the National Consultative Forum 20 August 2013.
Verification: Quality Assurance in Assessment Verification is the main quality assurance process associated with assessment systems and practice - whether.
Waiting Room  Today’s webinar will begin shortly. REMINDERS: Dial and enter the passcode # to hear the audio portion of the presentation.
Department of Innovation & Technology City of Boston Five Key Ways to Be a Successful Project Manager March 2014.
Quality Activity Matrix Presented by Sandra Toalston President, SanSeek 1.
Developing a Monitoring and Pre-Scoring Plan for the Virginia Grade Level Alternative (VGLA) Adapted from the Virginia Department of Education Division.
The IEP: Drafting the IEP (Steps 1, 2, 3, and 4) Southwest Ohio Special Education Regional Resource Center Tuesday, November 7, 2006.
SOLUTION What kind of plan do we need? How will we know if the work is on track to be done? How quickly can we get this done? How long will this work take.
Innovation Software Corporation's Cultural Awareness Training Program Presentation by:
Evaluate Phase Pertemuan Matakuliah: A0774/Information Technology Capital Budgeting Tahun: 2009.
Making PROGRES Toward Institutional Sustainability Capacity Strengthening Session I Judith B. Seltzer, MPH, MBA Eliana Monteforte, MPH.
BSBPMG501A Manage Project Integrative Processes Manage Project Integrative Processes Project Integration Processes – Part 2 Diploma of Project Management.
Summative Assessment Welcome We will wait a few minutes for participants to log on and call in. –Call in: –Pass code: *6 to.
2012 TELPAS Online Testing & Data Collection. Disclaimer  These slides have been prepared by the Student Assessment Division of the Texas Education Agency.
Planning Engagement Kickoff
QUEEN’S UNIVERSITY RECRUITMENT MANAGEMENT SYSTEM
The Butterfly Effect: How Small Changes Improve the Big Picture
Quality assurance in population and housing census SUDAN’s EXPERIANCE in QUALITY assurance of Censuses By salah El din. A . Magid OUR EXPERIANCE IN 5.
Maintaining Windows Server 2008 File Services
Case Management System
Welcome to Gateway to Data (G2D)
ITS Integrator User Certification Program
Introducing Version 9 for Security Suite and SAINT Cloud
Continuous Improvement Planning – Informal Needs Assessment
Sponsored by the RMS Center
Description of Revision
Project Roles and Responsibilities
THE ASSISTMENT SYSTEM DEMO
Introduction to Gateway to Data (G2D)
Students Welcome to “Students” training module..
Family Engagement Coordinator Meeting July 25, 2018
Math-Curriculum Based Measurement (M-CBM)
Measuring Project Performance: Tips and Tools to Showcase Your Results
Vineland Public Schools Our PARCC Experience
Adopting a patient pre-registration process
Usability Research: Lessons Learned For Digital Assessment Delivery Cathy Wendler Educational Testing Service June 21, 2013.
District Test Coordinators Meeting
Regional Meetings for Teachers of the Deaf Spring 2014
By Jeff Burklo, Director
ToolPack Milestones, Documentation, and Support Strategy
School Improvement Plans
Standards-based Individualized Education Program Module Four: Assessing and Reporting Student Progress SBIEP Module Four: Assessing and Reporting Student.
GSBPM AND ISO AS QUALITY MANAGEMENT SYSTEM TOOLS: AZERBAIJAN EXPERIENCE Yusif Yusifov, Deputy Chairman of the State Statistical Committee of the Republic.
Fiscal policy program Presented by Cindy Draper, Fiscal Policy Officer – Training Days 2018 Introduce myself This session is to provide an overview of.
{Project Name} Organizational Chart, Roles and Responsibilities
Accommodations Required Content for STC and TA Training
Click Training Safety Module
STEPS Site Report.
Data for PRS Monitoring: Institutional and Technical Challenges
WHERE TO FIND IT – Accessing the Inventory
NMDWS Internship Portal
Presentation transcript:

Best Practices in Quality Test Administration Lessons Learned in the Administration of the National Assessment of Educational Progress (NAEP) Gina Broxterman, National Center for Education Statistics (NCES) June 20, 2016

NAEP Quality Assurance (QA): Purpose
- Monitoring NAEP processes across the entire assessment program
- Providing independent checks and feedback for various processes
- Ensuring continuous improvement of the NAEP program
- Ensuring standardization of the assessment across the nation

NAEP QA: Pre-Administration
- Item development
  - Items go through a multi-step quality assurance (QA) process before they are used in the operational assessment
- Training of field staff
  - Field staff are required to complete a combination of distance and in-person training
- Transition to digitally based assessments
  - Configuration of equipment
  - Safeguards of delivery mode
  - Safeguards for system challenges

NAEP Assessment Administration

NAEP QA: Administration Best Practices
- NAEP Activities: series of quality assurance debriefs
- Item Development: quality assurance reviews of items
- Design, Analysis and Reporting: review of performance data
- Sample and Data Collection:
  - NAEP field staff fill out debriefing questionnaires
  - Contractors host an in-person debrief meeting with all stakeholders to discuss all facets of the NAEP 2016 administration
  - From the observation data, the Sampling and Design contractor develops action items that will improve implementation for upcoming administrations

NAEP QA: Administration Best Practices
NAEP State Coordinators (NSC)
- Conduct observations
- Provide feedback on the uniformity of the administration across all states
- Embed QA practices in NSC data collection processes
  - Pre- and post-administration demographic data reviews
- Convene working groups to revise processes for NAEP administration
- Identified issue: the 2015 MyNAEP system was cumbersome for users
- Action plan: reduce jargon; reduce the number of pages; add worksheets to streamline schools' data collection; and provide multiple options for updating student lists
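The pre- and post-administration demographic review described above can be sketched as a simple record-comparison check. This is an illustrative example only: the student IDs, field names, and record values are invented, not NAEP's actual data layout.

```python
# Hypothetical sketch of a pre-/post-administration demographic data review:
# compare student records captured before and after the session and flag any
# fields that changed, so a coordinator can follow up. All data is invented.

PRE = {
    "S001": {"grade": "4", "sex": "F", "lunch_status": "eligible"},
    "S002": {"grade": "4", "sex": "M", "lunch_status": "not eligible"},
}
POST = {
    "S001": {"grade": "4", "sex": "F", "lunch_status": "eligible"},
    "S002": {"grade": "8", "sex": "M", "lunch_status": "not eligible"},
}

def demographic_discrepancies(pre, post):
    """Return {student_id: [(field, pre_value, post_value), ...]} for mismatches."""
    issues = {}
    for sid, pre_rec in pre.items():
        post_rec = post.get(sid, {})
        diffs = [(field, value, post_rec.get(field))
                 for field, value in pre_rec.items()
                 if post_rec.get(field) != value]
        if diffs:
            issues[sid] = diffs
    return issues

print(demographic_discrepancies(PRE, POST))
# Flags S002's grade change for coordinator review.
```

In practice such a check would run against the NSC data collection system's exports, but the core idea is the same: any field that differs between the two snapshots becomes a review item rather than a silent data error.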

NAEP QA: Administration Best Practices
HumRRO
- Facilitates the Quality Assurance Technical Panel (QATP), which provides technical advice to NCES about potentially problematic NAEP issues, and designs and conducts quality assurance research studies
- Reviews NAEP contractor quality control plans and provides recommendations for improvements to NAEP operations and processes

Lessons Learned

NAEP: Lessons Learned - Translation
- Issue: The development schedule did not allow for a thorough translation process
  Action Plan: Conduct translations earlier in the item development process
- Issue: Inefficient process for training NAEP scorers
  Action Plan: Develop an annotated guide that specifies purpose, style, format, and expected level of detail, including the extent to which student responses should be quoted

NAEP 2016: Lessons Learned - DBA
- Issue: Standardizing software and updates across a large volume of equipment created potential risk
  Action Plan: Associate the tablet's login screen background color with a particular software version so that field staff can easily identify the version type
- Issue: Tablets and the Pelican case were hard to transport and lift
  Action Plan: Create a best-practices guide, added to the field staff manual, on the best ways to transport the Pelican case in various school settings
- Issue: The session script for DBA assessments was not developmentally appropriate for fourth-grade students
  Action Plan: Develop grade-specific session scripts that provide additional guidance for fourth-grade students
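The version-color idea above can be sketched in a few lines: map each supported build to a distinct login-screen color so a stale or unknown build is visible at a glance. The version strings and color table here are invented for illustration; they are not NAEP's actual values.

```python
# Illustrative sketch of version identification by login-screen color.
# Version numbers and colors are hypothetical, not actual NAEP builds.

VERSION_COLORS = {
    "2016.1": "blue",
    "2016.2": "green",
    "2016.3": "orange",
}

def login_background(version, expected_version="2016.3"):
    """Return the login background color for a build; red marks unknown builds."""
    color = VERSION_COLORS.get(version, "red")
    if version != expected_version:
        # A field staffer seeing the wrong color knows the tablet needs an update.
        print(f"WARNING: tablet runs {version}, expected {expected_version}")
    return color
```

The design point is that the check requires no menus or diagnostics: a room of tablets showing mixed colors immediately signals a standardization problem.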

NAEP 2016: Lessons Learned - Training
- Issue: Creation of materials at the field staff level opened the potential for quality assurance issues
  Action Plan: Develop additional quick-check fact sheets for NAEP 2017 and organize them in a central location for field staff
- Issue: Field staff needed more hands-on practice during in-person trainings
  Action Plan: Incorporate more hands-on practice with the field equipment

Future NAEP Administration: Quality Assurance

Future NAEP Administrations: QA Goals
Embed QA with technology into processes
- Quality Assurance Dashboard:
  - Develops a set of metrics used to track and evaluate the success of assessment administration
  - Capitalizes on technology to track assessments in progress and understand when and why issues arise
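A minimal sketch of the dashboard idea: roll session records up into a few tracking metrics (completion rate, technical-issue rate) so problems surface while administrations are still in progress. The record fields and values below are hypothetical, not NAEP's actual schema.

```python
# Hypothetical QA dashboard metrics computed from per-session records.
from collections import Counter

SESSIONS = [
    {"school": "A", "status": "completed",   "tech_issue": False},
    {"school": "A", "status": "completed",   "tech_issue": True},
    {"school": "B", "status": "in_progress", "tech_issue": False},
    {"school": "B", "status": "completed",   "tech_issue": False},
]

def qa_metrics(sessions):
    """Summarize session records into dashboard-style tracking metrics."""
    total = len(sessions)
    status_counts = Counter(s["status"] for s in sessions)
    issue_count = sum(1 for s in sessions if s["tech_issue"])
    return {
        "sessions": total,
        "completion_rate": status_counts["completed"] / total,
        "tech_issue_rate": issue_count / total,
    }

print(qa_metrics(SESSIONS))
# With the sample data: 4 sessions, 75% complete, 25% with a technical issue.
```

Refreshing such metrics as sessions report in is what lets a dashboard show when and why issues arise, rather than waiting for a post-administration debrief.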