Best Practices in Quality Test Administration: Perspectives from Three Diverse Assessment Systems

Presented at: Council of Chief State School Officers National Conference on Student Assessment, Philadelphia, PA, June 20, 2016

Presenters:
Andrea Sinclair, HumRRO
Robert Lee, MCAS Chief Analyst & PARCC Coordinator
Eric Zilbert, CAASPP Evaluation Contract Monitor
Gina Broxterman, NCES

Presentation Overview
● Session objectives
● HumRRO's third-party evaluator role
● HumRRO's approach to investigating quality test administration for:
– Partnership for Assessment of Readiness for College and Careers (PARCC)
– California High School Exit Examination (CAHSEE)
– National Assessment of Educational Progress (NAEP)
● Lessons learned and best practices in test administration, presented from the perspective of representatives from...
– PARCC
– CAHSEE
– NAEP
● Q&A

Session Objectives
● Discuss approaches to investigating quality test administration
● Discuss lessons learned regarding best practices in test administration from three diverse assessment systems

The usefulness and interpretability of test scores require that tests be administered according to established procedures; without such standardization of test administration, the accuracy and comparability of score interpretation are impaired, threatening the validity of interpretations of these scores for their intended uses (AERA, APA, NCME, 2014, p. 111).

HumRRO's Third-Party Evaluator Role
● HumRRO
– An independent non-profit applied research organization
– Diverse staff of education researchers, statisticians, and industrial/organizational psychologists
● HumRRO is well respected for conducting...
– Independent verification
  Replicate psychometric processing of test scores and other results
– Program evaluation
  Impact and fidelity of educational programs
– Validity studies
  Test content, e.g., alignment studies
  Test results, e.g., classification accuracy
  Test impact, e.g., intended/unintended consequences
– Quality assurance and audits
  Verify processes are effective, complete, and efficient
  Identify risks and make recommendations to strengthen systems
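To make the "independent verification" bullet concrete, here is a minimal sketch of replicating a vendor's scoring and flagging disagreements, assuming a simple number-correct scoring rule and CSV files keyed by a student_id column. The file layout, column names, and function names are illustrative assumptions, not HumRRO's actual procedure.

```python
# A hypothetical independent-verification pass: recompute raw scores from
# item responses and compare them to the scores the vendor reported.
import csv

def raw_score(item_responses, answer_key):
    """Number-correct raw score for one student's multiple-choice responses."""
    return sum(1 for item, resp in item_responses.items()
               if resp == answer_key.get(item))

def verify_scores(responses_path, vendor_scores_path, answer_key):
    """Flag every student whose recomputed score disagrees with the vendor's."""
    with open(vendor_scores_path) as f:
        vendor = {row["student_id"]: int(row["raw_score"])
                  for row in csv.DictReader(f)}
    mismatches = []
    with open(responses_path) as f:
        for row in csv.DictReader(f):
            sid = row.pop("student_id")   # remaining columns are item responses
            ours = raw_score(row, answer_key)
            if vendor.get(sid) != ours:
                mismatches.append((sid, vendor.get(sid), ours))
    return mismatches  # an empty list means the vendor's scoring replicated exactly
```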

PARCC: Approach to Investigating Test Administration
● Mixed-methods approach, involving:
– Survey of test administrators
  “Item Try Outs,” summer 2013
  Field Test, spring 2014
  First operational year, spring 2015
– Survey of student test takers
  “Item Try Outs,” summer 2013
  Field Test, spring 2014
  First operational year, spring 2015
– Observations of administrations of computer-based tests, paper-based tests, and accommodated sessions
  Field Test, spring 2014
  First operational year, spring 2015
– On-site interviews with test administrators
  Field Test, spring 2014
  First operational year, spring 2015

Data collection activities guided by the PARCC TOA

[Figure 1. Claims from the Administration Phase of the PARCC Theory of Action (TOA); figure not reproduced in this transcript.]

*The Quality of Test Administration research studies focus specifically on Test Administrators (TAs). Consequently, Claims 1 and 2 were investigated specifically for TAs.

Research questions mapped onto claims from the TOA
● Do the test administrators (TAs) find the instructions clear, sufficiently detailed, and easy to follow? (addresses Standard 4.15)
● Do the TAs follow the protocols and instructions? (addresses Standards 6.1 & 6.2)
● Do the students appear to understand the instructions provided to them? (addresses Standards 4.16 & 6.5)
● Are students engaged in taking the test? (addresses Standards 4.16, 6.3, 6.4, & 6.5)
● Is there disruptive student behavior during the session? (addresses Standard 6.4)
● Were there any interruptions (e.g., technology-related problems) during the session? (addresses Standards 6.3 & 6.4)
● What types of questions do students ask during the test administration? (addresses Standards 4.16 & 6.5)
● If interruptions or other problems occurred, did the TAs deal with the issue appropriately and effectively? (addresses Standards 6.1 & 6.3)
● Was the security of test materials maintained at all times? (addresses Standards 6.6 & 6.7)
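Purely as an illustration of how this crosswalk can be operationalized for analysis, the sketch below encodes the question-to-Standards mapping above so observation evidence can be tallied by Standard. The shortened question labels and the function name are assumptions, not part of the PARCC study design.

```python
# Question-to-Standards crosswalk, taken from the slide above.
RQ_TO_STANDARDS = {
    "instructions clear to TAs": ["4.15"],
    "TAs follow protocols": ["6.1", "6.2"],
    "students understand instructions": ["4.16", "6.5"],
    "students engaged": ["4.16", "6.3", "6.4", "6.5"],
    "disruptive behavior": ["6.4"],
    "interruptions": ["6.3", "6.4"],
    "student questions": ["4.16", "6.5"],
    "problems handled effectively": ["6.1", "6.3"],
    "security maintained": ["6.6", "6.7"],
}

def questions_addressing(standard):
    """Return the research questions that bear on a given Standard."""
    return [rq for rq, stds in RQ_TO_STANDARDS.items() if standard in stds]

print(questions_addressing("6.4"))
# ['students engaged', 'disruptive behavior', 'interruptions']
```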

Challenges for PARCC Test Administration Investigations
● Scope of the project
● Many interested parties with varying viewpoints
● High profile

Approach to Investigating CAHSEE Test Administration
● CAHSEE tests
– ELA test: multiple-choice items and an essay question
– Mathematics test: all multiple choice
– Students were required to pass both tests, in addition to satisfying local graduation requirements, to earn a high school diploma
– Tests were administered annually to grade ten students (census) for accountability
– Tests were administered multiple times a year to grade eleven and twelve students
● Mixed-methods approach that supplemented formal audits by the test administration contractor, involving:
– Observations of administrations of paper-based tests
  Initial observations at program start-up
  Variety of test settings (e.g., classroom, cafeteria, library, gym)
  Targeted types of sessions, including accommodated sessions for students with disabilities and sessions with test variations for English learners
– In-person interviews with test site coordinators
  Conducted annually during census administrations, 2008–2015, in February or March

Topics Investigated by CAHSEE Data Collection Activities (1)
● Do the test administrators (TAs) find the instructions clear, sufficiently detailed, and easy to follow? (addresses Standard 4.15)
– Is in-person training provided in addition to manuals for administration?
– Are instructions for providing testing variations and accommodations clear?
● Do the TAs follow the protocols and instructions? (addresses Standards 6.1 and 6.2)
– Are TAs appropriately allowing additional time for each part of each test?
– Are TAs providing testing variations and accommodations in accordance with protocols?
● Do the students appear to understand the instructions provided to them? (addresses Standards 4.16 and 6.5)
● Are students engaged in taking the test? (addresses Standards 4.16, 6.3, 6.4, and 6.5)
● Are the testing conditions adequate? (addresses Standard 6.4)
– Do test settings provide adequate work space, lighting, and noise control?
– Are large-group test settings adequately monitored?
– Is there any disruptive student behavior during testing?

Topics Investigated by CAHSEE Data Collection Activities (2)
● Were there any attempts to record or copy test materials, including the test questions, by students or others? (addresses Standards 6.6 and 6.7)
● What types of questions, if any, do students ask during the test administration? (addresses Standards 4.16 and 6.5)
● Were there any interruptions during the session? (addresses Standards 6.3 and 6.4)
● If any disruptions, interruptions, or other problems occurred, did the TAs deal with the issue appropriately and effectively? (addresses Standards 6.1 and 6.3)
● Was the security of test materials maintained at all times? (addresses Standards 6.6 and 6.7)
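One hypothetical way to capture the topics listed in the two slides above as a consistent, analyzable observation record is a simple structured type; every field name here is an illustrative assumption, not the CAHSEE observation protocol.

```python
# Sketch of a per-session observation record covering the CAHSEE topics above.
from dataclasses import dataclass, field

@dataclass
class ObservationRecord:
    school_id: str
    session_type: str              # e.g., "standard", "accommodated", "EL variation"
    instructions_clear: bool       # Standard 4.15
    protocols_followed: bool       # Standards 6.1, 6.2
    students_engaged: bool         # Standards 4.16, 6.3, 6.4, 6.5
    conditions_adequate: bool      # Standard 6.4: work space, lighting, noise
    security_maintained: bool      # Standards 6.6, 6.7
    interruptions: list[str] = field(default_factory=list)
    notes: str = ""
```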

Challenges for CAHSEE Test Administration Investigations
● The contract called for a very limited number of observations: one or two schools per year
● Site visits required coordination with the operational test vendor to avoid visiting a school that was being formally audited
● The printing cycle for test administration manuals precluded rapid incorporation of recommendations

Approach to Investigating NAEP Test Administration
● NAEP assessment administrations
– Conducted annually from late January to early March
– National and state samples
– Various content areas, including mathematics, reading, science, U.S. history, civics, geography, and the arts
– Currently transitioning from paper-based to digitally based assessments
● Original approach: observe a random sample of schools (varied by, e.g., geographic location, grade, content area, and type of assessment) to assess the fidelity of administration procedures carried out by NAEP field staff
● Current approach: research-based, using observations to target specific questions of interest (e.g., to what extent might tablet use produce unexpected differential outcomes across schools with differing characteristics? has decreased time spent on the arts prevented students from enhancing their knowledge of the subject?)
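As a rough sketch of the sampling logic behind the original observation approach, the code below draws a stratified random sample of schools. The strata fields and data layout are assumptions for illustration, not NAEP's actual sampling frame or procedure.

```python
# Stratified random draw of schools to observe, up to n per stratum.
import random
from collections import defaultdict

def stratified_school_sample(schools, strata_keys, n_per_stratum, seed=0):
    """schools: list of dicts, e.g.
    {"id": "S1", "region": "West", "grade": 8, "subject": "math", "mode": "tablet"}
    """
    rng = random.Random(seed)  # fixed seed keeps the draw reproducible
    strata = defaultdict(list)
    for school in schools:
        strata[tuple(school[k] for k in strata_keys)].append(school)
    sample = []
    for group in strata.values():
        sample.extend(rng.sample(group, min(n_per_stratum, len(group))))
    return sample

# e.g., one school per (region, grade, subject, mode) cell:
# picks = stratified_school_sample(frame, ["region", "grade", "subject", "mode"], 1)
```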

Topics Investigated by NAEP Data Collection Activities
● Original approach
– Did field staff prepare and arrange the room to ensure a conducive testing environment?
– Did field staff administer NAEP according to the procedures on which they were trained?
– Did field staff maintain the security of testing materials at all times?
● Current approach
– Digitally based assessment
  What types of tablet software and/or hardware issues arose in low- and high-SES schools?
  What types of student interactions with the tablets occurred in low- and high-SES schools?
  Did students in low- and high-SES schools ask the same types of questions regarding testing mode during the administration?
  Were administration set-up procedures operationalized differently by jurisdictions at different SES levels?
– Arts paper-based assessment
  What types of student questions about arts content and/or associated materials occurred, and with what frequency?
  How did students engage with the arts assessment (e.g., actively used materials, answered test questions, appeared distracted, did not engage with materials)?
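For questions like whether low- and high-SES schools see the same mix of student questions, a simple starting point is a chi-square test on the observed counts. The sketch below uses invented placeholder counts and question categories, not NAEP data.

```python
# Compare the distribution of question types across SES groups.
from scipy.stats import chi2_contingency

# Rows: SES group; columns: question type (navigation, hardware, content).
counts = [
    [34, 12, 21],  # low-SES schools (placeholder counts)
    [28, 15, 19],  # high-SES schools (placeholder counts)
]
chi2, p, dof, expected = chi2_contingency(counts)
print(f"chi2 = {chi2:.2f}, p = {p:.3f}")  # a small p suggests the mix of question types differs by SES group
```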

Challenges for NAEP Test Administration Investigations
● The contract includes a limited number of observational site visits each year, making it difficult to identify patterns or emerging issues from a snapshot of administrations
● Investigation of certain topics is limited by the requirement that observations be unobtrusive

Next up: Lessons Learned
● PARCC
● CAHSEE
● NAEP