
1 66 Canal Center Plaza, Suite 700, Alexandria, VA 22314-1578 | Phone: 703.549.3611 | Fax: 703.549.9025 | www.humrro.org
Best Practices in Quality Test Administration: Perspectives from Three Diverse Assessment Systems
Presented at: Council of Chief State School Officers National Conference on Student Assessment, Philadelphia, PA, June 20, 2016
Presenters:
–Andrea Sinclair, HumRRO
–Robert Lee, MCAS Chief Analyst & PARCC Coordinator
–Eric Zilbert, CAASPP Evaluation Contract Monitor
–Gina Broxterman, NCES

2 Presentation Overview
● Session objectives
● HumRRO’s third-party evaluator role
● HumRRO’s approach to investigating quality test administration for:
–Partnership for Assessment of Readiness for College and Careers (PARCC)
–California High School Exit Examination (CAHSEE)
–National Assessment of Educational Progress (NAEP)
● Lessons learned and best practices in test administration, presented from the perspective of representatives from:
–PARCC
–CAHSEE
–NAEP
● Q&A

3 Session Objectives
● Discuss approaches to investigating quality test administration
● Discuss lessons learned regarding best practices in test administration from three diverse assessment systems

“The usefulness and interpretability of test scores require that tests be administered according to established procedures; without such standardization of test administration, the accuracy and comparability of score interpretation is impaired, threatening the validity of interpretations of these scores for their intended uses” (AERA, APA, NCME, 2014, p. 111).

4 HumRRO’s Third-Party Evaluator Role
● HumRRO
–An independent non-profit applied research organization
–Diverse staff of education researchers, statisticians, and industrial/organizational psychologists
● HumRRO is well respected for conducting...
–Independent verification
Replicate psychometric processing of test scores and other results
–Program evaluation
Impact and fidelity of educational programs
–Validity studies
Test content, e.g., alignment studies
Test results, e.g., classification accuracy
Test impact, e.g., intended/unintended consequences
–Quality assurance and audits
Verify processes are effective, complete, and efficient
Identify risks and make recommendations to strengthen systems

5 PARCC: Approach to Investigating Test Administration
● Mixed-methods approach, involving:
–Survey of test administrators
“Item Try Outs,” summer 2013
Field Test, spring 2014
First operational year, spring 2015
–Survey of student test takers
“Item Try Outs,” summer 2013
Field Test, spring 2014
First operational year, spring 2015
–Observations of administrations of computer-based tests, paper-based tests, and accommodated sessions
Field Test, spring 2014
First operational year, spring 2015
–On-site interviews with test administrators
Field Test, spring 2014
First operational year, spring 2015

6 Data collection activities guided by PARCC TOA
Figure 1. Claims from the Administration Phase of the PARCC Theory of Action (TOA).
*The Quality of Test Administration research studies focus specifically on Test Administrators (TAs). Consequently, Claims 1 and 2 were investigated specifically for TAs.

7 Research questions mapped onto claims from the TOA
● Do the test administrators (TAs) find the instructions clear, sufficiently detailed, and easy to follow? (addresses Standard 4.15)
● Do the TAs follow the protocols and instructions? (addresses Standards 6.1 & 6.2)
● Do the students appear to understand the instructions provided to them? (addresses Standards 4.16 & 6.5)
● Are students engaged in taking the test? (addresses Standards 4.16, 6.3, 6.4, & 6.5)
● Is there disruptive student behavior during the session? (addresses Standard 6.4)
● Were there any interruptions (e.g., technology-related problems) during the session? (addresses Standards 6.3 & 6.4)
● What types of questions do students ask during the test administration? (addresses Standards 4.16 & 6.5)
● If interruptions or other problems occurred, did the TAs deal with the issue appropriately and effectively? (addresses Standards 6.1 & 6.3)
● Was security of test materials maintained at all times? (addresses Standards 6.6 & 6.7)

8 Challenges for PARCC Test Administration Investigations
● Large scope of the project
● Many interested parties with varying viewpoints
● High profile

9 Approach to Investigating CAHSEE Test Administration
● CAHSEE tests
–ELA test: multiple-choice items and an essay question
–Mathematics test: all multiple choice
–Both tests had to be passed, in addition to satisfying local graduation requirements, to earn a high school diploma
–Tests administered annually to grade ten students (census) for accountability
–Tests administered multiple times a year to grade eleven and twelve students
● Mixed-methods approach that supplemented formal audits by the test administration contractor, involving:
–Observations of administrations of paper-based tests
Initial observations at program start-up
Variety of test settings (e.g., classroom, cafeteria, library, gym)
Targeted types of sessions, including accommodated sessions for students with disabilities and sessions with test variations for English learners
–In-person interviews with test site coordinators
2008–2015, annually during census administrations in February or March

10 Topics Investigated by CAHSEE Data Collection Activities (1)
● Do the test administrators (TAs) find the instructions clear, sufficiently detailed, and easy to follow? (addresses Standard 4.15)
–Is in-person training provided in addition to manuals for administration?
–Are instructions for providing testing variations and accommodations clear?
● Do the TAs follow the protocols and instructions? (addresses Standards 6.1 and 6.2)
–Are TAs appropriately allowing additional time for each part of each test?
–Are TAs providing testing variations and accommodations in accordance with protocols?
● Do the students appear to understand the instructions provided to them? (addresses Standards 4.16 and 6.5)
● Are students engaged in taking the test? (addresses Standards 4.16, 6.3, 6.4, and 6.5)
● Are the testing conditions adequate? (addresses Standard 6.4)
–Do test settings provide adequate work space, lighting, and noise control?
–Are large-group test settings adequately monitored?
–Is there any disruptive student behavior during testing?

11 Topics Investigated by CAHSEE Data Collection Activities (2)
● Were there any attempts to record or copy test materials, including the test questions, by students or others? (addresses Standards 6.6 and 6.7)
● What types of questions, if any, do students ask during the test administration? (addresses Standards 4.16 and 6.5)
● Are there any interruptions during the session? (addresses Standards 6.3 and 6.4)
● If any disruptions, interruptions, or other problems occurred, did the TAs deal with the issue appropriately and effectively? (addresses Standards 6.1 and 6.3)
● Was security of test materials maintained at all times? (addresses Standards 6.6 and 6.7)

12 Challenges for CAHSEE Test Administration Investigations
● Contract called for a very limited number of observations: one or two schools per year
● Site visits required coordination with the operational test vendor to avoid visiting a school that was being formally audited
● The cycle for printing test administration manuals precluded rapid incorporation of recommendations

13 Approach to Investigating NAEP Test Administration
● NAEP assessment administrations
–Conducted annually from late January to early March
–National and state samples
–Various content areas, including mathematics, reading, science, U.S. history, civics, geography, and the arts
–Currently transitioning from paper-based to digitally based assessments
● Original approach involved a random sample of schools (varying by, e.g., geographical location, grade, content area, type of assessment) to observe the fidelity of administration procedures completed by NAEP field staff
● Current approach is research-based and uses observations to target specific questions of interest (e.g., To what extent might the use of tablets result in unexpected differential outcomes across schools with differing characteristics? Has decreased time spent on the arts prevented students from enhancing knowledge in this subject?)

14 Topics Investigated by NAEP Data Collection Activities
● Original approach
–Did field staff prepare and arrange the room to ensure a conducive testing environment?
–Did field staff administer NAEP according to the procedures on which they were trained?
–Did field staff maintain the security of testing materials at all times?
● Current approach
–Digitally based assessments
What types of tablet software and/or hardware issues arose in low- and high-SES schools?
What types of student interactions with the tablets occurred in low- and high-SES schools?
Did students in low- and high-SES schools ask the same types of questions regarding testing mode during the administration?
Were administration set-up procedures operationalized differently by jurisdictions at different SES levels?
–Arts paper-based assessment
What types of student questions about arts content and/or associated materials occurred, and with what frequency?
How did students engage with the arts assessment (e.g., actively used materials, answered test questions, appeared distracted, did not engage with materials)?

15 Challenges for NAEP Test Administration Investigations
● Contract includes a limited number of observational site visits each year, making it difficult to identify patterns or emerging issues from a snapshot of administrations
● Investigation of certain topics is limited by the requirement that observations be unobtrusive

16 Next up: Lessons Learned
● PARCC
● CAHSEE
● NAEP

