Learning about the Item Review Process: An Overview


1 Learning about the Item Review Process: An Overview

2 Item Development Timeline
Contracts for the first phase of item development were awarded to ETS and Pearson in 2012. Phase I of item development began in the fall of 2012 and was completed in late summer 2013. Phase II began in October 2013 and will run through fall 2014.

3 Who are PARCC’s item reviewers?
PARCC item reviewers come from PARCC governing and participating states
K-16 educators, state department of education staff, and external experts
Deep content expertise
Experience with students from various backgrounds
Many have participated in item reviews for their own states
Experience in various geographic and educational settings

4 PARCC Item Review Committees
Various review teams meet, both in person and virtually, to evaluate the items over an extended period of time. Only items that are approved by these teams of reviewers will appear on the PARCC summative assessments.

Core Leadership Groups
Membership: PARCC state DOE staff and HE faculty; approx. 60 members in each content area
Charge: Review all test items developed for the PARCC summative assessments for suitability of content, age-appropriateness, and alignment to CCSS. Approve recommended revisions to items.

State Educator Reviewers
Membership: K-12 LEA staff and HE faculty; approx. 80 members in each content area, plus 45 passage reviewers
Charge: Ensure that items are age-appropriate and are measuring the content of the Common Core State Standards for a given grade level.

Bias & Sensitivity Reviewers
Membership: Citizens and educators from various backgrounds; approx. 50 members in each content area
Charge: Ensure that items and passages are fair, unbiased, and age-appropriate for a given grade level.

Core Leadership groups are content area-based: there is one for ELA and one for Math.
State Educator Committees include: ELA Item (about 80 members), ELA Passage (about 45 members), and Math Item (about 80 members).
Bias & Sensitivity Committees are also content area-based.
Operational Working Groups also play a large role in item review (next slide).

5 Operational Working Groups
PARCC Operational Working Groups (OWGs) involved in item review:
ELA/Literacy
Mathematics
Accessibility, Accommodations, and Fairness (AAF)
OWG members are content experts from state departments of education.
After each round of item review, OWGs participate in "item reconciliation" to review suggested revisions from the item review committees.
Reconciliation following bias & sensitivity review includes members from the content OWGs and the AAF OWG.

6 PARCC Item Review Process
Life cycle of an item:
Items Developed → Core Leadership Review → State Educator Review → Bias & Sensitivity Review → Field Test → Item Bank
OWG Reconciliation follows each round of review.
Before items go into the item bank, they also undergo review and testing of their embedded technology.

7 PARCC Item Review Meetings
PARCC item review meetings are conducted both in person and virtually, and last 3 to 5 days.
Meetings consist of independent review of items followed by group discussion.
ELA teams are organized by grade level (grades 9/10 are combined).
Math and Bias & Sensitivity teams are organized by grade band.
Core Leadership groups have 4-5 reviewers per team; State Educator and Bias & Sensitivity groups have 6-8 reviewers per team.
Item review teams are facilitated by contractor staff.
Each team includes an OWG member.

8 ELA Review Considerations/Criteria
Does the item allow the student to demonstrate the intended evidence statement(s) and the CCSS to be measured?
Is the wording of the item clear, concise, and appropriate for the intended grade level?
Does the item provide sufficient information and direction for the student to respond completely?
Is the item free from internal clueing and miscues?
Do the graphics and stimuli included as part of the item accurately and appropriately represent the applicable content knowledge?
Are any graphics included as part of the item clear and appropriate for the intended grade level?
If the item has a technology-based stimulus or requires a technology-based response, is the technology design effective and grade-appropriate?
Is the scoring guide/rubric clear, correct, and aligned with the expectations for performance that are expressed in the item or task?
If the item is part of a PBA task, does it contribute to the focus and coherence of the task model?
This slide lists general item review guidelines for ELA reviewers.

9 Mathematics Review Considerations/Criteria
Does the task measure the intended evidence statement(s)?
Does the task measure the intended mathematical practice(s)?
Is the task mathematically correct and free from errors?
Is the wording of the task clear, concise, and grade-level appropriate?
Are the graphics/stimuli in the task clear, accurate, appropriate for the task, and appropriate for the grade?
Does each prompt, along with all associated graphics/stimuli, contribute to the quality of the task?
Is the scoring guide/rubric clear, correct, and aligned with the expectations for performance that are expressed in the task?
These are general review criteria for math reviewers.

10 Bias & Sensitivity Review Considerations/Criteria
Does the item disadvantage any population (gender, race, ethnicity, language, religion, socioeconomic status, disability, or geographic region) for non-educationally relevant reasons?
Does the item contain controversial or emotionally charged subject matter that is not supported by the Common Core State Standards?
Is the item potentially offensive, demeaning, insensitive, or negative toward any population?
Does the item depict any population in a stereotypical manner?
These are general review criteria for bias reviewers.

11 Questions?

