Distributed Scoring of Regents Exams: NYC 2012 Pilots
Background
In October 2011, the NYS Board of Regents voted that beginning in school year 2012-13, teachers can no longer score their own students’ state assessments.
New York City already meets the new SED requirement for the grades 3-8 ELA and math tests and NYSAA using a regional scoring model.
Regents exams are currently administered and scored locally (i.e., within the testing school).
New York City High Schools
Over 923,000 Regents exams are administered annually* to students in NYC, making it the largest assessment program in the city.
Approx. 250,000 students are enrolled in grades 9-12.
460 high schools:
- 386 high schools (grades 9-12)
- 71 secondary schools (grades 6-12)
- 3 K-12 schools (all grades)
High schools exist in both stand-alone and campus settings:
- Stand-alone: 194 high schools
- Campus: 206 high schools are located in buildings with 3 or more other high schools (the largest campus has 7 high schools in one building)
Approx. 350 middle schools also administer Regents exams in June; over 800 schools administer Regents exams in total.
*Includes January, June, and August administrations.
Scoring Model Options Considered
- Electronic Scoring: Scanned images of student responses are electronically distributed to scorers. Scoring may be completed at central scoring sites and/or at individual schools.
- Regional Scoring: Staff members from multiple schools gather at a central scoring site to collectively score exams.
- Campus Scoring: Staff members located in one building with multiple high schools remain in their building to score exams.
- Exam (Staff) Trading: Exams (or staff) are traded among different schools for scoring.
- In-School Scoring: Multiple scoring committees are formed within a school. Exams are distributed to scoring committees appropriately to ensure teachers do not score their own students’ papers.
Non-Electronic Distributed Scoring Pilots: January 2012
- Campus model: 4 sites, 17 schools
- Regional model: 3 sites, 10 schools
- Total: 2 models, 7 sites, 27 schools, 3 exam titles* (Comprehensive English; Integrated Algebra; Living Environment)
Only schools serving grades 9-12 were selected for the January 2012 pilot.
Approx. 4,000 Comprehensive English, Integrated Algebra, and Living Environment exams were scored across all 7 sites.
*Excludes scoring of alternate language exams.
January 2012 Regents Exam Schedule
- Tues., Jan. 24: Ratings Day; Spring Term Begins
- Wed., Jan. 25: 9:15 a.m. Integrated Algebra, RCT in Global Studies; 1:15 p.m. Comprehensive English
- Thurs., Jan. 26: 9:15 a.m. Living Environment, RCT in U.S. History & Gov’t.; 1:15 p.m. Physics
- Fri., Jan. 27: 9:15 a.m. U.S. History & Gov’t.; 1:15 p.m. Algebra 2/Trigonometry, RCT in Mathematics
- Mon., Jan. 30: 9:15 a.m. Geometry, RCT in Science; 1:15 p.m. Earth Science, Chemistry
- Tues., Jan. 31: 9:15 a.m. Global History & Geography, RCT in Writing; 1:15 p.m. RCT in Reading
Scoring Site Staff Structure
- Site Supervisor
  - Scoring Content Leaders (4 per site)
    - Scorers
  - Organizational Team Leader (1 per site)
    - Organizational Team Members
Distributed Scoring Pilots: January 2012 Implementation
Successes:
- Collaboration across schools to plan for scoring and problem solve
- Site Supervisor role and leadership
- Deeper scorer training and norming
- All scoring completed on or ahead of schedule
Challenges:
- School selection and recruitment
- Logistics and distribution of exams
- Norming training across sites and models
- Scorer identification and assignment
- Proctoring vs. scoring needs
- Specialized expertise (e.g., scoring alternate languages, upper-level science and math scoring)
Scope and Plans for Scale-Up
- January 2012*: 27 schools, 7 sites, 3 exams
- June 2012*: 164 schools, 26 sites, 3 or 4 exams**
- August 2012: TBD
- January 2013: 460 schools***, sites TBD, 10 exams
- June 2013: 800+ schools****, sites TBD, 10 exams
*In January and June 2012, only English language versions of exams are included.
**Depends on whether the school is participating in an electronic or non-electronic model.
***Includes all 9-12, 6-12, and K-12 schools.
****Includes all schools administering Regents exams.
Goals of an Electronic Scoring Model
Projected benefits of electronic scoring include:
- Increased accuracy and consistency of scoring student responses
- A faster scoring rate (compared to a paper-and-pencil model), which is expected to reduce the impact on schools
- Data on Regents exam scoring rates by exam
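Scoring-rate data feeds directly into staffing plans for scoring sites. As a minimal, hypothetical sketch of that arithmetic (not a tool or rate used in the pilots), the Python snippet below converts an exam volume into scorer-days; papers_per_hour and hours_per_day are assumed placeholder values, and the 4,000-exam volume is the approximate total from the January 2012 pilot sites.

```python
# Back-of-the-envelope scorer staffing estimate for a scoring site.
# The 4,000-exam volume is the approximate January 2012 pilot total;
# the scoring rate and scoring-day length are hypothetical placeholders.

def scorer_days_needed(num_exams: int, papers_per_hour: float, hours_per_day: float = 6.0) -> float:
    """Total scorer-days required to score num_exams papers at the given rate."""
    scorer_hours = num_exams / papers_per_hour
    return scorer_hours / hours_per_day

if __name__ == "__main__":
    total = scorer_days_needed(num_exams=4_000, papers_per_hour=8.0)  # assumed rate
    print(f"Scorer-days needed: {total:.1f}")               # 83.3 scorer-days
    print(f"Scorers to finish in 2 days: {total / 2:.0f}")  # ~42 scorers
```

In practice, rates vary by exam title and by scoring method, which is why per-exam scoring-rate data is listed above as one of the goals.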
Regents Exam Scoring
An increasing percentage of Regents exam scoring will use a distributed (electronic or non-electronic) method ahead of a planned move to computer-based testing beginning in the 2014-15 school year.
[Chart: planned share of distributed Regents exam scoring, by year*]
*Refers to calendar year, not school year.
Distributed Regents Scoring Discussion Questions
- How has Regents scoring typically been organized in your districts?
- How are your districts planning to meet the new SED requirement? For Regents exams? NYSESLAT? Science?
- What types of assistance are you providing to districts?
- How are you helping districts balance the simultaneous need for proctors and scorers?
- How do your districts handle the scoring of alternate language or higher-level science exams (e.g., Physics)?
- What scoring rates do your districts use for each exam for non-electronic and electronic scoring methods?
- Will you be monitoring scoring? What level of oversight are you planning to use?
- How has the new Regents calendar affected your approach to developing a solution?
APPENDIX
Scoring Site Management Responsibilities
- Site Supervisor
  - Eligible staff: Appointed supervisor or Education Administrator (e.g., assistant principal)
  - Responsibilities: Manage the scoring site and supervise all activities related to the scoring facility; be on-site for the duration of scoring
  - Selection process: Selected by a committee of participating schools’ principals at the information session
- Scoring Content Leader (1 per subject)
  - Eligible staff: Content expert (e.g., subject-area AP, department chair); supervisory license preferred
  - Responsibilities: Train scorers and oversee scoring; be on-site for the duration of scoring for the particular subject
  - Selection process: Providing school is determined by a committee of participating schools’ principals at the information session
- Organizational Team (size equivalent to the number of schools in the site)
  - Organizational Team Lead (1 per site)
    - Eligible staff: Staff with ATS* access; experience with Regents scanning preferred
    - Responsibilities: Scan all answer documents for the scoring site; manage error correction with oversight of the site supervisor; provide additional site logistics support
  - Organizational Team Members
    - Responsibilities: Assist scoring site supervisors and content leaders as needed to check in, distribute, and check out test materials
*ATS is the information system used to capture and process results from scanned Regents exams.
School Selection: January 2012
Factors that contributed to school selection and matching for January 2012 distributed Regents scoring included:
Selection:
- Serve only grades 9-12
- Administer the exam titles included in the pilot, in sufficient quantity
- Interest in piloting a distributed scoring model
Matching:
- Located in close proximity to other high schools administering the same exams
- Expected to administer a quantity of exams in January 2012 (based on order data for that administration) roughly equivalent to other nearby candidate schools