1
Setting Cut Scores on Alaska Measures of Progress Presentation to Alaska Superintendents Marianne Perie, AAI July 27, 2015
2
Cut scores are set based on ALDs
Achievement Level Descriptors (ALDs) were drafted by a committee of 35 Alaska educators in September 2014.
- ALDs define what students should know and be able to do at each of four achievement levels.
- ALDs are written to be fully aligned to Alaska standards and are specific to each grade and subject.
- Content standards define what students should know and be able to do; achievement level descriptors articulate how much they should know and be able to do at each achievement level.
3
Standard Setting
Using those ALDs and a research-based standard-setting methodology, panels of Alaska educators recommended three cut scores on each of 16 assessments (grades 3–10 in ELA and mathematics).
[Figure: the AMP scale with three cut scores dividing it into Level 1, Level 2, Level 3, and Level 4]
4
“Just Barely” Performance
[Figure: the borderline between Partially Meets Standards and Meets Standards, illustrating “just barely” performance]
5
Bookmark Method
- Research-based procedure developed in the 1990s
- Ample precedent: used in more than thirty states
- Used previously in Alaska
- Test content is structured so that increasing knowledge and competencies can be evaluated directly against academic standards
6
The Task
Using an ordered item set (one form's worth of items ordered by statistical difficulty), content experts find the location in the item set that separates groups of examinees into categories, and they literally place a bookmark at that location in the ordered item set. Typically, panelists are told to find the first item in the series that two out of three students at the borderline of the achievement level would not be able to answer correctly. This placement task takes place over three rounds of discussion and deliberation.
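As a rough sketch of the mechanics, a bookmark placement is commonly translated into a cut score by taking the scale location of the last item the borderline student is expected to answer correctly two times out of three (the "RP67" convention). The item locations and bookmark position below are invented for illustration; they are not AMP values.

```python
# Hedged sketch: mapping a Bookmark placement to a cut score, assuming
# each item in the ordered item booklet carries an RP67 scale location
# (the score at which a student has a 2-in-3 chance of answering it
# correctly). All numbers below are invented.

# Ordered item booklet: items sorted easiest to hardest by RP67 location.
rp67_locations = [198, 205, 211, 219, 225, 233, 240, 248, 256, 263]

def cut_score(bookmark_page: int) -> int:
    """A bookmark placed after page k means the borderline student can
    answer items 1..k with 2-in-3 probability; the cut score is the
    RP67 location of item k (pages are 1-indexed)."""
    return rp67_locations[bookmark_page - 1]

# A panelist placing a bookmark after the 5th item:
print(cut_score(5))  # -> 225
```

The key design point is that panelists reason about items, not scores; the psychometric model does the conversion from a page in the booklet to a point on the score scale.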
7
Ordered Item Booklet
8
Major Steps in the Process
1. Panelists take the test as though they were students.
2. Panelists discuss in small groups (tables) what knowledge, skills, and abilities are involved in each item.
3. Panelists describe what it takes to just barely make it into each achievement level (using the ALDs to write borderline descriptors).
4. Panelists work through the OIB in difficulty order, making notes to help later. Key questions include: What does each item measure? What makes each item more difficult than the one before it?
9
Major Steps (cont.)
Round 1: Panelists individually review the ordered item booklet and place a bookmark after the last item they think two out of three borderline students can answer correctly. They repeat this for each achievement level.
Round 2: Panelists discuss the placement of the Round 1 bookmarks with others at their table, focusing on the items between the lowest and highest bookmarks at the table. Panelists can move any or all of their bookmarks if they think it is appropriate when making Round 2 judgments.
Round 3: Panelists discuss the placement of the Round 2 bookmarks with others in the room, focusing on items between the lowest and highest bookmarks selected. Panelists discuss the percentage of students expected to fall in each achievement level ("impact") using the median bookmark from the room. Panelists can again move any or all of their bookmarks.
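The round-by-round summary described above reduces to simple order statistics: each panelist's bookmark yields a provisional cut score, and the table or room is characterized by its low, high, and median. The panelist values below are invented for illustration.

```python
# Illustrative sketch of the round-level summary: individual cut scores
# (derived from bookmark placements) summarized by low/high/median.
# Panelist values are invented, not AMP data.
from statistics import median

round1 = [215, 220, 225, 230, 240]   # one table's Round 1 cut scores
round2 = [222, 224, 226, 228, 232]   # same panelists after discussion

for label, cuts in (("Round 1", round1), ("Round 2", round2)):
    print(label, "low:", min(cuts), "high:", max(cuts),
          "median:", median(cuts))
```

Note how discussion typically narrows the spread between rounds (here, a range of 25 points shrinks to 10) while the median moves only slightly.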
10
Impact Data
After Round 2, panelists were given "impact data": the percentage distribution of students across the achievement levels, as determined by
- the median cut scores after Round 2, and
- data from the 2015 assessments.
In addition, panelists were given the following information to help put those numbers in context:
- The percentage of students scoring in the four NAEP performance levels in 2013 at grades 4 and 8
- The percentage of students taking the ACT and/or SAT in 2014, and the percentage meeting the ACT/SAT college-ready benchmarks (high school cut scores only)
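Mechanically, impact data is just a histogram of student scale scores against the three cut scores. The sketch below uses invented scores and cut points; the actual impact data came from the 2015 assessment results.

```python
# Sketch of computing "impact data": the percentage of students landing
# in each of the four levels given three cut scores. All values invented.
from bisect import bisect_right

def impact(scores, cuts):
    """Percent of scores in each of len(cuts)+1 levels; a score equal
    to a cut lands in the higher level (the cut is the level's floor)."""
    counts = [0] * (len(cuts) + 1)
    for s in scores:
        counts[bisect_right(cuts, s)] += 1
    return [round(100 * c / len(scores), 1) for c in counts]

cuts = [225, 256, 321]                    # hypothetical Level 2/3/4 floors
scores = [210, 230, 250, 260, 300, 330, 220, 270]
print(impact(scores, cuts))               # -> [25.0, 25.0, 37.5, 12.5]
```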
11
Articulation
About three members from each panel remained for a fourth day to review all results together. The goal was to examine the progression of cut scores from grades 3 through 10 and determine whether any grade's cut scores looked out of line (exceptionally rigorous or lenient). The articulation panel made the final recommendation to EED.
12
Who participated?
- 126 educators from across the state participated in the 3-day standard setting; 50 of them remained for the articulation portion.
- Panelists collectively represented 51 Alaska communities, 31 districts, and 3 university campuses.
- Educators were recruited to represent a range of teaching experience: number of years, populations taught, and geography.
- Recruitment focused on ELA or math expertise and included experts on special populations.
13
State Representation
14
Panel Organization
For each subject (ELA and mathematics):
- 8–9 educators from each of grades 3–8
- 12 high school educators
- 2 higher education faculty
- 2 career and technical education representatives
[Figure: panel groupings with panelist counts: Grade 3 (13), Grade 4 (27), Grade 5 (14), Grade 6 (13), Grade 7 (25), Grade 8 (14), Grade 9 (16), Grade 10 (14)]
15
Sample Table of Rounds

                            Round 1   Round 2   Round 3
Level 2 (partially meets)
  Low                       215       220       222
  High                      240       232       230
  Median                    225       226
  SEJ                       4.2       3.1       2.8
Level 3 (meets)
  Low                       240       248       252
  High                      284       264       260
  Median                    256
  SEJ                       5.3       3.8       1.8
Level 4 (exceeds)
  Low                       300       315       318
  High                      340       332
  Median                    321       324       325
  SEJ                       6.3       4.8       3.1
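The Low/High/Median/SEJ statistics in a table like this can be computed directly from the panelists' individual cut scores. The standard error of judgment (SEJ) is commonly computed as the standard deviation of the panelists' cut scores divided by the square root of the panel size; the panelist values below are invented.

```python
# Sketch of the round summary statistics, assuming SEJ = sd / sqrt(n)
# (a common definition; panelist values invented for illustration).
from math import sqrt
from statistics import median, stdev

panelist_cuts = [215, 220, 222, 225, 228, 230, 235, 240]

low, high = min(panelist_cuts), max(panelist_cuts)
med = median(panelist_cuts)
sej = stdev(panelist_cuts) / sqrt(len(panelist_cuts))
print(low, high, med, round(sej, 1))   # -> 215 240 226.5 2.9
```

A shrinking SEJ across rounds, as in the sample table, indicates growing panelist convergence on a cut score.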
16
Sample Table of Results: Mathematics

        Cut 1/2                  Cut 2/3                  Cut 3/4
Grade   Min  Median  Max  SEM    Min  Median  Max  SEM    Min  Median  Max  SEM
3       230  234     237  1.2    268  274     278  1.1    292  300     304  1.8
4       ##   ##      ##   #.#    ##   ##      ##   #.#    ##   ##      ##   #.#
5       ##   ##      ##   #.#    ##   ##      ##   #.#    ##   ##      ##   #.#
6       ##   ##      ##   #.#    ##   ##      ##   #.#    ##   ##      ##   #.#
7       ##   ##      ##   #.#    ##   ##      ##   #.#    ##   ##      ##   #.#
8       ##   ##      ##   #.#    ##   ##      ##   #.#    ##   ##      ##   #.#
9       ##   ##      ##   #.#    ##   ##      ##   #.#    ##   ##      ##   #.#
10      ##   ##      ##   #.#    ##   ##      ##   #.#    ##   ##      ##   #.#
17
Sample Impact Data: Mathematics

Grade   Level 1   Level 2   Level 3   Level 4
3       22.5      47.4      19.9      10.2
4       ##.#      ##.#      ##.#      ##.#
5       ##.#      ##.#      ##.#      ##.#
6       ##.#      ##.#      ##.#      ##.#
7       ##.#      ##.#      ##.#      ##.#
8       ##.#      ##.#      ##.#      ##.#
9       ##.#      ##.#      ##.#      ##.#
10      ##.#      ##.#      ##.#      ##.#
18
Sample articulation data
19
Sample Impact Data by Reporting Groups

Percentage of 3rd-grade students classified as Level 3 or higher, for cut scores within +/- 2 SEM of the recommendation:

Cut score   Total   Alaska Native   Asian   Black   Hispanic   White   ELL   SWD
204         40%     24%             52%     26%     22%        48%     21%   16%
203         42%     ...
201         45%     ...
200         46%     ...
198*        50%     ...
197         53%     ...
196         55%     ...
194         56%     ...
192         58%     ...

* Recommended cut score
20
What next?
EED will consider all of the previously mentioned tables of data.
- Honoring the work done by educators is the first priority.
- Considering Type I/Type II errors is important. Every assessment has some measurement error associated with it: student scores can be +/- 3 points of their reported score. Is it better to place a student in Level 3 when their performance is actually in Level 2, or to place a student in Level 2 when their performance is actually in Level 3? We are going to make one of those errors; which does the least harm?
- Revisiting all panelist recommendations and considering the range, including frequent recommendations above or below the median, can also inform the final recommendation.
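The Type I/Type II tradeoff above can be made concrete: any student whose reported score sits within the measurement-error band around the cut could truly belong on either side, so wherever the cut is placed, some of those students will be misclassified one way or the other. The cut score and student scores below are invented for illustration.

```python
# Sketch of the misclassification tradeoff, assuming a reported score
# can be +/- 3 points from a student's true performance. All numbers
# below are invented, not AMP values.

cut = 256    # hypothetical "Meets Standards" cut score
band = 3     # reported scores can be +/- 3 points from true performance

scores = [248, 252, 254, 255, 256, 257, 259, 263, 270]

# A reported score s could mask a true score below the cut when
# s < cut + band, and a true score at/above the cut when s >= cut - band.
# Students in [cut - band, cut + band) are therefore at risk of either
# a Type I or a Type II classification error.
at_risk = [s for s in scores if cut - band <= s < cut + band]
print(len(at_risk), "of", len(scores), "students fall inside the error band")
```

Shifting the cut within this band does not eliminate the error; it only changes which kind of error is made more often, which is exactly the "which does the least harm?" question.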
21
Internal and Public Review
- The Alaska Technical Advisory Committee will review the process for determining the recommended cut scores on August 7, either approving the process or recommending other considerations.
- Commissioner Hanley will present the recommended cut scores to the State Board of Education on August 24. The cut scores go out for public comment at that time.
- The State Board will vote to approve or reject the recommended cut scores on October 9.
- Reports will be released in mid- to late October, after first being vetted with district superintendents.
22
Questions?
23
Timeline of Events & Support for AMP Results
- August 24: State Board meeting to put proposed cut scores out for public comment
- August 24 – September 24: Anticipated public comment period
- September 23 – 26: ASA Fall Meeting in King Salmon. EED/AAI will provide professional development on how to read district and school AMP score reports.
- October 9: State Board meeting to approve proposed cut scores
24
Timeline of Events & Support for AMP Results (continued)
- October 13: Superintendent Communications Workshop to assist with PR/messaging related to district and school results
- October 16: District, school, and student reports available in Educator Portal
- October 17 – 19: Principals' Conference. A full-day pre-conference will provide training to principals on how to access and read AMP reports, along with tips for messaging/PR regarding the results.
- October 19: Public release of AMP results
25
For questions/support:
- Margaret MacKinnon, Director of Assessment & Accountability: Margaret.mackinnon@Alaska.gov, (907) 465-2970
- Elizabeth Davis, Assessment Administrator: Elizabeth.davis@Alaska.gov, (907) 465-8431
- Brian Laurent, Data Management Supervisor: Brian.Laurent@Alaska.gov, (907) 465-8418
- Eric Fry, Public Information Officer: Eric.fry@Alaska.gov, (907) 465-2851