Xumei Fan, Ashlee A. Lewis & Amanda Moon, Office of Program Evaluation

1 Examining the Scoring Procedures of Performance Tasks across a Variety of Content Areas and Rubrics
Xumei Fan, Ashlee A. Lewis & Amanda Moon, Office of Program Evaluation University of South Carolina

2 Today, we will…
- describe the procedures used to train raters to score tasks based on Literacy Design Collaborative rubrics for the Common Assignment Study
- describe the challenges we faced in developing training procedures and how we overcame them

3 Common Assignment Study (CAS)
- Three-year effort led by The Colorado Education Initiative and The Fund for Transforming Education in Kentucky
- Financial support from the Bill and Melinda Gates Foundation
- Because of the Common Core, teachers must be enabled to work and learn together to strengthen curriculum design, classroom practices, and student work products.
- Participating teachers develop and teach two instructional units per year that exemplify the content knowledge and skills embedded in Colorado and Kentucky curriculum standards (based upon the Common Core)
- CAS units: modules from the Literacy Design Collaborative (LDC), an instructional approach being used by districts in both states
- Common performance tasks for students

4 Common Assignment Study: Organizational Roles
- Bill and Melinda Gates Foundation (funder)
- Teachers in Kentucky and Colorado (content and assessment creators)
- Stanford Center for Assessment, Learning, and Equity (SCALE) (content-specific technical assistance/unit design)
- Westat (technical assistance and support; organizing student papers)
- Center for Assessment (analyze student work to answer research questions)
- Office of Program Evaluation at the University of South Carolina (digitize student work, create rating system, train raters to score student work)

5 We (OPE) were tasked with:
- Creating an online scoring system
- Scanning student work and loading it into the system
- Determining benchmark scores for student samples, for training and for monitoring the scoring process
- Creating annotated student samples to train raters
- Preparing raters to score student work in English language arts (ELA), history, and science
- Recruiting raters and overseeing scoring progress


7 Developing Training/Scoring Procedures: Challenges
- Inconsistent materials received from stakeholders
- Changes in units and tasks across years
- Inconsistent messages about rubric interpretation
- Interpreting rubrics for tasks that we did not develop
- Limited external support for rubric interpretation and scoring

8 Purpose of Rater Bias Training:
To inform raters of potential biases common among raters of student writing.

9 RATER BIAS TRAINING
Type of bias (the tendency for a rater to…):
- Leniency/Severity effect: score too easily or too harshly
- Central tendency: assign scores primarily around the scale midpoint
- Clashing standards: score lower because his or her personal grading standards conflict with standards expressed in the rubric
- Fatigue: allow scores to be affected by being tired
- Handwriting: allow handwriting to influence his or her scores
- Language: score according to language usage when other dimensions are the focus of the rating
- Length: score lengthy responses higher
- Repetition factor: lower a score because he or she has read about a topic or viewed a response repeatedly
- Skimming: scan the document rather than read the whole response
- Sympathy score: allow the content to appeal to his or her emotions, or allow a sparse performance to elicit a desire to reward effort
- Test-to-test carryover: score lower a response that meets the pre-stated expectations but appears somewhat lackluster compared to the exemplary responses that preceded it
- Double dinging: score the same aspect of the student's paper twice

10 Example Training Slide: Central Tendency
A rater's tendency to assign scores primarily around the scale midpoint.

11 CROSS-CURRICULAR ARGUMENTATIVE and INFORMATIONAL RUBRICS
- The number of rubric elements and proficiency levels grew and collapsed over the course of the project, informed by classroom experience and collaborative calibration.
- Rubrics are mostly analytic, with some being holistic.
- Rubrics apply to all three content areas: ELA, science, and social studies.
- Each scoring rubric consists of four to seven scoring elements, each scored on a rating scale from 1 to 4, with 1 representing student work that has "Not Yet" met expectations and 4 representing student work that is considered "Advanced," having exceeded expectations.
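The scoring scheme just described (four to seven elements, each rated on the same 1 to 4 proficiency scale) can be sketched as a small data structure. This is a minimal illustration only: the proficiency labels come from the rubric, but the function and variable names are hypothetical and are not part of the actual OPE online scoring system.

```python
# Minimal sketch of the LDC-style scoring scheme described above.
# The 1-4 labels come from the rubric; everything else is illustrative.

PROFICIENCY_LABELS = {
    1: "Not Yet",
    2: "Approaches Expectations",
    3: "Meets Expectations",
    4: "Advanced",
}

def score_paper(element_scores):
    """Validate a dict of {element name: score} and attach proficiency labels."""
    # A rubric has four to seven scoring elements.
    if not 4 <= len(element_scores) <= 7:
        raise ValueError("A rubric consists of four to seven scoring elements.")
    # Every element is scored on the same 1-4 scale.
    for element, score in element_scores.items():
        if score not in PROFICIENCY_LABELS:
            raise ValueError(f"{element}: scores must be integers from 1 to 4.")
    return {e: (s, PROFICIENCY_LABELS[s]) for e, s in element_scores.items()}

# Example using the first four elements of the argumentative rubric:
scored = score_paper({
    "Focus": 3,
    "Controlling Idea": 2,
    "Reading/Research": 3,
    "Development": 4,
})
```

A record like this makes it easy to check rater agreement element by element rather than only on a total score.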

12 Rubric Elements 1 through 4 of 7
Scale: 1 = Not Yet, 2 = Approaches Expectations, 3 = Meets Expectations, 4 = Advanced

Focus
1: Attempts to address prompt but lacks focus or is off task.
2: Addresses prompt appropriately and establishes a position, but focus is uneven. D: Addresses additional demands superficially.
3: Addresses prompt appropriately and maintains a clear, steady focus. Provides a generally convincing position.
4: Addresses all aspects of prompt appropriately with a consistently strong focus and convincing position.

Controlling Idea
1: Attempts to establish a claim but lacks a clear purpose.
2: Establishes a claim.
3: Establishes a credible claim.
4: Establishes and maintains a substantive and credible claim or proposal.

Reading/Research
1: Attempts to reference reading materials to develop response but lacks connections or relevance to the purpose of the prompt.
2: Presents information from reading materials relevant to the purpose of the prompt with minor lapses in accuracy or completeness.
3: Accurately presents details from reading materials relevant to the purpose of the prompt to develop argument or claim.
4: Accurately and effectively presents important details from reading materials to develop argument or claim.

Development
1: Attempts to provide details in response to the prompt but lacks sufficient development or relevance to the purpose of the prompt.
2: Presents appropriate details to support and develop the focus, controlling idea, or claim, with minor lapses in the reasoning, examples, or explanations.
3: Presents appropriate and sufficient details to support and develop the focus, controlling idea, or claim.
4: Presents thorough and detailed information to effectively support and develop the focus, controlling idea, or claim.

The scoring rubric for this task consists of seven (7) scoring elements, each scored on a rating scale from 1 to 4, with 1 representing student work that has "Not Yet" met expectations and 4 representing student work that is considered "Advanced," having exceeded expectations. The scoring elements for Task 80 are: Focus, Controlling Idea, Reading/Research, and Development.

13 Rubric Elements 5 through 7 of 7: Organization, Conventions, and Content Understanding
Scale: 1 = Not Yet, 2 = Approaches Expectations, 3 = Meets Expectations, 4 = Advanced

Organization
1: Attempts to organize ideas but lacks control of structure.
2: Uses an appropriate organizational structure for development of reasoning and logic, with minor lapses in structure and/or coherence.
3: Maintains an appropriate organizational structure to address specific requirements of the prompt. Structure reveals the reasoning and logic of the argument.
4: Maintains an organizational structure that intentionally and effectively enhances the presentation of information as required by the specific prompt. Structure enhances development of the reasoning and logic of the argument.

Conventions
1: Attempts to demonstrate standard English conventions but lacks cohesion and control of grammar, usage, and mechanics. Sources are used without citation.
2: Demonstrates an uneven command of standard English conventions and cohesion. Uses language and tone with some inaccurate, inappropriate, or uneven features. Inconsistently cites sources.
3: Demonstrates a command of standard English conventions and cohesion with few errors. Response includes language and tone appropriate to the audience, purpose, and specific requirements of the prompt. Cites sources using appropriate format with only minor errors.
4: Demonstrates and maintains a well-developed command of standard English conventions and cohesion with few errors. Response includes language and tone consistently appropriate to the audience, purpose, and specific requirements of the prompt. Consistently cites sources using appropriate format.

Content Understanding
1: Attempts to include disciplinary content in argument, but understanding of content is weak; content is irrelevant, inappropriate, or inaccurate.
2: Briefly notes disciplinary content relevant to the prompt; shows basic or uneven understanding of content; minor errors in explanation.
3: Accurately presents disciplinary content relevant to the prompt with sufficient explanations that demonstrate understanding.
4: Integrates relevant and accurate disciplinary content with thorough explanations that demonstrate in-depth understanding.

14 Example Training Slide: Rubric Element 3: Reading/Research
1: Attempts to reference reading materials to develop response but lacks connections or relevance to the purpose of the prompt.
2: Presents information from reading materials relevant to the purpose of the prompt with minor lapses in accuracy or completeness.
3: Accurately presents details from reading materials relevant to the purpose of the prompt to develop argument or claim.
4: Accurately and effectively presents important details from reading materials to develop argument or claim.

Key questions:
1. Do students use multiple readings as quotes to provide important, relevant evidence to support and develop the claim?
2. Did the reading/research quotes provide relevant evidence to support and develop the claim?
3. Did the student discuss the credibility of the author or the worthiness of the reading source?

For Reading/Research, the student mainly relies on quotes and in-text citations. The quotes should be relevant to the task and provide the most important evidence possible in supporting the focus and controlling idea. For this task, writing should not rely on personal anecdotes but should be supported by information obtained from the reading.

15 Training Manuals
Rubric training manuals:
- LDC Informational Rubric Training Manual
- LDC Argumentative Rubric Training Manual
Task training manuals:
- ELA: 9 manuals
- History: 12 manuals
- Science: 9 manuals

16 ELA Task Training Manuals

High School ELA:
Manual | Task ID | Unit Name | Year | Student papers | Rubric Type
Words Matter Creative Project (Task 82) | Task 82 | 1 (Words Matter) | 1 | 108 | LDC 7
12 Angry Men (Task 54) | Task 54 | | | 140 |
Shooting an Elephant (Task 112) | Task 112 | | 2 | 240 |
Op-Ed (Task 85/Task 155) | Task 85, Task 155 | 2 (Period is Pissed) | | 316 |

Middle School ELA:
Manual | Task ID | Unit Name | Year | Student papers | Rubric Type
Should Mom be Your Facebook Friend? (Task 80) | Task 80 | 1 (CBOR) | 1 | 175 | LDC 7
Should Schools Monitor Student's Social Media Accounts? (Task 116) | Task 116 | | 2 | 133 | LDC 4
Children have the Right to… (Task 52/Task 117) | Task 52, Task 117 | | | 359 |
Gary Soto (Growing Up is Hard to do) (Task 101/Task 120) | Task 101, Task 120 | 2 (Lit Analysis) | | 286 |
The Power of Hello (Task 100/Task 158) | Task 100, Task 158 | | | 333 | Holistic

17 History Task Training Manuals

High School History:
Manual | Task ID | Unit Name | Student papers | Rubric Type
Andrew Carnegie: Robber Baron or Captain of Industry (Task 123) | Task 123 | 1 (Industrial) | 272 | LDC 4
McCarthy (Task 147) | Task 147 | 2 (Cold War) | 222 | LDC 5
Andrew Carnegie: Robber Baron or Captain of Industry (Task 71) | Task 71 | | 70 | LDC 2
McCarthy (Task 95) | Task 95 | | 82 | LDC 7
The Cost of Progress (Task 70) | Task 70 | | 120 | LDC 6
Cartoon Analysis (Industrialization/McCarthyism) | Task 122, Task 146, Task 72, Task 94 | | 700 | Holistic

Middle School History:
Manual | Task ID | Unit Name | Student papers | Rubric Type
Jamestown (Task 124) | Task 124 | 1 (Colonization) | 182 | LDC 7
Summative - Which colony would you want to live in? (Task 126) | Task 126 | | | Holistic
Jamestown (Task 64) | Task 64 | | 163 |
Similarities/Differences Among Colonies (Task 68) | Task 68 | | 140 | LDC 6
Mexican American War (Task 150/Task 90) | Task 150, Task 90 | 2 (Westward Expansion) | 217 |
Texas Independence – Extended Response (Task 149/Task 108) | Task 108 (Q4), Task 149 | | 206 |

18 Science Task Training Manuals

High School Science:
Manual | Task ID | Unit Name | Year | Student papers | Rubric Type
Are You What You Eat? (Task 127) | Task 127 | 1 (Biochem) | 2 | 208 | LDC 7
Human Impacts on Biodiversity (Task 162) | Task 162 | 2 (Ecology) | | 186 |
Human Impacts on Biodiversity (Task 98) | Task 98 | | 1 | 83 | LDC 5

Middle School Science:
Manual | Task ID | Unit Name | Year | Student papers | Rubric Type
Genetic testing/Designer Babies (Task 133) | Task 133 | 1 (Genetics) | 2 | 102 | LDC 4
Genetic testing/Designer Babies (Task 58) | Task 58 | | 1 | 185 | LDC 7
Ecosystem (Task 170) | Task 170 | 2 (Life Science) | | 80 |
Ecosystem (Task 106) | Task 106 | | | 44 |
Roller Coaster (Task 137/166) | Task 137/166 | 2 (Phys Science) | | 214 | LDC 5
Roller Coaster (Task 103) | Task 103 | | | 48 |

19 A rubric training manual looks like:
- LDC Argumentative Rubric Training Manual.pdf
- LDC Informational Rubric Training Manual.pdf
A task training manual looks like:
- Task training manual_HS Science.pdf
- Task training manual_MS ELA.pdf
- Task training manual_MS History.pdf

20 When developing rater training and scoring procedures:
- Develop clear lines of communication
- Maintain consistency in processes across all content areas
- Ensure involvement with all levels of task creation, rubric creation, and rubric interpretation
- Negotiate all roles and expectations in advance as much as possible

21 Thank you! Comments? Questions? Suggestions? lewisaa2@mailbox.sc.edu

