1 Getting More out of Rubric Ratings with Inter-Rater Analysis David Eubanks Furman University

2 Project Home: github.com/stanislavzza/Inter-Rater-Facets

3 Case 1: Wine Tasting

4 Graphing Agreement. The table and graph show the expected outcome of two “raters” who are flipping coins to decide between “good” and “poor” outcomes for the rated works.
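
For intuition, here is a minimal Python sketch of that coin-flip baseline (an illustrative simulation, not code from the Inter-Rater-Facets package): two simulated raters rate each work by independent fair coin flips, and the observed agreement settles near the 50% expected by chance.

```python
import random

def simulate_coin_raters(n_works: int = 10_000, seed: int = 1) -> float:
    """Two 'raters' independently flip a fair coin ('good'/'poor') for each work.
    Returns the fraction of works on which they give the same rating."""
    rng = random.Random(seed)
    matches = sum(
        rng.choice(["good", "poor"]) == rng.choice(["good", "poor"])
        for _ in range(n_works)
    )
    return matches / n_works

print(f"Observed agreement: {simulate_coin_raters():.3f}")  # close to 0.50
```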

6 Key: X = No medal, 2 = Bronze medal, 3 = Silver medal, 4 = Gold medal

7 Case 2: Rubric Ratings of Student Writing. Writing “Arrangement” (1 = Excellent, 5 = Unacceptable)

8 Statistical Significance (p-value to reject a random distribution): * < .05, ** < .01, *** < .001. Key: 1 = Excellent, 5 = Unacceptable
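
One hedged sketch of how such a p-value could be obtained (the package's actual test may differ, and treating rating pairs as fully independent is a simplification): a one-sided binomial test of the observed number of matching pairs against the chance match rate.

```python
from scipy.stats import binomtest

def agreement_p_value(matching_pairs: int, total_pairs: int, chance_match: float) -> float:
    """One-sided p-value for the null hypothesis that rater pairs match
    only at the chance rate (e.g. the sum of squared category proportions)."""
    return binomtest(matching_pairs, total_pairs, chance_match,
                     alternative="greater").pvalue

# Made-up numbers: 120 matching pairs out of 200, chance match rate 46%
p = agreement_p_value(120, 200, 0.46)
stars = "***" if p < .001 else "**" if p < .01 else "*" if p < .05 else ""
print(f"p = {p:.4g} {stars}")
```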

9 Student Peer Review in Composition Class. Organization. Key: 0 = incomplete, 1 = poor, 4 = superior

10 Organization. Key: 0 = poor / incomplete, 4 = superior

11 Quick End-of-Course Assessments of Writing Ability

13 Portfolio Review. Traits: Rhetorical Knowledge, Critical Thinking, Writing Processes, Knowledge of Conventions, Composing in Electronic Environments, Holistic Score. Scale: 1 = Unacceptable to 6 = Excellent

14 Portfolio Review

16 Case 3: Ratings of Student Effort. 0 = little or no effort, 1 = good effort, 2 = exceptional effort

17 A, B, C Course Grades vs. Ratings of Student Effort

18 Case 4: Course Evaluations. Item: “The instructor of this course explained material clearly” (1 = Strongly Disagree to 5 = Strongly Agree). Data: course evaluations for a STEM major at a public university; Amazon.com on-demand video ratings

19 Course Evaluations vs. Amazon.com on-demand video ratings

20 Case 5: Course Grades

21 Concurrent Grades vs. All Grades

22 Discipline vs. University GPA (Foreign Language vs. University)

23 Discipline vs. University GPA (Unnamed Major vs. University)

24 Agreement and Complexity

25 Calculating Agreement. Q: If five people agree that a pizza is delicious, how many agreements is this? 1. Five, because there are five people agreeing. 2. One, because there is one pizza. 3. Ten, because there are ten distinct pairs of people who agree (4 + 3 + 2 + 1 = 10).
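
The pairwise view behind answer 3 counts agreements as distinct agreeing pairs of raters, so five identical ratings yield C(5, 2) = 10 agreements. A small Python sketch (an illustrative helper, not from the package) that counts agreeing pairs for one rated item:

```python
from collections import Counter
from math import comb

def pairwise_agreements(ratings: list) -> int:
    """Count the distinct pairs of raters who gave the same rating to one item.
    Five identical ratings -> comb(5, 2) = 4 + 3 + 2 + 1 = 10 agreements."""
    return sum(comb(k, 2) for k in Counter(ratings).values())

print(pairwise_agreements(["delicious"] * 5))         # 10
print(pairwise_agreements(["good", "good", "poor"]))  # 1
```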

26 Calculating Asymptotic Agreement. If the first and second ratings of a work are drawn independently from the same distribution, the probability that the two ratings are the same is the sum of the squared category probabilities. Here, match probability = 9% + 36% + 1% = 46%.
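
A short Python version of that calculation (illustrative only; the 9% + 36% + 1% terms correspond to a 30% / 60% / 10% rating distribution):

```python
def asymptotic_agreement(category_probs):
    """Probability that two independent ratings drawn from the same
    distribution fall in the same category: the sum of squared probabilities."""
    return sum(p * p for p in category_probs)

# 30% / 60% / 10% gives 9% + 36% + 1% = 46%
print(f"{asymptotic_agreement([0.30, 0.60, 0.10]):.2%}")  # 46.00%
```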

27 Calculating Correctness: using the distribution of assessments to estimate the probability that a rating is the correct one. Here, the probability that an individual rating is correct = 46%.

28 A Sample Beer Tasting Rubric

29 David.Eubanks@Furman.edu

