Web-Based Student Peer Review: A Research Summary


Web-Based Student Peer Review: A Research Summary
Edward F. Gehringer, Department of Computer Science, North Carolina State University
The Expertiza project has been funded by the National Science Foundation.
Please visit our Web site: http://tinyurl.com/expertiza-site

Credits
- Arlene Russell, Calibrated Peer Review
- Chris Schunn, SWoRD
- Steve Joordens & Dwayne Paré, Peer Scholar
- Eric Ford & Dmytro Babik, Mobius SLIP
- Luca de Alfaro, CrowdGrader
- Helen Hu and David McNaughton, uJudge

Outline
- What's good about peer review?
- Rubrics
- Rating vs. ranking
- Formative vs. summative
- Quality control
- Who reviews whom?
- Online apps for peer review
- Examples of ethical analyses

Advantages of peer review?

Some advantages of peer review
- Feedback is more extensive, quicker, and scalable
- Can't blame the reader!
- Forces students to think metacognitively

Rubrics
Why use a rubric?
- Tells students what to look for
- "Fairness" in assessment
- Students can help create the rubric
How detailed should it be?
- Longer rubric: draws attention to more criteria, but causes more reviewer fatigue and probably less textual feedback
- Shorter rubric: more textual feedback, but students can miss things

Rubric advice

Rating vs. ranking
Should students rate others' work on a Likert scale, or rank the works against each other?
Rating:
- Easier to rate than to rank using a rubric
- Can give two students the same rating
Ranking:
- May not be compatible with face-to-face (F2F) review
- More robust when reviewers are not experts
- Can use a slider to show nearness

Mobius SLIP's approach to ranking

Formative vs. summative peer review
- Formative: textual feedback
- Summative: Likert-scale ratings
Should peer review be used summatively?

Quality control
You can't take review quality for granted. Approaches:
- Metareviewing
- Calibration
- Reputation systems

Metareviewing
"Review the reviewer"; "rate the rater"
Who performs metareviews? Three choices:
- The author?
- The instructor?
- A third party?
Can we automate the process?

Calibration
Basic idea: a training course for reviewers. How well they do → how much credence their reviews get.
- Before students review peers, they are given 3 works to review
- 1 is exemplary; the others have known defects
- Their agreement with the instructor → a Reviewer Competency Index
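A minimal sketch of how such an agreement score might be computed, assuming per-criterion rubric scores on a 1-5 scale. The function and data names are illustrative; this is not CPR's actual Reviewer Competency Index formula.

```python
# Minimal sketch of a calibration computation (illustrative, not CPR's
# actual RCI formula). Assumption: the instructor has scored each
# calibration work on each rubric criterion, and a reviewer's competency
# is the average closeness of their scores to the instructor's
# (1.0 = exact agreement, 0.0 = maximally far apart).

def competency_index(reviewer_scores, instructor_scores, scale_max=5):
    """Average agreement over calibration works and rubric criteria."""
    total, n = 0.0, 0
    for work, inst in instructor_scores.items():
        for criterion, inst_score in inst.items():
            diff = abs(reviewer_scores[work][criterion] - inst_score)
            total += 1 - diff / scale_max   # 1 = agree, 0 = maximally off
            n += 1
    return total / n

# Three calibration works: w1 is exemplary, w2 and w3 have known defects.
instructor = {"w1": {"clarity": 5, "accuracy": 5},
              "w2": {"clarity": 2, "accuracy": 3},
              "w3": {"clarity": 1, "accuracy": 2}}
reviewer   = {"w1": {"clarity": 4, "accuracy": 5},
              "w2": {"clarity": 3, "accuracy": 3},
              "w3": {"clarity": 2, "accuracy": 1}}
print(round(competency_index(reviewer, instructor), 2))  # 0.87
```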

Reputation algorithm
[Figure: two matrices of peer-assigned scores, one per situation, with reviewers r1-r5 as rows and submissions s1-s5 as columns; a dash marks a submission the reviewer did not score.]
- s1 gets the same scores from its reviewers in both situations. Should it get the same grade?
- In one situation, r2 and r3 agree with their co-reviewers, while r1 gives higher scores; so s1's grade may be inflated.
- In the other situation, r1 agrees with his co-reviewers, while r2 and r3 give lower scores; here s1 was reviewed by "harder" graders, and thus deserves a higher grade.
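The slides above motivate correcting for reviewer leniency. A minimal sketch of that idea, assuming a reviewer's bias is their average deviation from co-reviewers on shared submissions; this is illustrative, not the actual algorithm used by Expertiza or any published reputation system.

```python
# Minimal sketch of leniency correction (illustrative only). A
# reviewer's bias is how far their scores sit above or below their
# co-reviewers' on the same submissions; a submission's grade is the
# mean of its bias-corrected scores.

from statistics import mean

def debias(scores):
    """scores: {submission: {reviewer: score}} -> adjusted grades."""
    # 1. Estimate each reviewer's leniency against co-reviewers.
    deltas = {}
    for by_rev in scores.values():
        for rev, score in by_rev.items():
            others = [s for r, s in by_rev.items() if r != rev]
            if others:
                deltas.setdefault(rev, []).append(score - mean(others))
    bias = {rev: mean(ds) for rev, ds in deltas.items()}
    # 2. Grade = mean of scores after subtracting reviewer bias.
    return {sub: mean(score - bias.get(rev, 0.0)
                      for rev, score in by_rev.items())
            for sub, by_rev in scores.items()}

# s1's raw scores (0.6 and 0.3) stay the same, but once r1 is seen to
# run high against co-reviewers, s1's adjusted grade comes down.
scores = {"s1": {"r1": 0.6, "r2": 0.3},
          "s2": {"r1": 0.7, "r2": 0.4, "r3": 0.4}}
print(debias(scores))
```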

Reputation systems: how reliable?
Two studies of Coursera MOOCs (2013):
- Piech et al.: ≥ 26% of grades were 5% or more from "ground truth."
- Kulkarni et al.: 40% of grades were off by one letter grade!
But … assignment difficulty affects accuracy, calibration was brief, there was no metareviewing, and this was, after all, a MOOC.

Reputation systems: CrowdGrader
- Compute a consensus grade: as in the Olympics, discard the highest and lowest ¼ of grades and average the rest
- Compute an accuracy grade, based on how close the student's grades are to the consensus grades
- Compute a helpfulness grade, based on ratings from reviewees (on a scale from -2 to +2)
- Weight these 3 factors as desired by the instructor
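A minimal sketch of these three ingredients, using deliberately simple formulas; CrowdGrader's real computations differ, and the weights shown are illustrative, since the instructor chooses them.

```python
# Sketch of CrowdGrader-style grading (illustrative; the real system's
# formulas differ). The consensus grade is an Olympic-style trimmed
# mean; accuracy rewards closeness to consensus; the final grade is an
# instructor-weighted blend.

def consensus_grade(grades):
    """Trimmed mean: drop the top and bottom 1/4, average the rest."""
    ordered = sorted(grades)
    k = len(ordered) // 4                       # count dropped per end
    kept = ordered[k:len(ordered) - k] if k else ordered
    return sum(kept) / len(kept)

def accuracy_grade(reviewer_grades, consensus, scale_max=100):
    """How close the reviewer's grades are to the consensus, as 0-1."""
    errs = [abs(reviewer_grades[s] - consensus[s]) / scale_max
            for s in reviewer_grades]
    return 1 - sum(errs) / len(errs)

def final_grade(submission, accuracy, helpfulness,
                weights=(0.75, 0.15, 0.10)):
    """Weighted blend; assumes all three grades use a common scale."""
    return (weights[0] * submission + weights[1] * accuracy
            + weights[2] * helpfulness)

print(consensus_grade([55, 70, 72, 75, 78, 80, 81, 95]))   # 76.25
print(accuracy_grade({"a1": 80}, {"a1": 76.25}))           # 0.9625
```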

Who reviews whom?
- Simplest: each student reviews k other students (see the sketch below)
- Reviewing in groups: case studies, etc.
- Individuals review teams
- Dynamic assignment, to make sure everyone gets reviewed
Dynamic assignment may require you to complete one review before it assigns you the next, or it may be done as in Expertiza …
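A minimal sketch of the simplest scheme: a shuffled rotation in which each student reviews k others, so every submission also receives exactly k reviews. This is a generic round-robin, not Expertiza's dynamic assignment.

```python
# Round-robin review assignment (illustrative sketch). Shuffling the
# rotation keeps friends from always reviewing each other; the modular
# offsets guarantee each student both gives and receives k reviews.

import random

def assign_reviews(students, k):
    """Return {reviewer: [k students whose work they review]}."""
    assert 0 < k < len(students)
    order = students[:]
    random.shuffle(order)
    n = len(order)
    return {order[i]: [order[(i + j) % n] for j in range(1, k + 1)]
            for i in range(n)}

print(assign_reviews(["ann", "bob", "cam", "dee", "eli"], k=2))
```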

The PR app landscape
Most widely used: CPR (Calibrated Peer Review)
- Sharable assignments for many disciplines
- But you will probably want to adapt them

SWoRD
Perhaps the most-researched system; it comes from Pitt's Learning Research and Development Center.

Peer Scholar
- Came from the University of Toronto
- Now sold by Pearson in Canada; free (for now) in the US
- Supports (and recommends) revision and resubmission

Mobius SLIP
- Originated in case-study courses
- Based on ranking
- "Double loop"

Expertiza
"Reusable learning objects through peer review"
- Supports signing up for topics or parts of a project
- Students (or the instructor) form teams
- Individuals review teams
- Teammates review each other

Signup sheet

Viewing results

Summary
- Reasons for doing peer review
- Rubrics are important
- Rating vs. ranking
- Formative vs. summative
- Quality control
- Who reviews whom?
- Online apps for peer review
- Examples of ethical analyses