COAS: Drexel University
Waypoint Online Assessment and Structured Peer Review
Andrew J. McCann
Visiting Professor of English, Drexel University
Founder and President, Subjective Metrics, Inc.
www.gowaypoint.com | ajmccann@subjectivemetrics.com | 215.713.9393
Agenda
- Background
- Introduce Waypoint: brief overview; Drexel applications
- Live demonstration: Evaluate; Manage; Libraries
Problem: Feedback
- Handwritten and/or manually typed
- "Lost" once returned to the student
- Little accountability
- Consistency issues
- No data
Needed: not AI, but a technological tool
Waypoint: Evaluation Tool
- Encourages pre-written feedback: clearer, more detailed explanations of key concepts
- Facilitates sharing of assessments among instructors
- Quantifies evaluations by skill
- Archives all feedback
- Web-based: cross-platform, backed up, zero maintenance
Waypoint Process
Develop Assessment
Receive Paper or Exam (Hard copy shown here)
Evaluate Against Skills
Respond to Student
Snapshot Analysis
Longitudinal Cohort Analysis
Process Summary
Drexel Applications
- tDec Humanities (Dr. Valarie Arms): evaluation of all major writing assignments (600 students)
- Writing Center (Harriet Millan): customized assessments for WITs
- College of Engineering (Kevin Scoles & Adam Fontecchio): evaluation of lab reports with WITs and TAs
- College of Business (Frank Linnehan): structured response to student writing and accreditation data generation
- Engineering Management (Mike Scheuerman): peer review
Demonstration (www.gowaypoint.com)
- Evaluate: "Final Report"; peer review
- Manage: quantification; sorting; data analysis
Additional Features
- Self-assessment
- Collaborative assessment: multiple instructors can contribute to an evaluation
Questions and Discussion
Testimonials
"I've been doing this for twenty years and have my own system of grading and commentary…and found little need to improve. And then Waypoint came along. Somehow, and I really can't even explain it, my grading time has been cut down by more than half and my students are actually thanking me for the in-depth commentary." - Professor Ken Bingham
"Usually I have trouble criticizing a peer's paper if I'm not given certain criteria to judge. During this peer review, I was actually focused and excited about judging a peer's paper." - Erin Williams, COE 2008
"If this peer review program was available in high school, I would have probably done a lot better in English." - Ed Itaas, COE 2008
Testimonials
"An innovative teacher-friendly, student-friendly, efficient approach to grading writing—the most creative and time-saving method for evaluating writing I have ever found or used." - Gayle, a high school English teacher with 35 years' experience
"I don't know how I ran a writing program for three years without it." - Harriet Millan, Director of the University Writing Program
Quantitative Data
Anonymous survey of the peer-review process: 114 freshman engineers (51 used Waypoint, 63 the "old" method).
Scale: 1-5 (1 = strongly disagree, 5 = strongly agree).

Survey item                                                                  Waypoint mean (N=51)   "Old" mean (N=63)   p value (t-test, 95% CL)
1. I received clear and helpful criticism of my draft.                              3.90                  3.52               p < 0.025
2. Evaluating other students' papers helped me better understand
   the assignment and the play.                                                     3.86                  3.48               p < 0.02
3. I made significant changes to my first draft based on peer
   review feedback.                                                                 3.81                  3.25               p < 0.005
4. I made significant changes to my first draft independent of
   peer review feedback.                                                            3.49                  3.43               p = 0.38
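For readers who want to reproduce this kind of comparison, a two-sample t statistic can be computed directly from group summary statistics. The slide reports only means and group sizes, so the standard deviations below (1.0 for both groups) are hypothetical placeholders, and the p value uses a normal approximation, which is reasonable at these sample sizes (df > 100). This is a sketch of the method, not a reconstruction of the study's actual calculation.

```python
import math

def welch_t_from_stats(m1, s1, n1, m2, s2, n2):
    """Welch's two-sample t statistic from summary statistics
    (means m, standard deviations s, group sizes n)."""
    se = math.sqrt(s1**2 / n1 + s2**2 / n2)
    return (m1 - m2) / se

def approx_two_sided_p(t):
    """Two-sided p value via the normal approximation
    (adequate when degrees of freedom exceed ~100)."""
    return math.erfc(abs(t) / math.sqrt(2))

# Survey item 1: Waypoint group (mean 3.90, N=51) vs. "old" method
# (mean 3.52, N=63). The SDs of 1.0 are assumed, not reported.
t = welch_t_from_stats(3.90, 1.0, 51, 3.52, 1.0, 63)
p = approx_two_sided_p(t)
```

With these assumed SDs the difference on item 1 is significant at the 0.05 level; the exact p value in the slide depends on the real (unreported) standard deviations.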