Sterling Practices in Design & Scoring of Performance-Based Exams #156 F. Jay Breyer Presented at the 2005 CLEAR Annual Conference.

1 Sterling Practices in Design & Scoring of Performance-Based Exams #156 F. Jay Breyer Jay.breyer@thomson.com Presented at the 2005 CLEAR Annual Conference September 15-17 Phoenix, Arizona

2 The Players: F. Jay Breyer, PhD, Thomson Prometric; Ron Bridwell, PE, National Council of Examiners for Engineering & Surveying; Beth Sabin, DVM, PhD, American Veterinary Medical Association; Ron Rodgers, PhD, CTS/Employment Research & Development; Elizabeth Witt, PhD, American Board of Emergency Medicine.

3 Scoring Procedures for STRUCTURAL II, September 2005. Ron Bridwell, P.E.

4 Introduction; The Exam; Before the Scoring Session; The Scoring Process; The Cut Score (Passing Point) Process.

8 The Exam. Scoring Protocol Development: the STR II scoring guidelines need to be standardized using a benchmark holistic method, because scoring can drift due to fatigue or anger. Scoring Criteria Development: criteria are developed by the exam committee as the problems are developed, since candidates may respond differently.

9 Before the Scoring Session. Tasks: Identifying Scoring Committee Members. Members chosen are those most familiar with the problems; coordinators work with staff; the committee is empowered to modify the scoring criteria as needed.

10 Before the Scoring Session. Tasks: Identify Sample Papers. 5 benchmarks for training; range finders for training; 5 benchmarks for certification.
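
A minimal sketch of assembling such a sample set, assuming committee-scored papers on a 0-10 holistic scale; the selection rule, paper IDs, and scores below are illustrative assumptions (the slide specifies only the counts: five benchmarks each for training and certification, plus range finders).

```python
import random

random.seed(2005)

# Hypothetical committee-scored papers on an assumed 0-10 holistic scale.
scored_papers = [(f"P{i:03d}", random.randint(0, 10)) for i in range(1, 61)]

# Range finders: one exemplar paper per score point, to anchor training.
range_finders = {}
for paper_id, score in scored_papers:
    range_finders.setdefault(score, paper_id)

# Remaining papers supply the training and certification benchmarks.
used = set(range_finders.values())
remaining = [pid for pid, _ in scored_papers if pid not in used]
training_benchmarks = remaining[:5]
certification_benchmarks = remaining[5:10]

print("Range finders by score:", dict(sorted(range_finders.items())))
print("Training benchmarks:", training_benchmarks)
print("Certification benchmarks:", certification_benchmarks)
```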

11 The Scoring Process. Tasks: Certifying Scorers. 5 benchmark papers are given to scorers as a test, graded pass or fail; scorers have two chances to be certified. Training the Scorers: scorers should be skilled at assigning scores to specific problems; scorers are trained with the benchmark papers.
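
The slide does not state the certification criterion, so the sketch below assumes a scorer passes when every benchmark score falls within one point of the committee's reference score; the tolerance, scores, and two-attempt example are illustrative only.

```python
# Hypothetical certification check: the scorer's benchmark scores must stay
# within an assumed tolerance of the committee's reference scores.
def certify_scorer(scorer_scores, reference_scores, tolerance=1, max_misses=0):
    """Return True if the scorer's benchmark scores are acceptable."""
    misses = sum(
        1 for given, ref in zip(scorer_scores, reference_scores)
        if abs(given - ref) > tolerance
    )
    return misses <= max_misses

# Example: five benchmark papers, two attempts allowed.
reference = [4, 6, 8, 5, 7]
attempt_1 = [4, 5, 8, 7, 7]   # one score off by 2 -> fails under these assumptions
attempt_2 = [4, 6, 8, 5, 6]   # all within 1 point  -> passes

for attempt, scores in enumerate([attempt_1, attempt_2], start=1):
    status = "certified" if certify_scorer(scores, reference) else "not certified"
    print(f"Attempt {attempt}: {status}")
```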

12 The Scoring Process. Tasks: Scoring. Care is taken to ensure the scorers do not know the names or jurisdictions of the examinees; papers are scored blind, as if by machine. Each paper is scored by two scorers. If the two scores agree or differ by no more than 1, the (averaged) score is assigned; if they differ by more than 1, the coordinator adjudicates. Any scorer could be replaced by any other and the same score would result. A database provides feedback.
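
The two-scorer resolution rule on this slide (agreement or a one-point difference yields the averaged score, anything larger goes to the coordinator) can be sketched as follows; the paper IDs and score values are made up for illustration.

```python
# Minimal sketch of the resolution rule described on the slide.
def resolve(score_a, score_b):
    """Return (final_score, needs_adjudication)."""
    if abs(score_a - score_b) <= 1:
        return (score_a + score_b) / 2, False
    return None, True   # coordinator adjudicates

# Illustrative papers: (paper_id, first score, second score).
papers = [("P001", 6, 6), ("P002", 5, 6), ("P003", 4, 7)]
for paper_id, a, b in papers:
    final, adjudicate = resolve(a, b)
    if adjudicate:
        print(f"{paper_id}: scores {a}/{b} -> sent to coordinator")
    else:
        print(f"{paper_id}: scores {a}/{b} -> final score {final}")
```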

13 Kinds of Information. Monitoring Solution for Fair Scoring: Report Components (Adjudication, Resolution, Training). Discrepancy Agreement Summary: shows how many papers were scored; shows consistency; shows consistency of each scorer paired with all partners; useful for scoring reliability, aggregate & separate. Bookkeeping: records; number, mean, SD across the entire test for each scorer and coordinator; number, mean, SD read by each scorer & coordinator by problem. Road to Fair & Quality Scores: number of papers to be adjudicated; total required adjudications by scorer; re-training may be necessary if too many.
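
A rough sketch of the per-scorer portion of such a monitoring report (papers scored, mean, SD, and adjudications by scorer); the record layout, scorer IDs, and scores are assumptions, not the actual report format.

```python
from collections import defaultdict
from statistics import mean, pstdev

# (paper_id, scorer_id, score, needed_adjudication) -- layout is assumed.
records = [
    ("P001", "S1", 6, False), ("P001", "S2", 6, False),
    ("P002", "S1", 5, False), ("P002", "S3", 6, False),
    ("P003", "S2", 4, True),  ("P003", "S3", 7, True),
]

# Group each scorer's awarded scores and adjudication flags.
by_scorer = defaultdict(list)
for _, scorer, score, adjudicated in records:
    by_scorer[scorer].append((score, adjudicated))

print(f"{'Scorer':<8}{'Papers':>8}{'Mean':>8}{'SD':>8}{'Adjud.':>8}")
for scorer, rows in sorted(by_scorer.items()):
    scores = [s for s, _ in rows]
    adjudications = sum(1 for _, flagged in rows if flagged)
    print(f"{scorer:<8}{len(scores):>8}{mean(scores):>8.2f}"
          f"{pstdev(scores):>8.2f}{adjudications:>8}")
```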

14 Overview of Standard Setting Process. Definition of competence; 3 uniform solution samples selected; training undertaken; practice session; real rating; report results; assign candidates to PASS/FAIL status based on comparison of total performance to the Standard.
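
The final step, comparing each candidate's total performance to the Standard, might look like the sketch below; the cut score value, the pass-at-the-cut rule, and the candidate totals are assumed for illustration.

```python
# Assumed cut score and candidate totals, purely for illustration; the slide
# only states that total performance is compared to the Standard.
CUT_SCORE = 48

candidates = {"C-1001": 52, "C-1002": 48, "C-1003": 45}

for candidate_id, total in candidates.items():
    # Assumes a total equal to the cut score passes.
    status = "PASS" if total >= CUT_SCORE else "FAIL"
    print(f"{candidate_id}: total {total} -> {status}")
```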

15 Questions?

