How to Fail a Student. Lisa M. Beardsley-Hardy, PhD, MPH, MBA, Director of Education, General Conference of Seventh-day Adventists.

Presentation transcript:

How to Fail a Student. Lisa M. Beardsley-Hardy, PhD, MPH, MBA, Director of Education, General Conference of Seventh-day Adventists

Mene, Mene, Tekel, Parsin
Mene: God has numbered the days of your reign and brought it to an end.
Tekel: You have been weighed on the scales and found wanting.
Peres: Your kingdom is divided and given to the Medes and Persians. (Dan 5:26-28 NIV)

How to Fail a Student
• Absolute cut-off scores
• Norm-referenced cut-off score (e.g., a cut set 1 SD below the mean fails about 16% of the class regardless of competency; see the sketch below)
• Criterion-referenced cut-off score
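A minimal sketch, not from the slides, contrasting the first two approaches. The class scores and the absolute cut-off of 70 are hypothetical; the point is that a norm-referenced cut set at mean minus 1 SD moves with the class and fails a roughly fixed share of it, however competent the class is.

```python
# Contrast an absolute (fixed) cut-off with a norm-referenced cut-off
# set 1 SD below the class mean. All numbers here are hypothetical.
import statistics

scores = [88, 74, 91, 67, 80, 72, 85, 59, 78, 83]  # hypothetical class scores

absolute_cut = 70                                             # fixed passing score
norm_cut = statistics.mean(scores) - statistics.stdev(scores)  # mean - 1 SD

print(f"absolute cut-off:        {absolute_cut}")
print(f"norm-referenced cut-off: {norm_cut:.1f}")
print("fail under absolute cut:", [s for s in scores if s < absolute_cut])
print("fail under norm cut:    ", [s for s in scores if s < norm_cut])
```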

How to Fail a Student
• Provide experience with assessment and feedback.
• Sample multiple times.
• Plan ahead for leniency so that it remains fair to other students (e.g., require the final paper in week 13 of the semester) to allow grace.
• Use multiple methods (multi-trait, multi-method sampling).
• Determine and document reliability (a sketch follows below).
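One common way to determine and document reliability for a test scored right/wrong per item is KR-20 (Kuder-Richardson formula 20). The slide does not name a specific index, so this is an assumed illustration; the kr20 helper and the 4-examinee, 5-item data matrix are hypothetical.

```python
# KR-20 reliability for dichotomously scored items (hypothetical sketch).
def kr20(item_matrix):
    """item_matrix: one row per examinee, each row a list of 0/1 item scores."""
    n_items = len(item_matrix[0])
    n_people = len(item_matrix)
    # proportion correct per item, and the sum of p*q across items
    p = [sum(row[i] for row in item_matrix) / n_people for i in range(n_items)]
    pq_sum = sum(pi * (1 - pi) for pi in p)
    # variance of examinees' total scores
    totals = [sum(row) for row in item_matrix]
    mean_total = sum(totals) / n_people
    var_total = sum((t - mean_total) ** 2 for t in totals) / (n_people - 1)
    return (n_items / (n_items - 1)) * (1 - pq_sum / var_total)

exam = [
    [1, 1, 0, 1, 1],
    [1, 0, 0, 1, 0],
    [1, 1, 1, 1, 1],
    [0, 0, 0, 1, 0],
]
print(round(kr20(exam), 2))  # ≈ 0.92 for this made-up data
```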

"Empirical rule," or the 68-95-99.7 rule: in a normal distribution, about 68% of scores fall within 1 SD of the mean, 95% within 2 SD, and 99.7% within 3 SD.
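A quick check of those figures with the standard normal distribution, tying the empirical rule back to the "mean minus 1 SD" norm-referenced cut above (the connection is assumed, not stated on the slide).

```python
# Verify the empirical-rule percentages with the standard normal distribution.
from statistics import NormalDist

print(NormalDist().cdf(-1))                        # ≈ 0.159: share below mean - 1 SD
print(NormalDist().cdf(1) - NormalDist().cdf(-1))  # ≈ 0.683: share within 1 SD
print(NormalDist().cdf(2) - NormalDist().cdf(-2))  # ≈ 0.954: share within 2 SD
```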

Nedelsky Method
• Determine the minimally competent or borderline "F-D" student.
• For each item, divide 1 by the number of answer options the borderline test taker must guess among (after eliminating the options he or she can rule out).
• Sum the expected item scores and divide by the number of questions on the exam to obtain the cut score.

Nedelsky example
My music teacher thinks that Marian Anderson sings ________ any other contralto he has ever heard.
(A) more well than  (B) better than  (C) the best of  (D) more better over
The borderline student can eliminate A and D and must guess between B and C, so the expected item score is 1/2 = .50 (a worked sketch follows below).
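A worked sketch of the Nedelsky arithmetic described on the previous slide, assuming a judge has recorded, for each item, how many options the borderline student cannot eliminate. The nedelsky_cut_score helper and the 5-item exam are hypothetical illustrations.

```python
# Nedelsky cut score: expected item score = 1 / (options remaining after
# elimination); the cut score is the mean of these across the exam.
def nedelsky_cut_score(remaining_options_per_item):
    expected = [1 / n for n in remaining_options_per_item]
    return sum(expected) / len(expected)

# The slide's item leaves 2 plausible options (B and C), giving 1/2 = .50.
# Suppose a hypothetical 5-item exam has these remaining-option counts:
print(round(nedelsky_cut_score([2, 2, 3, 4, 2]), 2))  # ≈ 0.42, i.e. a 42% cut score
```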

Modified Nedelsky Method
• Determine the minimally competent or borderline "F-D" student.
• For each item, divide 2 by (2 + the number of answer options the test taker must guess among).
• Sum the expected item scores and divide by the number of questions on the exam (see the sketch below).
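The same hypothetical exam with the modified item formula as I read it from this slide: expected item score = 2 / (2 + options remaining after elimination).

```python
# Modified Nedelsky item score (assumed reading of the slide's formula).
def modified_nedelsky_cut_score(remaining_options_per_item):
    expected = [2 / (2 + n) for n in remaining_options_per_item]
    return sum(expected) / len(expected)

print(round(modified_nedelsky_cut_score([2, 2, 3, 4, 2]), 2))  # ≈ 0.45
```

On this made-up exam the modified formula credits a somewhat higher expected score on items where few options can be eliminated, so the resulting cut score comes out slightly above the standard Nedelsky value in the previous sketch.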

Modified Angoff Method
• Requires a copy of the test, or at least a wide selection of items from the test item pool.
• Each judge is asked to state the probability that the "minimally proficient person" would answer each item correctly, or equivalently the number of borderline students out of 100 who would be expected to get the item correct.
• Multiple rounds of ratings and discussion are conducted.
• The probabilities are then summed, and the total is the cut score (a sketch of the arithmetic follows below).
• The ratings and discussion must be conducted separately for each cut score being determined.
• Used in NAEP, CLEP, and many state programs, particularly on tests that are entirely multiple choice.
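A sketch of the modified Angoff arithmetic: item ratings are averaged across judges, and the item averages are summed to give the raw cut score. The three judges and five items below are hypothetical; operational programs such as NAEP or CLEP use far more items and multiple rating rounds.

```python
# Hypothetical modified Angoff panel: each row is one judge's probability
# ratings for five items (chance a minimally proficient examinee answers
# each item correctly).
judge_ratings = [
    [0.60, 0.75, 0.40, 0.90, 0.55],  # judge 1
    [0.65, 0.70, 0.45, 0.85, 0.50],  # judge 2
    [0.55, 0.80, 0.35, 0.95, 0.60],  # judge 3
]

n_items = len(judge_ratings[0])
item_means = [sum(judge[i] for judge in judge_ratings) / len(judge_ratings)
              for i in range(n_items)]
cut_score = sum(item_means)  # expected raw score of the borderline examinee

print(round(cut_score, 2))   # 3.2 out of 5 items, i.e. a 64% passing standard
```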

Konosuke Matsushita, Matsushita Company (ticker MC), now Panasonic (phy/code/04.html)

Evidence of Closing the Loop
• Broad discussion of outcomes data
• Discussion of the implications of the data and an outline of the institutional learning agenda
• Documentation of institutional learning in committee minutes, the annual departmental report, and reports to students, BT, and accreditation bodies

Discussion
• Observations
• Questions
• Recommendations
• Next steps

Faculty Development
Purchase or download A Primer on Setting Cut Scores on Tests of Educational Achievement from df/Cut_Scores_Primer.pdf