
1 Assessing Educational Effectiveness
Lisa M. Beardsley-Hardy, PhD, MPH, MBA
Director of Education, General Conference of Seventh-day Adventists

2 Why Assessment Matters
- Provide feedback
- Direct efforts towards learning outcomes
  - Student
  - Teacher
- Demonstrate accountability
- Basis for curriculum refinement and institutional learning

3 Goal of Education
- High mean, small SD
- Junior medicine clerkship, first- and third-year Internal Medicine residents
- Vital exam results for sophomore medical students

4 Role of Feedback
- Essential cognitive activity to ensure accuracy in acquisition of new knowledge and skill (summarize, elaborate, monitor for accuracy)
  - Radiology: practice does not make perfect without feedback!

5 Direct Efforts to Learning Outcomes
- Student
- Teacher
- Determine the competencies required at the next level and assess that knowledge and those skills
- “We’re getting better students”

6 Demonstrate Accountability
- Students
- Parents
- Board of Trustees
- Donors
- Accrediting agencies

7 Basis for Curriculum Refinement
- Course (instructor)
- Major (department faculty)
- General Education (interdisciplinary)
- Degree (faculty senate, graduate council, academic affairs)
- Alverno College: videotaped interviews
- CSU Monterey Bay: capstone projects http://csumb.edu/site/x13842.xml
- University of Illinois: Objective Structured Clinical Exams and Senior Comps

8 What to Measure?
- Knowledge
- Skills (procedures, clinical skills, complex behavioral sequences, interpersonal, teamwork)
- Problem-solving skills (PBL)
- Attitudes
- Values
- Spiritual commitment

9 Knowledge
- MCQ
- Norm-referenced vs. criterion-referenced (Nedelsky or Angoff method)
- Essay, journal, thesis
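
A minimal sketch of the Angoff calculation mentioned above, with made-up judge ratings and items: each judge estimates the probability that a minimally competent examinee answers each item correctly, and the cut score is the sum of the mean ratings across items.

```python
# Illustrative Angoff standard-setting calculation (hypothetical ratings).
# Each row: one judge's estimated probabilities that a minimally competent
# examinee answers each item correctly.
judge_ratings = [
    [0.70, 0.55, 0.80, 0.60, 0.90],   # Judge A
    [0.65, 0.60, 0.75, 0.55, 0.85],   # Judge B
    [0.75, 0.50, 0.85, 0.65, 0.95],   # Judge C
]

n_items = len(judge_ratings[0])

# Mean rating per item across judges.
item_means = [
    sum(judge[i] for judge in judge_ratings) / len(judge_ratings)
    for i in range(n_items)
]

# Angoff cut score = sum of item means, i.e. the expected raw score
# of the minimally competent examinee.
cut_score = sum(item_means)
print(f"Cut score: {cut_score:.2f} of {n_items} items "
      f"({100 * cut_score / n_items:.0f}%)")
```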

10 Skills
- Objective Structured Clinical Exam (OSCE)
- Multi-station, composed of subscales
- Assess interpersonal skills (subjective)
- Recording by standardized patient

11 OSCEs
- Reduce test variance (standardized patients/problems)
- Reduce rater variance (eagles and doves)
- Reduce environmental variance (same time, testing conditions)
- Reduce cheating (proctors, application of knowledge)
- INCREASE variance due to the student (see the sketch below)
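
To illustrate that reasoning, here is a hypothetical simulation (all effect sizes invented) in which each score is modeled as student ability plus rater severity plus random error; shrinking the rater and error components, as OSCE standardization aims to do, raises the share of observed-score variance attributable to the student.

```python
import numpy as np

rng = np.random.default_rng(0)

def student_variance_share(rater_sd, noise_sd, n_students=500, n_raters=8):
    """Toy model: score = student ability + rater severity + random error.
    Returns the share of observed-score variance due to the students."""
    students = rng.normal(0, 1.0, n_students)        # true ability, SD = 1
    raters = rng.normal(0, rater_sd, n_raters)       # rater severity effects
    assigned = rng.integers(0, n_raters, n_students) # one rater per student
    scores = students + raters[assigned] + rng.normal(0, noise_sd, n_students)
    return np.var(students) / np.var(scores)

# Unstandardized exam: large rater and environment effects.
print(f"loose exam:   {student_variance_share(rater_sd=1.0, noise_sd=1.0):.2f}")
# OSCE-style standardization: rater and environment effects reduced.
print(f"standardized: {student_variance_share(rater_sd=0.2, noise_sd=0.3):.2f}")
```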

12 Problem-Solving Skills
- Triple Jump (assign a problem and assess the problem-solving strategy; assess the outcome after 24 hours)
- Modified essay exam (progressively provide additional information)
- Peer assessment of teamwork
- Problems requiring application of knowledge and skills

13 O = T + E
- The observed score (O) is composed of the true score (T) plus error (E) (classical test theory)
- Basis for calculation of Cronbach's alpha and reliability coefficients such as KR-20
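
A minimal sketch of those reliability coefficients, using a made-up matrix of dichotomous item responses (rows = examinees, columns = items): Cronbach's alpha is k/(k-1) * (1 - sum of item variances / variance of total scores), and KR-20 is its special case for 0/1 items, with p_i(1 - p_i) as the item variances.

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha: k/(k-1) * (1 - sum(item variances) / total variance).
    `scores` is an examinees x items array."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1)       # variance of each item
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

def kr20(scores):
    """KR-20: alpha's special case for dichotomous (0/1) items,
    k/(k-1) * (1 - sum(p_i * q_i) / total variance)."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    p = scores.mean(axis=0)                      # proportion correct per item
    total_var = scores.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - (p * (1 - p)).sum() / total_var)

# Hypothetical 0/1 item responses: 6 examinees x 5 items.
responses = [
    [1, 1, 1, 0, 1],
    [1, 0, 1, 0, 1],
    [0, 0, 1, 0, 0],
    [1, 1, 1, 1, 1],
    [0, 1, 0, 0, 1],
    [1, 1, 1, 0, 0],
]
print(f"alpha = {cronbach_alpha(responses):.2f}, KR-20 = {kr20(responses):.2f}")
```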

14 Fair Exams
“Do not use dishonest standards when measuring length, weight or quantity. Use honest scales and honest weights, honest multiple choice questions and an honest comprehensive exam. I am the LORD your God, who brought you out of Egypt.” (Lev. 19:35-36, ephah and hin in NIV)

15 Sources of Error
- Student (illness, mistakes)
- Teacher (eagles vs. doves)
- Environment (noise, distractions, defective equipment)
- Testing instrument (ambiguity in items, two distracters correct)
- Cheating

16 Sources of Error: Cheating
- Prevention: integrity statement in the syllabus (covering plagiarism, appropriate testing behavior, and methods of prevention such as electronic monitoring of papers and videotaping)
- Control the testing environment
- Confirm suspicions in writing; involve legal counsel
- Deal with substantiated incidents as part of the student's education

17 Fair Exams
- Use a test blueprint
- Establish a cut-off score
- Provide an opportunity to review test results (after scoring but before the final cut-off score is set)
- Use item analysis to modify the cut-off
- Apply the standard error of measurement (SEM) for high-stakes exams (see the sketch below)
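
As an illustration of the last point, a minimal sketch of applying the SEM around a cut-off score (the SD, reliability, and cut score below are invented): SEM = SD * sqrt(1 - reliability), and scores within one SEM of the cut-off fall inside measurement error.

```python
import math

def standard_error_of_measurement(sd, reliability):
    """SEM = SD * sqrt(1 - reliability)."""
    return sd * math.sqrt(1 - reliability)

# Hypothetical exam statistics.
sd = 8.0            # standard deviation of total scores
reliability = 0.85  # e.g., Cronbach's alpha or KR-20
cut_score = 60.0

sem = standard_error_of_measurement(sd, reliability)
print(f"SEM = {sem:.1f} points")
# Pass/fail decisions within one SEM of the cut-off are made within
# measurement error and may warrant review or retesting.
print(f"Review band: {cut_score - sem:.1f} to {cut_score + sem:.1f}")
```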

18 Test Blueprint
- Based on learning objectives
- Determines test construction (items sampled and weight assigned to subscales; see the sketch below)
- Basis for retesting
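
As an illustration of blueprint-driven test construction, the hypothetical sketch below samples items from an item bank according to per-objective counts; the objectives, item IDs, and counts are all invented.

```python
import random

random.seed(1)

# Hypothetical item bank: objective -> available item IDs.
item_bank = {
    "define":    ["D1", "D2", "D3", "D4"],
    "interpret": ["I1", "I2", "I3", "I4", "I5"],
    "perform":   ["P1", "P2", "P3"],
    "evaluate":  ["E1", "E2", "E3", "E4"],
}

# Blueprint: number of items per objective (the weight given to each subscale).
blueprint = {"define": 2, "interpret": 3, "perform": 1, "evaluate": 2}

def assemble_form(bank, blueprint):
    """Sample the required number of items for each objective."""
    return {obj: random.sample(bank[obj], n) for obj, n in blueprint.items()}

main_test = assemble_form(item_bank, blueprint)
# The same blueprint governs a make-up test, so retesting covers the
# same objectives (ideally with different items drawn from the bank).
print(main_test)
```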

19 Test Blueprint Mirrors Objectives
Objective: Upon completion of this course (program) the student will be able to:
                    Main test    Make-up test
1. define…
2. interpret…
3. perform a…
4. evaluate…
5. determine…
6. solve…
7. create…

20 Mene, Mene, Tekel, Parsin
- Mene: God has numbered the days of your reign and brought it to an end.
- Tekel: You have been weighed on the scales and found wanting.
- Peres: Your kingdom is divided and given to the Medes and Persians. (Dan. 5:26-28 NIV)

21 Discussion
- Observations
- Questions
- Recommendations
- Next steps

