Marishiel Mejia-Samonte, MD

Mini-CEX: Is it an X? Marishiel Mejia-Samonte, MD

The use of a variety of different assessment methods has been characteristic of medical education.
Gone are the days when the medical knowledge and clinical skills of doctors were assessed solely by written and oral examinations:
  Written examinations: open-ended questions graded by hand
  Oral examinations: the student went to a patient's bedside, gathered the information, and then presented a diagnosis and treatment plan to assessors, who asked questions and made judgements about the performance

New methods have been developed that focus on clinical skills such as:
  History-taking and performing physical examinations
  Communication skills
  Procedural skills
  Professionalism
These are competencies that cannot be tested well by written examinations, or by examinations in which the student-patient encounter is unobserved.

Selection of Assessment Method: Quantifiable Criteria
  Validity: the degree to which the inferences made about medical competence based on assessment scores are correct
  Reliability (generalizability): a measure of the relative magnitude of variability in scores due to error, with the aim of achieving a desired level of measurement precision (an illustrative formula follows below)
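An illustrative note from classical test theory: reliability can be read as the proportion of observed-score variance that reflects true differences between trainees rather than measurement error, and the standard error of measurement (SEM) follows from it:

R = \sigma^2_{\mathrm{true}} / (\sigma^2_{\mathrm{true}} + \sigma^2_{\mathrm{error}}), \qquad \mathrm{SEM} = \mathrm{SD} \sqrt{1 - R}

On this reading, a reliability of 0.8 means that roughly 20% of the variability in scores is attributable to error.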

Selection of Assessment Method: Not Readily Quantifiable Criteria
  Educational effect: capitalizes on the students' motivation to do well and directs their study efforts in support of the curriculum
  Feasibility: the degree to which the assessment method selected is affordable and efficient for the testing purpose
  Acceptability: the extent to which stakeholders in the process (students, faculty, patients) endorse the measure and the associated interpretation of scores

Simulations
Increasingly used in medical education to ensure that examinees can demonstrate integration of prerequisite knowledge, skills, and affect in a realistic setting:
  Standardized patients
  Computer-based simulations and computer programs
  Model-driven simulations
  Virtual reality

Issues Associated with Simulations
  Fidelity
  Equivalence
  Standardization
  Reliability
  Case generation
  Security

Simulation provides a means of beginning to assess these skills, but real patients often:
  have more complex problems
  are more acutely ill
  demand more skill than can be simulated by modern technology

Work-Based Assessment
The educational mission dictates that the methods chosen for assessment:
  protect the safety of patients
  provide the opportunity for educational feedback to the trainee

Why Choose Work-Based Assessment?
Training of doctors occurs in the setting of patient care.
Ensuring that the type and complexity of the patient-care problems doctors face during training match those encountered in practice is both a challenge and an opportunity.

Work-Based Assessment
Trainees confront a broad array of health-care problems and, just like doctors in practice, they are required to integrate all of their skills in response.
Those involved in the actual development of assessment systems will need to consider a number of issues.

Designing a System of Assessment
  Step 1: Define the content to be assessed and the focus of the assessment
  Step 2: Define the purpose of the assessment
  Step 3: The blueprinting process
  Step 4: Choose or develop methods
  Step 5: Train assessors
  Step 6: Standard setting
  Step 7: Reporting and review system

Mini Clinical Evaluation Exercise (Mini-CEX)
  A method for simultaneously assessing the clinical skills of trainees and offering them feedback on their performance
  A simple modification of the traditional bedside oral examination
  Relies on the use of real patients and the judgements of skilled clinician-educators

How does the original CEX work?
  A faculty member observes the trainee interact with a patient in any of a variety of settings.
  The trainee conducts a focused history and physical examination and, after the encounter, provides a diagnosis and treatment plan.
  The faculty member scores the performance using a structured document and then provides educational feedback.

How does the mini-CEX work?
  Encounters are relatively short, about 15 minutes.
  They occur as a routine part of the training program.
  Trainees are observed on several different occasions by different faculty examiners.
  Each encounter should represent a different clinical problem, appropriately sampled from the list of patient problems, to give proper representation of the range of clinical problems.

Competence Descriptors of a Satisfactory Trainee
History Taking
  Facilitates the patient's telling of the story
  Effectively uses appropriate questions to obtain accurate, adequate information
  Responds appropriately to verbal and nonverbal cues
Physical Examination
  Follows an efficient, logical sequence
  Examination is appropriate to the clinical problem
  Explains to the patient
  Sensitive to the patient's comfort and modesty

Competence Descriptors of a Satisfactory Trainee
Professionalism
  Shows respect, compassion, empathy
  Establishes trust
  Attends to the patient's needs for comfort, respect, confidentiality
  Behaves in an ethical manner
  Awareness of relevant legal frameworks
  Aware of limitations
Clinical Judgement
  Makes an appropriate diagnosis and formulates a suitable plan
  Selectively orders/performs appropriate diagnostic studies
  Considers risks and benefits

Competence Descriptors of a Satisfactory Trainee
Communication Skills
  Explores the patient's perspective
  Jargon-free
  Open and honest
  Empathetic
  Agrees the management plan/therapy with the patient
Organization/Efficiency
  Prioritizes
  Timely
  Summarizes

Competence Descriptors of a Satisfactory Trainee
Overall Clinical Care
  Demonstrates satisfactory clinical judgement, synthesis, caring, effectiveness, efficiency
  Appropriate use of resources
  Balances risks and benefits
  Awareness of own limitations

Strengths of the CEX
  It evaluates the trainee's performance with a real patient. In medical school the Objective Structured Clinical Examination (OSCE) is often used, and it does an excellent job of assessing clinical skills. As trainees approach entry to practice, however, their education and assessment need to be based on performance with real patients who exhibit the full range of conditions seen in the clinical setting.
  The trainee is observed by a skilled clinician-educator who both assesses the performance and provides educational feedback. This enhances the validity of the results and ensures that the trainee receives the type of constructive criticism that should lead to a reduction in errors and an improvement in the quality of care.
  The CEX presents trainees with a complete and realistic clinical challenge. They have to get all of the relevant information from the patient, structure the problem, synthesise their findings, create a management plan, and communicate this in both oral and written form.

Weaknesses of the CEX
  Standards to follow
  Alternative assessments
  Selection of assessors
  Equivalence
The research showed that trainees' performances with one patient were not a very good predictor of their performances with other patients. Consequently, trainees needed to be observed on different occasions with different patients before reliable conclusions about their competence could be drawn. Observing each trainee with several patients was also desirable from an educational perspective, since different patients require different skills from trainees and this significantly broadens the range and richness of the feedback they receive.
The research also showed that assessors did not agree with each other, even when they were observing exactly the same performance. Training of assessors helps to some degree, but much larger improvements in the reliability and validity of the ratings were achieved by including different faculty members in the overall assessment of each trainee. This was also useful from an educational perspective, since trainees received feedback from different assessors, each with their own specialties, strengths, and perspectives.
In terms of the method itself, the CEX focused on the trainee's ability to be thorough with a single new patient in a hospital setting, uninfluenced by time constraints. In contrast, different patients pose different challenges, and the tasks or competencies required of doctors vary considerably depending on the setting in which care is rendered. Further, most patient encounters are much shorter than two hours, so the CEX does not assess the trainee's ability to focus and prioritise diagnosis and management.

Evidence on the Utility of the Mini-CEX
Reliability
  The original studies showed a reliability of 0.8 with 12 to 14 raters; more recent studies have shown reliable results with as few as eight raters (an illustrative calculation follows below).
  The narrow standard error of measurement (SEM) suggests that trainees with clearly high (or low) initial scores may need as few as four encounters, so further assessment can be focused on trainees with borderline results.
  Norcini demonstrated good inter-rater reliability, with no large differences in ratings between examiners and across settings.
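The relationship between the number of observed encounters and overall reliability can be sketched with the Spearman-Brown prophecy formula, which projects the reliability of an average across several encounters from the reliability of a single encounter. This is a minimal illustration only: the single-encounter reliability of 0.25 used below is an assumed value, not a figure reported in these studies.

# Illustrative sketch (Python): composite reliability as the number of
# mini-CEX encounters grows, using the Spearman-Brown prophecy formula.
# The single-encounter reliability of 0.25 is an assumed, illustrative value,
# not one taken from the studies cited above.
def spearman_brown(single_encounter_reliability, n_encounters):
    r, k = single_encounter_reliability, n_encounters
    return k * r / (1 + (k - 1) * r)

for k in (4, 8, 12, 14):
    print(k, round(spearman_brown(0.25, k), 2))
# Prints 0.57, 0.73, 0.8, and 0.82 respectively: under this assumed
# per-encounter value, about 12 encounters reach the 0.8 level reported
# in the original studies.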

Evidence on the Utility of the Mini-CEX
Validity
  Good face validity: it involves the observation of a real patient encounter in a real clinical environment.
  Able to differentiate between levels of experience: scores improve over time, and more experienced trainees receive higher ratings.

Evidence on the Utility of the Mini-CEX
Educational Impact
  There is some evidence that the mini-CEX promotes deep learning and encourages self-reflection.
  Its educational effect rests on a significant increase in the number of occasions on which trainees are directly observed with patients and offered feedback on their performance.

Cautions on the Utility of the Mini-CEX
  More complex cases tend to attract higher ratings.
  Faculty ratings are lower than residents' ratings of students.
  The mini-CEX is not intended for use in high-stakes assessments and should not be used to rank or compare candidates.
  There may be a significant halo effect, with high correlations between the scores achieved on individual competencies; care is needed when interpreting the results of a mini-CEX instrument that attempts to assess multiple distinct domains of performance.

Cautions on the Utility of the Mini-CEX
Primary purpose: to provide an opportunity to observe the trainee's clinical skills (which otherwise happens rarely) and to give constructive feedback.
For this to happen effectively, both the assessor and the trainee need to be familiar with the assessment instrument, and the assessor needs to be trained and competent in the procedure or skill being assessed (in order to be able to make a judgement) as well as trained in how to give feedback.
Assessors need to be trained in the use of the mini-CEX assessment method.

Factors Influencing Rater Judgements
Intrinsic Factors
  Gender
  Experience or expertise
  Clinical skills
  Bias
  Rater confidence
Judgement-making Factors
  Conceptualization
  Interpretation
  Attention
  Impressions
(Lee V, et al. Academic Medicine)

Factors Influencing Rater Judgements
External Factors
  Scores varied according to specialty, encounter setting, and factors related to the doctor, the patient, and the consultation; raters also struggled with how to take contextual influences (e.g., consultation complexity, a non-native-speaking patient, presentation of multiple complaints) into account when assessing communication skills.
  Case specificity: higher scores were associated with encounters of increased duration or higher complexity.
  Local rater bias: investigators using a direct-observation tool similar to the mini-CEX for the Neurology Clinical Examination found that local faculty were less likely to fail residents than external faculty.
  The assessor's prior knowledge of or relationship with the resident, and the institutional culture and educational system, all influenced mini-CEX ratings (Kogan and colleagues).
  Rater training alone had conflicting results in improving the inter-rater reliability of faculty ratings.
(Lee V, et al. Academic Medicine)

Factors Influencing Rater Judgements
Scoring Factors
  Scoring integration: the assimilation of the different mini-CEX domains into an overall score.
  Domain differentiation: the ability to distinguish different dimensions of performance across the mini-CEX domains.
(Lee V, et al. Academic Medicine)

Tamara. Family Medicine

Feedback
One of the basic teaching methods used in clinical settings.
A general complaint from medical students and residents is, "I never receive any feedback."
Explanations for this perceived lack of feedback:
  an actual lack of feedback
  students not realizing that they have been getting feedback
  problems with data collection on the feedback received by students
One hypothesis is that clinicians do not appreciate the role of feedback as a fundamental clinical teaching tool and do not recognize the many opportunities for using that tool.

Designing a System of Assessment
  Step 1: Define the content to be assessed and the focus of the assessment
  Step 2: Define the purpose of the assessment
  Step 3: The blueprinting process
  Step 4: Choose or develop methods
  Step 5: Train assessors
  Step 6: Standard setting
  Step 7: Reporting and review system