Principles of assessment in the clinical setting


Principles of assessment in the clinical setting
Adriana Venturini, PT, MSc, Assistant Professor (Professional), ACCE, Physical Therapy Program
Adapted from Valérie Dory, MD, PhD, Undergraduate Medical Education Faculty Development Team, Faculty of Medicine, and from Marion Dove, MD, Beth Cummings, MD, and Laurie Plotnick, MD
McGill University Spring Clinical Day, June 10th, 2015

Assessment is challenging
'You don't want to sort of be the one who sticks the knife in them.'
- Potential negative outcomes for learner and assessor
- Perceived lack of competence
- Challenge of documentation
- Challenge of providing useful feedback and remediation
(Dudek et al. 2005; Cleland et al. 2008; Rees et al. 2009)

Assessment as educational reasoning: Data → Interpretation → Decision
[Slide diagram: observations such as 'history includes key questions about relevant hypotheses', 'assessment organised', 'case presentation coherent', 'good clinical reasoning', 'intervention skilled', and 'notes well organized' are interpreted against standards (end-of-rotation objectives, standards of care, the previous student, 'the way I do things') to reach decisions such as pass, give feedback, or entrust.]

Clinical reasoning: Data → Interpretation → Decision (Nendaz & Perrier 2012)

The "W-5" of assessment: Why do we assess? What do we assess? When do we assess? HoW do we assess? Who assesses? This slide, the W-5 of assessment, frames the content of this plenary.

Epstein (2007) describes three categories of goals for assessment:
- Selection: choose students for positions where there are more candidates than places
- Protect patients: ensure that only competent individuals are allowed to practice
- Optimize learning: before, during, and after the assessment

Optimizing learning
Before the assessment: objectives convey what is important to learn; extrinsic motivation.
During the assessment period: provides an opportunity to practice and hence to increase retention.
After the assessment: provides feedback to students to direct future learning; provides feedback to educators to direct the curriculum; ensures students have met prerequisites to progress in training; identifies students in need of remediation.

Summary: Why assess students?
- Tells students what is important to know
- Allows monitoring of progress
- Enhances learning and retention
- Increases motivation and commitment

When should we assess? If you ask students… '…we are always being assessed…' and '…we never get feedback…'

The Assessment Continuum: formative assessment → summative assessment
This slide highlights the importance of giving feedback to the learner, based on ongoing assessment, or formative assessment.

The Assessment Continuum
Formative assessment: assessment done with the intent of providing ongoing information for individual or program modification.
Summative assessment: assessment done at the end of a course or program to determine whether the individual or program has met a set of predetermined expectations.

What do we assess? Competencies: knowledge, skills, attitudes.
As mentioned earlier, competence includes knowledge, attitudes, and skills, and all three should be included in the assessment process. Teachers often find the assessment of attitudinal problems to be the most challenging. It is important to note that assessment usually drives learning. Thus, if we truly value something (e.g. professional behaviors), we should assess it. The methods required to adequately assess knowledge vs. attitudes vs. technical skills are quite different and must be carefully considered.

What do we assess? Our profession-specific roles: Expert, Communicator, Collaborator, Manager, Health Advocate/Change Agent, Scholar, Professional.
The CanMEDS roles form the framework for assessment in postgraduate medical education at McGill. All but one of the seven reflect knowledge, attitudes, and skills that fall outside the realm of 'medical expert'.

What do we assess? Competencies: Expert
- Knowledge: basic sciences, clinical sciences
- Skills: data gathering, hypothesis generation, assessments, designing a treatment plan
- Attitudes: responsibility in discharging duties

What do we assess? Competencies: Professional
- Knowledge: knowledge of professional norms
- Skills: developing the therapeutic relationship
- Attitudes: respect, altruism

What do we assess? Miller's pyramid: Knows → Knows How → Shows How → Does
Miller's pyramid describes different levels of competence, from the simplest level (the base of the pyramid) to the most complex level (the tip of the pyramid). For example, a learner may be aware (knows) of the importance of good communication skills when conducting an interview with an interpreter, be able to describe (knows how) the principles of how this should be done, demonstrate (shows how) these communication skills in a role play or simulation, and finally communicate according to these principles in the clinical setting (does). As much as possible, we should strive to assess the tip of the pyramid, which is often the most difficult to assess. This requires direct observation 'in the field', and is the most 'authentic' assessment of clinical competence. (Miller, 1990)

How do we assess? Stimulus = task
- Does: real patient(s) (clinical care, technical procedure)
- Shows how: simulation (mannequin, high-fidelity simulation, standardized patient, rater role playing)
- Knows how: clinical scenario (paper/computer-based vignette, video)
- Knows: knowledge question

How do we assess? Response = data collected
- Does: observation (live, video), oral, written (chart review)
- Shows how: simulator records
- Knows how: written (open)
- Knows: written (closed)

Reliability: the consistency with which a method measures what it is designed to measure.
Validity: the degree to which a method measures what it is intended to measure.
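These definitions are not formalized in the original slides, but classical test theory offers one common way to make the distinction precise: each observed score is modelled as a true score plus random error, and reliability is the share of observed-score variance attributable to the true score.

\[ X = T + E, \qquad \operatorname{Cov}(T, E) = 0 \]
\[ \rho_{XX'} = \frac{\sigma_T^{2}}{\sigma_X^{2}} = \frac{\sigma_T^{2}}{\sigma_T^{2} + \sigma_E^{2}} \]

On this view a measure can be highly reliable (small error variance) yet still not valid, because validity asks whether the true score T actually reflects the competence we intend to measure.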

Reliability and validity

To improve the reliability of your assessment program… multiply!
- Multiple methods
- Multiple observations
- Multiple observers
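As a rough illustration of why multiplying observations helps (a standard psychometric result, the Spearman-Brown prophecy formula, not part of the original slides): the predicted reliability of a composite of k comparable observations, each with single-observation reliability ρ₁, is

\[ \rho_k = \frac{k\,\rho_1}{1 + (k - 1)\,\rho_1} \]

For example, if one observed encounter has a reliability of 0.4, combining six comparable encounters raises the predicted reliability to (6)(0.4) / (1 + 5(0.4)) = 0.8.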

Who assesses?
Supervisors/teachers, other health care professionals, self, peers, students, patients, licensing/national bodies.
It is also important to review WHO will be evaluating the learning and to discuss the benefits of multiple stakeholder perspectives.

The Bottom Line
- Know why you are assessing
- Know what you are assessing
- Match your method to your objective
- Be specific
- Involve the learner
- Document

In summary: Assessment should not be a surprise! Assessment is more than the completion of a form. 'Why do we think about assessment last when learners think about it first?' (Weston, 2004)
This slide once again emphasizes the direct link between ongoing feedback (formative assessment) and the final summative evaluation. If there is in fact no link, then true evaluation or assessment has not been done adequately.