Preparing the Future Primary Care Workforce Together


1 Preparing the Future Primary Care Workforce Together
Assessment Through Observation Importance and Challenges Special Thanks to Jennifer Kogan, Lisa Conforti and the Direct Observation Research Team

2 Assessing for the Desired Outcome
Work-based assessment is mostly accomplished through the observations of faculty, team members, peers, and other co-workers.
Does (action): performance in practice, multisource feedback, direct observation
Shows How (performance): standardized patients, simulation
Knows How (competence): diagnostic reasoning with clinical vignettes, chart-stimulated recall
Knows (knowledge): multiple-choice questions

3 In-Training Performance Assessment
Assessment in authentic situations: learners’ ability to combine knowledge, skills, judgments, and attitudes in dealing with realistic problems of professional practice. Assessment in day-to-day practice enables assessment of a range of essential competencies, some of which cannot be validly assessed otherwise. What I am talking about is performance assessment. Although it is a little long, I think this description of performance assessment by Govaerts, published about two years ago, captures in-training assessment well: Govaerts defines performance assessment as assessment in authentic situations (read slide). This is becoming even more important as licensing organizations such as the ABIM continue to focus on competency assessment and milestones. Govaerts MJB et al. Adv Health Sci Educ. 2007;12:239-60

4 Who Watched You? Being Observed How did it feel? Was it useful?
Have you done it? Have 3x5 cards on the tables. On one side, write about being observed; on the other side, write about being the observer. 2-3 minutes to write, 5 minutes to share. Themes from the tables.

5 Theories supporting the importance of direct observation
Development of expertise Role in outcomes (competency) based medical education Necessity in supervision

6 Clinical Skills Do Matter
History and exam: make the diagnosis more than 80% of the time, even in the era of technology; required to avoid unnecessary testing.
Patient-centered communication is associated with: increased patient knowledge and self-efficacy, increased adherence and well-being, improved outcomes, decreased costs.

7 How Do People Become Experts?
Deliberate practice: working on well-defined tasks, informative feedback, repetition, self-reflection, motivation, endurance. Ericsson KA et al. Psychol Rev.

8 Self-Assessment Individually generated summary judgment of one’s own skill level. Inaccurate: poor performers overestimate; outstanding performers underestimate. Davis D et al. JAMA. 2006;296. Eva KW et al. Acad Med. 2005;80:S46-54

9 Expert Performance vs. Everyday Skills
What this picture illustrates is the qualitative difference between the course of improvement of expert performance and of everyday activities. In the initial phase of learning, the cognitive phase, novices try to understand the activity and concentrate on avoiding mistakes (example). This is where you are concentrating on what you are doing. In the middle phase of learning, the “associative” phase, performance is smoother. You don’t need to concentrate as hard to perform at an acceptable level, and salient mistakes become increasingly rare. Finally, after a period of training and experience, individuals are able to perform the tasks automatically and with a minimal amount of effort. However, there is a consequence to this automation. When automaticity occurs, people lose conscious control over execution of the skill and can no longer make specific intentional adjustments or modifications. When this automated phase of learning has been attained, performance reaches a stable plateau with no further improvements. Expert performers are able to counteract this automaticity: they remain in the cognitive or associative phase of learning. They accomplish this by deliberately constructing and seeking out training situations in which the desired goal exceeds their current level of performance. This means they continue to practice. That is part of our job: to help trainees construct and seek out training situations that can further their skill set. But it is not just any type of practice; people have studied the way an expert practices versus someone who is just very good. Ericsson KA. Acad Med. 2004

10 The Role of the Coach “They observe, they judge, and they guide”
“That one twenty-minute discussion gave me more to consider and work on than I’d had in the past five years” “Medical practice is largely unseen by anyone who might raise one’s sights. I’d had no outside ears and eyes.” Atul Gawande, New Yorker 10/3/2011

11 Observation and Safe Patient Care
Importance of appropriate supervision. Entrustment:
Trainee performance* × Appropriate level of supervision** = Safe, effective, patient-centered care
*A function of the trainee’s level of competence in context
**A function of the attending’s competence in context

12 Your Supervision in Clinic
How do you usually supervise? When do you supervise more closely? How do you change your supervision to ensure patients get safe, effective, patient-centered care? What did you learn from observing that will change how you supervise going forward? REMEMBER: SUPERVISION IS ALSO FOR FEEDBACK. Q1: routine supervision. Q2: when do you supervise more closely — time of year, trainee skill, patient complexity, others?

13 Direct Observation - Purposes
Patient safety, assessment, coaching. Highlight that there is a tension between these roles.

14 Problems with Performance Assessment
Poor accuracy Focus on different aspects of clinical performance Differing expectations about levels of acceptable clinical performance Rating errors Halo effect/“horn” effect Leniency/stringency effect Central tendency So what is going on here? First, unstructured forms have low accuracy. Accuracy means that the scores reflect what is being observed and nothing else. Gordon Noel in 1992 showed that the accuracy of faculty when watching resident performance was poor: only 30% of strengths and weaknesses were identified. However, this accuracy increased to 60% when the faculty were given a structured form that helped direct their attention to specific dimensions of performance. Inter-rater agreement is based on having different raters evaluate the same performance. Why has this been so poor? First, raters are likely focusing on different aspects of clinical performance. For example, one person may focus heavily on the questions asked and the physical exam maneuvers done, whereas another may focus more on the process of the encounter, such as how questions were asked and whether there was empathy. The other reason inter-rater agreement is low is that raters have different expectations about what constitutes acceptable clinical performance. Halo effect: a cognitive error in which a rater generalizes a positive trait to all traits and pays less attention to the negatives.

15 Factors Influencing Faculty Ratings
Minimal impact of demographics Age, gender, clinical and teaching experience Faculty’s own clinical skills may matter Faculty with higher history and patient satisfaction performance scores provide more stringent ratings. Kogan JR. et al. Acad Med. 2010;85(10 Suppl):S25-8

16 Factors Influencing Faculty Ratings
Different frameworks for judgments/ratings: self-as-reference (predominant), trainee level, absolute standard, practicing physicians. Idiosyncrasy. Kogan JR et al. Med Educ. Yeates P et al. Adv Health Sci Educ. In press. Govaerts MJB et al. Adv Health Sci Educ.

17 Faculty OSCE Clinical Skills
Competency              Mean (SD)       Range        Generalizability
History taking          65.5% (9.6%)    34%-79%      0.80
Physical exam           78.9% (13.6%)   36%-100%     0.52
Counseling              77.1% (7.8%)    60%-93%      0.33
Patient satisfaction¹   5.62 (0.48)     4.43-6.63    0.60
¹On a 7-point scale. N=44. Kogan JR et al. Acad Med. 2010;85(10 Suppl):S25-8

18 Other Factors Influencing Ratings
Contextual factors Encounter complexity Resident characteristics Institutional culture Emotions surrounding constructive feedback Inference Factors outside resident performance

19 Definition of Inference
a. The act or process of deriving logical conclusions from premises known or assumed to be true. b. The act of reasoning from factual knowledge or evidence.

20 Types of Inference about Residents
Skills Knowledge Competence Work-ethic Prior experiences Familiarity with scenario Feelings Comfort Confidence Intentions Ownership Personality Culture

21 The Problem with Inference
Inferences are not recognized Inferences are rarely validated for accuracy Inferences can be wrong

22 Direct Observation: A Conceptual Model
Kogan JR, et al. Med Educ. 2011

23 Achieving Accurate, Reliable Ratings
Form not the magic bullet Assessment requires faculty training Similar frameworks Agreed-upon levels of competence Move to criterion-referenced assessment The first important fact to recognize is that the assessment form itself is not the magic bullet. By that I mean that we are all tempted to spend our time making what we think are better instruments, with the assumption that this will lead to more accurate and reliable assessments. But you have to remember that forms do not fill themselves out. And as we have seen, reliable and accurate assessments require faculty training. It is only with training that we will get our core faculty to approach observation with similar frameworks and agreed-upon levels of competence. An important caveat: as we think about faculty development, the best models for how to do this are not really known. A few studies have looked at more intensive, one-time faculty development, for example the study done by Eric Holmboe a few years ago; however, shorter faculty development does not seem to help. If we were to make an analogy to IV fluids, some of the current thinking is that faculty development in assessment probably needs to occur as a slow drip, with multiple brief instruction sessions over time, as opposed to a one-time bolus.

24 Video Scenario Watch the following clinical encounter between an internal medicine resident and a patient. Task 1: Complete the questions on the rating form first. Task 2: Provide ratings on the mini-CEX form.

25 Observations and Ratings
As is typical, there is a wide range of overall ratings — even from program directors who have thought a lot about evaluation and assessment. You are all over the map. Now, some would argue this is acceptable in assessment because if we have enough ratings, we can overcome our poor inter-rater reliability; that is, with enough assessments of a trainee, we can get to a reliable, reproducible rating. But, as you all know, that is not particularly feasible. Second, it leaves the resident confused, trying to sort out how they are doing and which direction they need to go. I am sure you are all familiar with trainees’ frustration with the variability in our assessments of them. A great example is students’ frustration learning oral case presentation skills: every attending has different standards and expectations. We do not have enough time here to actually do direct observation training. What we will do instead is talk about what is going to be needed to train your core faculty in assessment.

26 Performance Dimension Training
Well, the reality is that if we do not train faculty in observation and assessment, we do not give them the framework needed to make accurate, reliable, and valid assessments. That is, they do not know, when assessing, what specific clinical skills, attitudes, knowledge, or behaviors they are looking for or should be paying attention to.

27 Performance Dimension Training
Identify specific dimensions of a competency in behavioral terms Discuss the criteria and qualifications required for each dimension of that competency Develop a SHARED MENTAL MODEL Achieve evidence-based standardization and calibration Holmboe ES. ABIM. 2010

28 Performance Dimension Training
Identify important components of information transfer (counseling about the assessment and plan) and starting a medication. What should be discussed or done? How should it be discussed or done? Make certain that components are described behaviorally.

29 Performance Dimension Exercise
Review frameworks for information transfer (counseling about the assessment and plan/starting a medication) for any additions. SEGUE: Makoul GT. 1993/1999. Structured Clinical Observation (SCO): Lane JL, Gottlieb RP. Pediatrics. 2000;105:973-7. Informed Decision Making: Braddock CH, Edwards KA, Hasenberg NM, Laidley TL, Levinson W. JAMA. 1999;282. Note: break into smaller groups of 4.

30 Apply Your Framework to Scenario
What did the resident do well? What are the errors/deficiencies?

31 Monitoring for Inference
TIP 1: Ask Is this the "right" conclusion? Why am I making these interpretations? Is this really based on all the facts? TIP 2: Reflect on your reasoning Do you tend to make assumptions too easily? Do you tend to select only part of the data? Note your tendencies so that you can learn to do that stage of reasoning with extra care in the future.

32 Synthesis to Judgment Goal: Improve the quality and accuracy of the educational “judgment” using a compare-and-contrast process

33 Steps: Synthesis to Judgment
Review vignettes of different performance levels Judge using behaviorally based frameworks (e.g., an evidence-based frame of reference) Trainer provides feedback on assessment accuracy Discuss discrepancies between the scripted performance and participants’ assessments

34 What Was the Basis of Your Judgment?
Why did you give this rating? What influenced your rating?

35 Independent Work Check off the elements of your framework that constitute competence for patient-centered, safe, effective care. MOVING TO A CRITERION-BASED FRAME OF REFERENCE (NOT NORMATIVE) Note: implications for assessment, coaching, and supervision

36 Group Discussion What are the elements? Apply to scenario
What are the implications of this approach? Note: implications for assessment, coaching, and supervision

37 Observation and Safe Patient Care
Importance of appropriate supervision. Entrustment:
Trainee performance* × Appropriate level of supervision** = Safe, effective, patient-centered care
*A function of the trainee’s level of competence in context
**A function of the attending’s competence in context

