Benchmarking Clinicians
Farrokh Alemi, Ph.D.

Why should it be done?
- Hiring, promotion, and management decisions
- Helping clinicians improve

Is benchmarking an intrusion in the clinician's practice?
- Managers need to understand patient outcomes
- Practice profiles are constructed after the fact, when the patient is gone
- Both patients and managers can use benchmarked data
- Poor clinicians are bad for the patient as well as for the organization

How should the analysis be done? In benchmarking, a clinician's performance is compared to an expected value. The expected value can be set in several ways:
- Compare the clinician to the average peer
- Compare the clinician to the average peer taking care of the same kinds of patients
- Compare the clinician to the prognosis expected on admission
- Compare the clinician and peers on patients matched on selected features

Compare Clinician to Average Peer
- Calculate the average and standard deviation of the outcome for the clinician and for peer providers
- Compare the two averages with a two-sample test of hypothesis, allowing for unequal variances
- Problem: this may be misleading because providers see different kinds of patients; a clinician with more severely ill patients will naturally have worse outcomes
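The peer comparison above amounts to a two-sample t-test. A minimal sketch, using Welch's t statistic (which allows unequal variances) and hypothetical monthly complication rates invented for illustration:

```python
from statistics import mean, stdev

def welch_t(x, y):
    """Welch's t statistic: compares two sample means without
    assuming the two groups have equal variances."""
    vx, vy = stdev(x) ** 2, stdev(y) ** 2
    se = (vx / len(x) + vy / len(y)) ** 0.5
    return (mean(x) - mean(y)) / se

# Hypothetical monthly complication rates (not real data)
clinician = [0.12, 0.15, 0.10, 0.14, 0.13, 0.11]
peers = [0.09, 0.08, 0.10, 0.07, 0.09, 0.08]

t = welch_t(clinician, peers)  # positive t: clinician's rates run higher
```

A large |t| flags a clinician whose average outcome differs from peers, but, as the slide notes, the difference may reflect case mix rather than skill.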

Example Data
123 internal medicine residents at the New York-Presbyterian Hospital in New York City. The outcomes examined included:
- Patient satisfaction, measured by telephone interviews with at least 10 patients
- Disease-management profiles for an average of 7 diabetic and 11 hypertensive patients: the patient's condition and the frequency of use of various medications
- Faculty evaluations on seven dimensions: history taking, physical examination, differential diagnosis, diagnostic and/or treatment plan, health care maintenance, acting compassionately, and being a team player

Sample Report

Compare Clinician to Average Peer Caring for Same Kinds of Patients

Example Data

Comparing Clinicians to Expected Prognosis at Admission
- Assess each patient's severity of illness
- Predict the prognosis from the severity assessment
- Calculate a pair-wise student-t comparison of observed and expected values
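The observed-versus-expected comparison can be sketched as a paired t statistic on per-patient differences. The numbers below are hypothetical lengths of stay (in days) invented for illustration:

```python
from statistics import mean, stdev

# Hypothetical per-patient data (not real): observed outcome vs. the
# prognosis predicted from severity at admission, e.g. length of stay.
observed = [5.0, 7.0, 4.0, 6.0, 8.0, 5.5, 6.5, 7.5]
expected = [4.5, 6.0, 4.5, 5.5, 7.0, 5.0, 6.0, 6.5]

def paired_t(obs, exp):
    """Paired t statistic on per-patient (observed - expected) differences."""
    d = [o - e for o, e in zip(obs, exp)]
    return mean(d) / (stdev(d) / len(d) ** 0.5)

t = paired_t(observed, expected)  # positive t: stays longer than predicted
```

Because each patient serves as his or her own control, this comparison adjusts for case mix to the extent that the severity model predicts prognosis well.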

Example Data

Comparing Clinicians When Patients' Severity of Illness Is Not Known

Example Data

Event Tree for the Clinician: the Patient Mix Is Kept, the Outcomes Change
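One way to read the event-tree idea of keeping the clinician's patient mix while substituting outcomes is direct standardization. A sketch with hypothetical severity strata and rates, invented for illustration:

```python
# Hypothetical severity strata (not real data). The clinician's own
# patient mix is held fixed; peers' stratum-specific adverse-event
# rates are applied to it, giving the outcome those same patients
# would be expected to have under peer care (direct standardization).
clinician_mix = {"mild": 10, "moderate": 25, "severe": 15}     # patient counts
peer_rates = {"mild": 0.05, "moderate": 0.12, "severe": 0.30}  # event rates

n = sum(clinician_mix.values())
expected_rate = sum(clinician_mix[s] / n * peer_rates[s] for s in clinician_mix)

observed_rate = 9 / n  # clinician's own adverse events among the same patients
# observed_rate (0.18) vs expected_rate (0.16): slightly worse than peers
```

Because both rates refer to the same patients, the comparison is not distorted by the clinician having a sicker panel than peers.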

Is it reasonable to benchmark clinicians? Common objections:
- Measurement distorts goals
- Measurement leads to defensive behavior
- No adequate measure of severity may be available
- Too much time is spent on measurement and too little on improvement

How Should Benchmarked Data Be Presented? Before the meeting:
- Schedule a feedback time and date as soon as possible
- Check your data to make sure there are no errors
- Add text, charts, or graphics; supplement numeric data with anecdotal information and the customer's voice (e.g., a short audio clip from a patient)
- Distribute handouts to participants ahead of the meeting

How Should Benchmarked Data Be Presented? At the meeting:
- Make it clear that the evaluation is confidential
- Briefly introduce the purpose of the session
- Acknowledge the limitations of the practice-profiling method
- Present the data, not the conclusions
- Explicitly ask for the clinician's evaluation of the data after each section of the report is presented
- DO NOT defend the practice-profiling method, the benchmarking effort, or any aspect of your work
- Thank the clinicians for their time and describe next steps

How Should Benchmarked Data Be Presented? After the meeting:
- Summarize the comments and append them to the report
- Describe the resources available
- Send a written report to each clinician
- Ask the clinicians: What worked well and what needs improvement? Do they plan to change their practice, and in what way? Was it worthwhile?
- Set the time of the next benchmarking report

Take-Home Lesson
Expected outcomes can be benchmarked using the severity of patients' illness.