Measuring the Patient Experience at the Physician Group and Doctor Levels: On the Ground in California. Cheryl Damberg, Ph.D., Senior Researcher, RAND, January 27, 2011.

Presentation transcript:

Measuring the Patient Experience at the Physician Group and Doctor Levels: On the Ground in California Cheryl Damberg, Ph.D. Senior Researcher, RAND January 27, 2011 CHCF Conference: Transforming Healthcare Through the Patient Experience “The Science of Measurement—Collecting Meaningful and Actionable Data”

2 1/27/2011 Background on Patient Experience Surveying in California
– In 1996 and 1998, the Pacific Business Group on Health (PBGH) launched a patient experience survey in 58 medical groups in California
  – Also measured 2-year changes in patient functional status (SF-12); found that better quality of care resulted in slower functional decline*
– Redundant and conflicting survey efforts by the health plans led to a consolidated effort to use a common patient experience survey to assess medical group performance
*KL Kahn, DM Tisnado, JL Adams, H Liu, W Chen, F Hu, CM Mangione, RD Hays, and CL Damberg. "Does Ambulatory Process of Care Predict Health-Related Quality of Life Outcomes for Patients with Chronic Disease?" Health Services Research 42:1, Part I (February 2007).

3 1/27/2011 Patient Experience Measurement: Current Landscape in California
[Diagram: the Patient Assessment Survey (PAS) is fielded as a group-level survey and as doctor-level surveys]
– Group-level PAS results feed:
  – Transparency: Office of the Patient Advocate consumer website
  – Pay-for-performance: IHA P4P program
  – Quality improvement: physician groups and the California Quality Collaborative (CQC)
– Doctor-level PAS results feed pay-for-performance and medical groups' internal incentives

4 1/27/2011 Surveying the Patient Experience at the Physician Group Level
– Since 2001, the California Cooperative Healthcare Reporting Initiative (CCHRI) has fielded an annual patient experience survey assessing care in ~140 medical groups statewide
  – Results are publicly reported
– Uses the Patient Assessment Survey (PAS), a derivative of the Clinician & Group CAHPS (CG-CAHPS) survey
– 2010 survey:
  – Commercially insured HMO/POS enrollees, ages 18 and older
  – Randomly sampled 900 adult patients per group (n=450 for PCPs, n=450 for specialists)
  – 139 medical groups (179 reporting units)
  – 37.2% overall response rate

5 1/27/2011 Key Findings for 2010 PAS
– Performance gains
  – More groups highly rated on the Office of the Patient Advocate (OPA) public report card
  – Steady, small score increases over 5 years
  – Lowest-quartile groups improved
– Performance variation
  – 5-10 point score range across groups
  – Lowest-scoring groups are closing the gap
  – Variation at the practice-site and MD levels
– Performance gaps and opportunities
  – Access and chronic care self-management
  – Higher performance outside CA

6 1/27/2011 Medical Group Performance Advances
Public Reporting: Office of the Patient Advocate Ratings, 2010 vs. 2006

Performance Rating    % Medical Groups, 2010    % Medical Groups, 2006
Excellent             15%                       1%
Good                  66%                       59%
Fair                  19%                       39%
Poor                  0%                        1%

7 1/27/2011 Steady Small Gains in Statewide Average Performance
[Chart: trend lines with labeled gains of +1.6, +2.5, +4.1, and +2.1 points; the composites they attach to did not survive in this transcript]

8 1/27/2011 Lowest-Scoring Groups See Larger Score Gains than Highest-Scoring Groups over 5 Years
[Chart: underlying data did not survive in this transcript]

9 1/27/2011 Performance Variation Across Medical Groups, 2010
[Table: mean, 10th-percentile, and 90th-percentile scores across groups for Patient-Doctor Interactions, Office Staff, Coordinated Care, Patient Access, Chronic Care, and Health Promotion; the scores themselves did not survive in this transcript]

10 1/27/2011 Why Focus on the Doctor Level? Variation at Group, Practice-Site, and Physician Levels
One medical group example: patient access (mean scores)

Medical Group "A" overall access score: 70

Practice sites:
  PED Site 1: 80   Spec Site 2: 73   PCP Site 3: 72   PCP Site 4: 71
  PCP Site 5: 67   PCP Site 6: 67   PED Site 7: 65   OBG Site 8: 64

Top 5 MDs: MD1: 93, MD2: 88, MD3: 88, MD4: 88, MD5: 88
Low 5 MDs: MD6: 53, MD7: 52, MD8: 50, MD9: 48, MD10: 42

11 1/27/2011 Desire to Close the Gap Between CA and Other CG-CAHPS Sites (Mean Scores)
[Table: mean scores for US National '05/06, NY '06 Med Center, Mass '07, and CA '10 on eleven items: needed care soon enough; regular appointment soon enough; wait less than 15 minutes; regular-hours call-back; after-hours care met needs; doctor explains things; doctor listens carefully; instructions for symptoms; doctor knows medical history; doctor spends enough time; doctor shows respect; the scores themselves did not survive in this transcript]
Note: Adjustment is limited to what was common across these datasets (age, sex, race, general health), but not chronic conditions.

12 1/27/2011 2010 Doctor-Level Survey
– Sponsored by CCHRI as an extension of the group-level Patient Assessment Survey (PAS)
  – Designed to help groups reduce variation and improve scores on statewide P4P measures
– 29 physician groups participated
  – Survey was mailed to 158,000 patients
  – Response rate: 35% (over 56,000 patients)
  – ~35 or more patients of each physician completed the survey
– Survey modes: 2 mailings (or completion online through a web link)
– Results were confidential (used internally by the medical groups and their physicians)

13 1/27/2011 Sample Report Card to Doctors

14 1/27/2011 Sample Physician Report Card

15 1/27/2011 The 21st Century: Sites Displaying Doctor Ratings Continue to Grow and Evolve

16 1/27/2011 Methods Issues to Consider
– Psychometric properties of measures
– Reliability
– Validity
– Risk of misclassification

17 1/27/2011 Psychometric Properties: 2010 Doctor Survey
Adult PCP sample: N(patients) = 44,519; N(MDs) = 1,111
[Table: for each composite (Quality of MD-Patient Interactions, 6 items; Health Promotion, 2 items; Access, 5 items; Coordination of Care, 2 items; Chronic Care, 5 items; Office Staff, 2 items), columns report the range of item-scale correlations, % ceiling, estimated MD-level reliability at n=35, the minimum N required to achieve 0.70 MD-level reliability, and the 25th-75th range of means across MDs; the values did not survive in this transcript]
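The item-scale correlations and ceiling percentages in a table like this can be computed directly from respondent-level data. A minimal sketch in Python, with a hypothetical file name and item columns (the actual PAS item names are not shown in the transcript):

```python
import pandas as pd

# Hypothetical respondent-level file: one row per completed survey,
# columns q1..q6 holding the six MD-Patient Interaction items.
df = pd.read_csv("doctor_survey_responses.csv")
items = ["q1", "q2", "q3", "q4", "q5", "q6"]

scale_total = df[items].sum(axis=1)
for item in items:
    # Corrected item-scale correlation: correlate each item with the sum
    # of the remaining items, so the item is not correlated with itself.
    r = df[item].corr(scale_total - df[item])
    # Percent ceiling: share of respondents choosing the most positive
    # response category (approximated here by the maximum observed value).
    pct_ceiling = (df[item] == df[item].max()).mean() * 100
    print(f"{item}: item-scale r = {r:.2f}, ceiling = {pct_ceiling:.1f}%")
```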

18 1/27/2011 Methods Issue: Reliability*
– A statistical concept describing how confidently one can distinguish the performance of one provider from another
– Measured as the ratio of the "signal" to the "noise" (see the formula sketch below)
  – The between-provider variation in performance is the "signal"
  – The within-provider measurement error is the "noise"
– Measured on a 0.0 to 1.0 scale
  – 0.0 = all variability is due to noise or measurement error
  – 1.0 = all variability is due to real differences in performance
*Source: Adams JL, The Reliability of Provider Profiling: A Tutorial, Santa Monica, Calif.: RAND Corporation, TR-653-NCQA. As of June 8, 2010:
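In formula terms, this signal-to-noise ratio for a provider measured with n respondents is commonly written as reliability = between-provider variance / (between-provider variance + within-provider variance / n). A minimal sketch with invented variance values (not PAS estimates):

```python
def provider_reliability(var_between: float, var_within: float, n: int) -> float:
    """Signal-to-noise reliability of a provider's mean score.

    var_between: variance of true provider means (the "signal").
    var_within:  respondent-level variance within a provider (the "noise");
                 averaging over n respondents shrinks it to var_within / n.
    """
    return var_between / (var_between + var_within / n)

# Invented values for illustration only:
print(provider_reliability(var_between=4.0, var_within=60.0, n=35))  # ~0.70
```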

19 1/27/2011 Between-Provider Performance Variation
[Figure: dot plots of each provider's average performance; higher between-provider variation makes it easier to tell who is best, lower between-provider variation makes it harder]

20 1/27/2011 Different Levels of Measurement Error (Uncertainty About the "True" Average Performance)
[Figure: each provider's average performance shown with a range of uncertainty about the "true" average; higher measurement error makes it harder to tell who is best, lower measurement error makes it easier. The simulation below illustrates the same point.]
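To make "harder to tell who is best" concrete, here is an illustrative simulation (invented variance values, not PAS data) showing how sampling noise around each provider's true mean can drop a truly top-half provider below the observed median:

```python
import random

random.seed(0)
VAR_BETWEEN, VAR_WITHIN, N_PATIENTS = 4.0, 60.0, 35  # invented values

# True provider means, sorted so the top half is the second half of the list.
n_providers = 200
true_means = sorted(random.gauss(75, VAR_BETWEEN ** 0.5) for _ in range(n_providers))
# Observed means: true mean plus sampling noise with variance var_within / n.
observed = [m + random.gauss(0, (VAR_WITHIN / N_PATIENTS) ** 0.5) for m in true_means]

median_obs = sorted(observed)[n_providers // 2]
misranked = sum(obs < median_obs for obs in observed[n_providers // 2:])
print(f"Truly top-half providers ranked in the observed bottom half: {misranked}")
```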

21 1/27/2011 Link Between Reliability* and Risk of Misclassification
– Higher reliability in a measure:
  – Means more signal, less noise
  – Reduces the likelihood of classifying a provider in the "wrong" category
– Reliability is a function of:
  – Provider-to-provider variation
  – Sample size (see the sketch below for the implied minimum sample size)
– Per Adams*: "Reliability ASSUMES validity"
*Adams JL, The Reliability of Provider Profiling: A Tutorial, Santa Monica, Calif.: RAND Corporation, TR-653-NCQA. As of June 8, 2010:
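Because reliability depends only on the variance ratio and the sample size, the minimum number of respondents per physician needed to hit a target reliability (such as the 0.70 threshold in the psychometrics table above) can be solved in closed form. A sketch under the same invented variance assumptions as above:

```python
import math

def min_n_for_reliability(var_between: float, var_within: float,
                          target: float = 0.70) -> int:
    # Solve target = var_between / (var_between + var_within / n) for n:
    #   n = (target / (1 - target)) * (var_within / var_between)
    return math.ceil((target / (1 - target)) * (var_within / var_between))

# Invented values, chosen so that n = 35 yields reliability ~0.70:
print(min_n_for_reliability(var_between=4.0, var_within=60.0))  # 35
```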

22 1/27/2011 Validity
– Validity: the extent to which the performance information means what it is supposed to mean, rather than something else
  – Ask yourself: does the measure measure what it claims to measure?
– Consider whether these threats to validity exist:
  – Is the measure controllable by the provider?
  – Does patient behavior affect the measure?
  – Is the measure affected by differences in the patients being treated (case mix)?
  – Is the measure driven by factors other than the provider?

23 1/27/2011 Good Resource for Provider Measurement and Reporting
– AHRQ report: Friedberg MW, Damberg CL. Methodological Considerations in Generating Provider Performance Scores for Use in Public Reporting by Chartered Value Exchanges. Rockville, MD: Agency for Healthcare Research and Quality.
– To obtain a copy of the white paper, please e-mail [address omitted in transcript] and she will put you on the distribution list