Welcome to the NQC TA Call on The Basics of Performance Measurement for Quality Improvement, May 8, 2008. Nanette Brey Magnani, EdD, NQC Consultant; Genevive Meredith, Maine Part B; Jack Rustico, Connecticut Children, Youth and Family Network; Hollie Malamud-Price, Michigan Part B & D

Linking Performance Measurement and Quality Improvement Infrastructure

How to Go in Circles [cycle diagram: Measure and Change linked in a repeating QI loop]

Trends in QM: from monitoring (QA) to improvement projects (QM); from QA by administrators to QM by teams; from core medical indicators to an expanded scope of process indicators; from 100% goals to goals set by benchmarking (internal, external); from data collected by hand to data collected by computer; from process indicators to outcome indicators; from accountability to consumers to inclusion of consumers; from program-level to regional QM

Example: Maine Part B Program. A participant in the NQC/HAB Low Incidence Initiative, the program realized the difference between Quality Assurance and Quality Management and developed tools to feed data back to agencies for QI, but saw little agency response. QM is now part of yearly subgrantee contracts, which include QI projects/PDSA cycles.

Basics of Performance Measurement Why measure? What to measure? When to measure? How to measure? Strategic planning for measurement

Why Measure?

Reasons to Measure Separates what you think is happening from what really is happening Establishes a baseline: It’s ok to start out with low scores! Determines whether changes actually lead to improvements Avoids slippage

Reasons to Measure (cont.) Ongoing / periodic monitoring identifies problems as they emerge Measurement allows for comparison across sites, programs, EMAs, TGAs and states and across years The Ryan White Treatment Modernization Act of 2006 mandates performance measurement The HIV/AIDS Bureau places strong emphasis on quality management

What to Measure

What is a Quality Indicator? A quality of care indicator is an aspect of patient care that is measured to evaluate the extent to which a facility provides or achieves a particular element of care. Indicators are generally based on specific standards of care derived from guidelines issued by a professional society and/or government agency.

Process Indicator Topic Areas: medical processes; case management processes; clinic/agency/EMA/state processes; patient utilization of care (underutilization, overutilization, misutilization); state, EMA, TGA common processes; coordination of care processes

Example: Clinical HAB Core Measures (http://hab.hrsa.gov): % of clients with HIV infection who had 2 or more CD4 T-cell counts performed in the measurement year; % of clients with AIDS who are prescribed HAART; % of clients with HIV infection who had 2 or more medical visits in an HIV care setting in the measurement year; % of clients with HIV infection and a CD4 T-cell count below 200 cells/mm3 who were prescribed PCP prophylaxis; % of women with HIV infection who are prescribed ARV therapy
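
Each of these measures is a simple numerator/denominator computation over a client-level data set. Below is a minimal sketch, not from the presentation, of how the first measure (2 or more CD4 counts in the measurement year) could be computed; the record layout and field names (client_id, hiv_positive, cd4_dates) are hypothetical.

```python
from datetime import date

# Hypothetical client-level records; in practice these would come from
# chart reviews or a client-level database such as CAREWare.
clients = [
    {"client_id": 1, "hiv_positive": True,
     "cd4_dates": [date(2007, 2, 1), date(2007, 9, 15)]},
    {"client_id": 2, "hiv_positive": True,
     "cd4_dates": [date(2007, 5, 3)]},
    {"client_id": 3, "hiv_positive": False, "cd4_dates": []},
]

year_start, year_end = date(2007, 1, 1), date(2007, 12, 31)

# Denominator: clients with HIV infection (the measurement population).
denominator = [c for c in clients if c["hiv_positive"]]

# Numerator: those with 2+ CD4 counts performed in the measurement year.
numerator = [
    c for c in denominator
    if sum(year_start <= d <= year_end for d in c["cd4_dates"]) >= 2
]

rate = 100 * len(numerator) / len(denominator)
print(f"CD4 monitoring: {rate:.0f}% ({len(numerator)}/{len(denominator)})")
```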

Example: HIVQUAL Measures (www.hivqual.org): clinical visits; HIV specialist care; ARV therapy management; adherence assessment; HIV monitoring; lipid screening; gynecology care; STD management; hepatitis C screening; mental health; prevention education; health literacy screening (pilot); baseline resistance testing (pilot)

Example: Coordination of Care – Michigan Department of Community Health, Part D. Initial chart reviews establish baseline data. Referrals are documented in care plans and between agencies, using the referral field in CAREWare. Case management: documentation of any referral. Medical: colposcopy, dental, and ophthalmology (new for Part D, Part C). Performance measure: clients with identified needs will have documentation of referrals, with a goal of 75% of charts documenting referrals. Rate = number of charts documenting referrals / number of clients with documented needs.
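
As a concrete illustration (again hypothetical, not Michigan's actual tooling), the measure above reduces to a filter and a ratio checked against the 75% goal:

```python
# Illustrative sketch of the referral-documentation measure; every
# field name here is a hypothetical stand-in for chart-review data.
charts = [
    {"client_id": 101, "documented_need": True,  "referral_documented": True},
    {"client_id": 102, "documented_need": True,  "referral_documented": False},
    {"client_id": 103, "documented_need": False, "referral_documented": False},
    {"client_id": 104, "documented_need": True,  "referral_documented": True},
]

GOAL = 0.75  # 75% of clients with identified needs should have a documented referral

with_needs = [c for c in charts if c["documented_need"]]              # denominator
with_referrals = [c for c in with_needs if c["referral_documented"]]  # numerator

rate = len(with_referrals) / len(with_needs)
status = "meets" if rate >= GOAL else "is below"
print(f"Referral documentation: {rate:.0%} "
      f"({len(with_referrals)}/{len(with_needs)}); {status} the {GOAL:.0%} goal")
```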

Outcome Indicator Topic Areas: patient health status; intermediate outcomes such as immune and virologic status; survival; symptoms; disease progression; disability; subjective health status; hospital and ER visits

Example: Outcomes – Achieving Undetectable Viral Loads. Snapshot of antiretroviral treatment and success at the Grand Junction Western Slope HIV Collaborative Clinic, from a chart review on 4/23/08: 137 GJ patients had at least one medical visit in the period 7/1/07–12/31/07; 113 (82%) of these patients were on ART; of the 113, 97 (86%) had an HIV VL < 400 and 77 (68%) had a VL < 50; 75% of the patients not on ART had either a CD4 > 350 or had declined therapy. Source: Merilou Johnson and Lucy Graham

Example: Maine Part B Program – Choosing Process and Outcome Measures. Every agency participating in reporting data is at the table, as well as consumer representatives. The group first agreed on the outcomes of medical case management, then determined the outputs and the processes to achieve those outputs.

Example cont'd: Maine Part B. Outcome of medical case management: improved quality of life. Outputs: achievement of short-term, client-defined goals. Processes: a comprehensive annual assessment and quarterly care plans; each care plan needs a minimum of two client-defined goals as the focus of intervention during that quarter.

Example: Maine Statewide All-Parts Indicators (Outcomes and Process). Receives adequate medical monitoring: 2 or more medical visits/year; 2 or more CD4 counts/year; 2 or more VLs/year. Receives adequate medical care: clients with CD4 < 200 receive HAART; pregnant women receive ARV therapy; clients with CD4 < 200 receive PCP prophylaxis. Comprehensive care: case management measures.

What is a Good Indicator? Relevance: how important is the indicator? Does it affect a lot of people or programs? Does it have a great impact on the programs or patients/clients in your EMA, TGA, or state? Measurability: can the indicator realistically and efficiently be measured given finite resources?

What is a Good Indicator? (cont'd) Accuracy: is the indicator based on accepted guidelines, or developed through formal group decision-making methods? Improvability: can the performance rate associated with the indicator realistically be improved, given the limitations of services and population?

Specify criteria to define your measurement population. Location: all sites, or only some? Gender: men, women, or both? Age: any limits? Client conditions: all HIV-infected clients, or only those with a specific diagnosis? Treatment status? To start, you need to define your eligibility criteria for the measurement population. The measurement population consists of those patients who are eligible for measurement based on pre-established criteria; defining a population requires identifying both which records should be reviewed and which should not. The key point is to select the focus of your data collection efforts. Consider the following criteria:
- Location: what facilities within the care system will be included?
- Gender: does the indicator apply exclusively to men or to women, or to both?
- Age: are there particular age limits?
- Patient condition: is a confirmed diagnosis required, or simply symptoms or signs? Do certain conditions make the patient ineligible?
- Active treatment status: how many visits are required for eligibility? Must the patient currently be in treatment? Must the treatment have occurred within a certain time frame?
When you have addressed these questions, you will have a list of eligibility criteria.
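
Eligibility criteria like these translate directly into a record filter. The sketch below is purely illustrative (not from the presentation); the records and field names (site, age, diagnosis_confirmed, visits_in_period) are hypothetical.

```python
# Illustrative sketch: applying pre-established eligibility criteria
# to define a measurement population. All field names are hypothetical.
records = [
    {"id": 1, "site": "Clinic A", "age": 34,
     "diagnosis_confirmed": True, "visits_in_period": 3},
    {"id": 2, "site": "Clinic B", "age": 17,
     "diagnosis_confirmed": True, "visits_in_period": 1},
    {"id": 3, "site": "Clinic A", "age": 52,
     "diagnosis_confirmed": False, "visits_in_period": 4},
]

def eligible(r):
    """Location, age, patient condition, and active treatment status."""
    return (r["site"] in {"Clinic A", "Clinic B"}   # location: included sites
            and r["age"] >= 18                      # age limit
            and r["diagnosis_confirmed"]            # confirmed diagnosis required
            and r["visits_in_period"] >= 2)         # minimum visits for eligibility

measurement_population = [r for r in records if eligible(r)]
print([r["id"] for r in measurement_population])  # -> [1]
```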

Portland TGA Performance Measures: Using the Chronic Care Model Framework

Process for Developing Relevant and Accurate Indicators: an example from the Children, Youth and Family Network (CYFAN, CT)

Connecticut Statewide Medical Case Management Indicators. A HRSA facilitator was assigned through the RW Part A and B Project Officer at the beginning of the process; this neutral party was very important for buy-in. A statewide group representing RW Parts A, B, C, and D (mostly providers and administrators at the table) began meeting in summer 2007. The opportunity was good for informal collaboration across the RW CARE Act, generally missing in Connecticut for some time. The work combined face-to-face meetings (5 total to date) and email exchange. Individual administrators and providers across Connecticut were assigned the task of discussing medical case management concepts and indicators with front-line staff and consumers; in some cases this approach worked well, in other cases not so well. The group worked for the most part from the existing HRSA definition of Medical Case Management.

Examples of Medical Case Management Standards in Connecticut. Client records/care plans will have a medical assessment performed every 3 months and an eligibility and support services assessment performed every 6 months. Referral linkages will be tracked; agencies document referrals in the appropriate database and/or progress notes. Care plans should be signed by the case manager developing the plan and by the client; the client's signature confirms that the client understands the plan (if the client does not sign the care plan, document the reason in the client's progress note). Progress note entries must include the full legal name, title, and credentials of the person making the entry, must be dated and timed, and must be entered within five (5) days after an interaction with the client. Client records will have progress notes updated monthly. Medical case managers must meet minimum training requirements established by Parts A, B, C, and D.

Connecticut Statewide Medical Case Management Indicators: Lessons Learned. Major advantages: more consistent expectations for service; maximization of resources; better client information available. Major challenges: higher demand on professional development; information access; adaptation to current client encounters.

Indicator Definition Tips Base the indicator on guidelines and standards of care when possible Be inclusive (of staff and consumers) when developing an indicator to create ownership Be clear in terms of patient / program characteristics (gender, age, patient condition, provider type, etc.) Set specific time-frames in indicator definitions

Lessons Learned – Portland TGA; Michigan Part D; Maine Part B. Programmatic lessons: keep clients at the center (performance measures are aimed at improving the quality of care and evaluation programs, not at proving a thesis); don't let measurement guide the program (the results should guide the program, not the measurability); work with providers to establish measures (provide technical assistance, make compromises, and include consumers in the process as well); remain flexible about what's happening at the program level, e.g., changes in staff; learn together; the reality of baseline data can be a shock, i.e., a lot of work to do!

Lessons Learned (cont'd). Technical lessons: be realistic about the time involved in collecting data for certain measures; a client-level database is critical; provide technical assistance, since subcontractors have different capacities; providers are experts at providing care, so we need to provide expertise in data analysis; stay flexible in terms of software, i.e., CAREWare is catching up with what is going on in chart reviews.

How to Measure

Create a Plan. Decide on a sampling plan (sample size, eligible records, drawing a random sample). Develop data collection tools and instructions. Train data abstractors. Run a pilot test (adjust after a few records). Inform other staff of the measurement process. Check for data accuracy. Remain available for guidance. Make a plan for the display and distribution of data.

Using a Random Sample. Use a random sample if the entire population can't easily be measured. "Random selection" means that each record has an equal chance of being included in the sample. The easiest way to select records randomly is to use a random number table and pull each record in the random sequence. Every record needs an equal chance of being included; you can't just pick out the records that you know are "good." This is not hard to do, but it takes a little while to explain: for detailed instructions, see "Measuring Clinical Performance: A Guide for HIV Health Care Providers" (resources on the next slide).
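
A few lines of code can stand in for the random number table. Here is a minimal sketch using Python's standard library; the record IDs and sample size are hypothetical.

```python
import random

# Hypothetical list of eligible record IDs (the measurement population),
# e.g., 200 charts that met the eligibility criteria.
eligible_records = list(range(1, 201))

SAMPLE_SIZE = 30

# random.sample draws without replacement, giving every record an equal
# chance of selection -- the defining property of a simple random sample.
random.seed(2008)  # fixed seed so the pull can be reproduced for auditing
sampled = random.sample(eligible_records, SAMPLE_SIZE)

print(sorted(sampled))
```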

Resources to Randomize: "Measuring Clinical Performance: A Guide for HIV Health Care Providers" (includes random number tables); www.randomizer.org, a useful website for generating random numbers; and common spreadsheet programs, such as MS Excel, which offer random number functions as well.

Collect "Just Enough" Data. The goal is to improve care, not prove a new theorem: 100% coverage is not needed, and maximal statistical power is not needed; in most cases a straightforward sample will do just fine. The data you need for quality improvement is not the same as the data that drives a peer-reviewed study or a randomized clinical trial. You don't need to count every chart, and you don't need a precisely defined sample; simple sampling techniques work quite well.
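
To put "just enough" in rough numbers, a standard back-of-the-envelope check (an illustration, not part of the presentation) is the margin of error of a rate estimated from a simple random sample:

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Approximate 95% margin of error for a performance rate p
    estimated from a simple random sample of n records."""
    return z * math.sqrt(p * (1 - p) / n)

for n in (30, 50, 100):
    print(f"n = {n:>3}: +/- {margin_of_error(n):.0%}")
# n =  30: +/- 18%
# n =  50: +/- 14%
# n = 100: +/- 10%
```

A few dozen charts typically bound a performance rate tightly enough to tell whether care is improving, which is all a QI project needs.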

Strategies Depend on Resources. Data systems enhance capability: more indicators can be measured; indicators can be measured more often; entire populations can be measured; outcome as well as process indicators can be measured; alerts and custom reports help manage care. Personnel resources: person power is needed for chart reviews, logs, and other means of measurement, along with expertise in electronic/manual measurement.

When to Measure

Frequency. You don't need to measure everything all of the time; you can sample a short period of time and extrapolate the results. Balance the frequency of measurement against the cost in resources: if resources are limited, measure areas of concern more frequently and others less frequently. Balance the frequency of measurement against its usefulness in producing change: consider the audience, and how the frequency will best assist in setting priorities and generating change.

National HIVQUAL Data Reports. These reports show national trends based on self-reported data from participating HIVQUAL grantees, and provide an opportunity to compare program performance with national data to highlight areas of improvement opportunity.

The HIVQUAL Project 2006 Performance Data Part C and D Programs

Questions for Data Follow-up What are the results for key indicators? What are the major findings based on the generated data reports and your data analysis? What is the frequency of patients / programs not getting care? What is the impact of not getting the care? How does the performance compare with benchmark data? What is the feasibility of improving the care?

Key Questions for Data Follow Up (Cont’d) Example: Maine Part B, MDCH, CYFAN How can you best share the data results with your key stakeholders (Part A/B QI committees, HIV providers, consumers, etc.)? How do you generate ownership among providers and consumers? How will you assist in initiating/implementing QI projects to address the data findings? Who will be responsible and what are the next steps?

On Our Way to… QI Heaven [cycle diagram: repeated Measure → Change cycles]

Contact Information
Genevive Meredith, Genevive.Meredith@maine.gov, 207-287-4846, State of Maine Center for Disease Control, Ryan White Part B Program
Hollie Malamud-Price, malamudh@michigan.gov, 313-456 436, Michigan Dept of Community Health, Detroit
Lucy Graham, lucy.graham@stmarygj.org, 970-255-1735, St. Mary's Family Practice, Grand Junction, CO
Jack Rustico, jrustico@chcact.org, 860-667-7820, Connecticut Children, Youth and Family Network
Margy Robinson, margaret.l.robinson@co.multnomah.or.us, 503-988-3030, Portland TGA

National Quality Center (NQC) NYSDOH AIDS Institute 90 Church Street—13th Floor New York, NY 10007-2919 888-NQC-QI-TA Info@NationalQualityCenter.org NationalQualityCenter.org