
1 Module 19 – Day 3 8:00 – 8:15am (15 min) Welcome to Day 3

2 Agenda – Day 3
8:00 Welcome and Parking Lot
8:15 Study Groups II
8:30 The Coach as a Strategic Planner: Supporting HIV Providers through QI Action Plans
10:15 Introduction to Performance Measurement
11:45 Study Groups III
12:30 Lunch
1:30 Setting Your Coaching Agenda
2:00 Aha! Moments, Evaluation and Celebration
3:00 Adjourn

3 Parking Lot Review & Discussion

4 Coach as a Strategic Planner
Module 21 – Day 3 8:30 - 9:45 am (75 min) Coach as a Strategic Planner

5 Framework for Coaching Quality Improvement

6 Strategic Thinker: Strategically develops an organization-wide QM program and assists providers in doing the same within their networks and agencies.
Facilitator notes: study by Peter Gollwitzer on writing down goals – 33-75%; decision paralysis – 47 to 28%.

7 Strategic Plans Drive Action Plans
“Planning without action is futile, action without planning is fatal.” Your reactions?

8 What are Strategic Decisions?
Have long-term impact
Focus on achieving advantage
Are complex, with a high degree of uncertainty
Have major resource implications
Are often concerned with the organizational scope of activities
Include matching of activities with community, resources and environment
Challenge the client to stretch, think new, and use internal strengths to overcome challenges
Impact operational decisions and lead to a wave of secondary decisions that will need support

9 Strategic Thinking
…is a foundational skill that allows leaders to be more proactive about today's problems
…links what is happening today to tomorrow's organizational performance, resources and patient outcomes
…understands underlying causes and seeks ways to add organizational value
…often involves profound personal/professional changes and paradigm shifts
…uses an ideal vision as the underlying basis for planning and continuous improvement

10 Group Exercise: Definitions (10 min)
Step 1: Identify a facilitator and a recorder in your table group
Step 2: Find brief operational definitions for the following terms (M21 Strategic and Action Planning Definitions): Strategic Plan, Work Plan, Vision Statement
Step 3: Report back to the larger group and debrief
Step 4: Give feedback to the facilitator (2 min)

11 From Mission to Action
Develop / Review Organizational Vision Statement
Develop / Review Strategic Plan
Define Measurable, Long-term Objectives
Develop Annual Work Plan
Monthly Review Process
Define Goals for the Following Year
Evaluate Against Objectives / Goals
Facilitator notes on the monthly review process: results are reviewed by management on a monthly basis; the review process is also cascaded down to all necessary levels. Check action plans and measure TTIs – if objectives are not reached, the responsible person analyzes the root causes and defines countermeasures to get back on track.

12 What are the critical criteria for an effective Vision Statement?
Your reactions?

13 Criteria for Effective Vision Statements
Inspiring and motivating
Brief and clearly articulated
Bold, with long-term aims
Positive and without jargon
Demonstrating benefits to the larger community

14 Group Exercise: Comparing Vision Statements
Step 1: Join your table group
Step 2: Assign a facilitator and recorder
Step 3: Review the provided vision statements (M21 Comparing Vision Statements) and rate them using the provided assessment tool
Step 4: Give feedback to the facilitator (2 min)

15 Video: Creating a Vision
Andrew Cuomo's speech to announce the End the Epidemic Initiative for NYS

16 By Now You Know What’s Coming….

17 Introduction to Performance Measurement
Module 22 – Day 3 10:15 – 11:30 am (75 min) Introduction to Performance Measurement

18 Learning Objectives: You will learn to…
Discuss a common language of improvement
Apply HAB performance measures to identify areas for improvement in your clinic/program
Understand the importance of performance data for quality improvement
Understand the key concepts for developing and writing an indicator definition

19 Framework for Coaching Quality Improvement

20 Measurement Advocate: Develops a system-wide performance measurement system reflective of the internal and external needs of an organization.
Facilitator notes: Weatherman IQ test; "You can't manage what you don't measure."

21 Find a Balance between Measurement and Improvement

22 What we want to avoid… Quality Management Program

23 What are Data? Data (n) (plural)
Facts or information used usually to calculate, analyze, or plan something.
Facilitator notes: Data are facts or information – why is there a distinction between the two? Shouldn't all data be factual? This is a good lead-in to the next couple of slides, which start to identify the different kinds of data, including qualitative data, which are not so much "facts" as sometimes "opinions." One of the pilot TCQ participants asked, "Wait a minute, just because I think jelly beans are good doesn't make it a fact" – to which we replied, how true, and changed the slide. Always remember to let your audience be experts and don't be afraid to learn! The image here is a cloud, symbolizing the information cloud used by computers, tablets, and smart phones to store information virtually. The information is there; you have to decide you want it and download it. Collecting data is similar: you have to decide what data you want and then go to the cloud (in this case the system) and ask for it. Source: accessed on 04/05/14

24 Numeracy Competency Data are the improvement language…
Data are not a grade but a guide… Variation is the voice of the system…

25 “Tracking a few key measures over time is the single most powerful tool a team can use.”
Attributed to T. Nolan, PhD. Facilitator note: find reference on the IHI web site.

26 How to develop an indicator

27 What is a quality indicator?
A quality indicator is a tool to measure specific aspects of care and services that are optimally linked to better health outcomes while being consistent with current professional knowledge and meeting client needs.
Facilitator notes: In Tutorial 2, we defined quality as "the degree to which health services for individuals and populations increase the likelihood of desired health outcomes and are consistent with current professional knowledge." An indicator is a way of measuring whether the care and services you provide, as well as the activities you perform, are linked to improved health outcomes for clients. An indicator takes one aspect of HIV care and provides a way of assessing how often this specific aspect of care is properly provided. By defining what "properly provided" means (this is where "current professional knowledge" comes in), indicators enable you to learn about the level of quality in the care your HIV program provides.

28 Dimensions of Quality
Technical Quality: Provider perception of quality of HIV care
Experience Quality: Consumer perception of quality of HIV care
Source: Leonard Berry, Texas A&M University, IHI conference 2001

29 What makes a good indicator?
Relevance: Does the indicator affect a lot of people or programs? Does the indicator have a great impact on the programs or patients/clients in your EMA, state, network or clinic?
Measurability: Can the indicator realistically and efficiently be measured given finite resources?
Facilitator notes: Clearly, the universe of things that can be measured is vast. How do we begin to select a manageable number of areas to track? There are four main criteria to use in selecting sound indicators. The first two are relevance (are you looking at something that matters to your program?) and measurability (can you actually measure this aspect of care, given the resources you have?).

30 What makes a good indicator? (cont’d.)
Accuracy: Is the indicator based on accepted guidelines or developed through formal group decision-making methods?
Improvability: Can the performance rate associated with the indicator realistically be improved given the limitations of your services and population?
Facilitator notes: The next two criteria are accuracy and improvability. Accuracy: how valid is this indicator? Does it really reflect "current professional knowledge"? Does it build on accepted guidelines for HIV care? If it deals with an aspect of care not yet covered by a guideline, has there been consensus among professionals and peers? Improvability: the ultimate goal is to improve the quality of care, so as you select indicators, focus first on those that will help you improve. If you answer "no" to any of these questions, the indicator – while still relevant to patient care – is probably either too difficult to measure or less than critical to patient care. On the other hand, if you answer "yes" to all of the questions, you have most likely found a viable indicator that will give you the most benefit for your measurement resources.

31 Defining an Indicator
Who is eligible to be evaluated?
What part of this population should have received the care being measured? (Who should be counted in the denominator?)
What part of those who should have received the care did receive the recommended care? (Who should be counted in the numerator?)
Facilitator notes: We have now spoken a lot about what kinds of indicators to use in your HIV program. Selecting indicators can be difficult, although often regulatory agencies will select them for you. But just as important as the selection of the indicator is its precise definition once it has been selected. The key steps in defining an indicator are: (1) figuring out the overall population – who should be eligible to be evaluated? (2) determining who, in this overall population, should have received the care being measured (it makes no sense, for example, to measure the level of gynecologic care received by men); those who "should have received the care" make up the denominator of your measure; and (3) determining who among those who "should have received" the care actually did receive the recommended care – these people become the numerator of your measure.

32 Eligible Patients, Denominator, Numerator
Eligible patients: patients seen in the clinic in the last 12 months
Denominator: patients with at least one medical visit in the measurement year
Numerator: patients with a viral load less than 200 copies/mL at the last HIV viral load test during the measurement year
Facilitator notes: This diagram displays the concept graphically. The outer circle represents the eligible patients: for example, all the patients seen in your clinic in the last twelve months. The middle circle then becomes those who "should have received the recommended care." For this example, an indicator of PCP prophylaxis, those who "should have received the care" are the patients with CD4 counts of less than 200 – your denominator. The inner circle then represents those who should have received the therapy and actually received it – your numerator.
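A minimal sketch, not part of the original training materials, of the eligible → denominator → numerator funnel for the viral load suppression example above. The record fields (seen_last_12_months, visits_in_year, last_viral_load) and the sample values are hypothetical, not drawn from any real data system.

```python
patients = [
    {"id": 1, "seen_last_12_months": True, "visits_in_year": 3, "last_viral_load": 40},
    {"id": 2, "seen_last_12_months": True, "visits_in_year": 1, "last_viral_load": 5400},
    {"id": 3, "seen_last_12_months": True, "visits_in_year": 0, "last_viral_load": None},
    {"id": 4, "seen_last_12_months": False, "visits_in_year": 0, "last_viral_load": None},
]

# Outer circle: eligible patients (seen in the clinic in the last 12 months)
eligible = [p for p in patients if p["seen_last_12_months"]]

# Middle circle: denominator (at least one medical visit in the measurement year)
denominator = [p for p in eligible if p["visits_in_year"] >= 1]

# Inner circle: numerator (viral load < 200 copies/mL at the last test)
numerator = [p for p in denominator
             if p["last_viral_load"] is not None and p["last_viral_load"] < 200]

rate = 100 * len(numerator) / len(denominator)
print(f"Viral load suppression: {len(numerator)}/{len(denominator)} = {rate:.0f}%")
# -> Viral load suppression: 1/2 = 50%
```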

33 How will you define who is eligible to be evaluated?
Location: all sites, or only some?
Gender: men, women, or both?
Age: any limits?
Client conditions: all HIV-infected clients, or only those with a specific diagnosis?
Treatment status?
Facilitator notes: How do you define your eligibility criteria for the measurement population, the outer circle in our diagram? One approach is to answer the following questions. Location: what facilities within the care system will be included? Gender: do we want to focus on one sex? Age: are there particular age limits? Patient condition: is a confirmed diagnosis required, or simply symptoms or signs? Do certain conditions make the patient ineligible? Active treatment status: how many visits are required for eligibility? Must the patient currently be in treatment? Patients on antiretroviral treatment only? Answering these questions helps you define the parameters of the patient population you will then look at more closely.

34 Steps in defining an indicator
Specify the reasonable requirement for care: All HIV-positive ambulatory patients with a CD4 count less than 50 cells/mm3 receive an annual ophthalmology exam.
Define the indicator: Percentage of HIV-positive patients with a CD4 count less than 50 cells/mm3 within the last year who have an annual ophthalmology exam documented in the medical record.
Facilitator notes: Once you have decided who should be in your measurement population, or sample, you can move to the specifics of defining the indicator. This process has five steps. The first step involves clarifying the specific "reasonable requirement" on which you want to base your indicator definition. You may look at established guidelines or accepted standards of care. In this example, HIV guidelines suggest that the "reasonable requirement" is that all HIV-positive ambulatory patients with a CD4 count less than 50 receive an annual ophthalmology exam. When the requirement for care has been decided, you translate this into a measure: something that begins with "number of" or "percentage of." In this case, we want to know the percentage of HIV-positive patients with a CD4 count less than 50 within the last year who have an annual ophthalmology exam documented in the medical record.

35 Steps in Defining the Indicator (cont’d.)
Set the denominator – specifically, which patients should be getting the care? The number of HIV-positive patients with a CD4 count less than 50 cells/mm3 within the last year.
Set the numerator – which patients got the care? The number of those patients with an annual ophthalmology exam documented in the medical record.
Measurement: Divide the numerator by the denominator to get the performance percentage.
Facilitator notes: Now we need to define the denominator, the middle circle in our diagram. Who "should have received" this care? In our example, that is all HIV-positive patients with a CD4 count less than 50 within the last year. Then the numerator, the inner circle: of those patients with a CD4 count less than 50, how many in fact had an annual ophthalmology exam documented in the medical record? Divide the numerator by the denominator to get a percentage, and you have your performance result. Now you know how you are doing!
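As a short worked illustration of the final measurement step (not from the original materials), the sketch below divides an invented numerator by an invented denominator for the ophthalmology exam example; the counts are hypothetical.

```python
denominator = 120  # hypothetical: HIV-positive patients with CD4 < 50 cells/mm3 in the last year
numerator = 84     # hypothetical: of those, patients with a documented annual ophthalmology exam

performance = 100 * numerator / denominator
print(f"Performance: {numerator}/{denominator} = {performance:.0f}%")  # Performance: 84/120 = 70%
```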

36 Tips for defining indicators
Base the indicator on guidelines and standards of care when possible
Include staff and consumers when developing an indicator to create ownership
Be clear about patient / program characteristics (gender, age, patient condition, provider type, etc.)
Set specific time frames in indicator definitions
Facilitator notes: Defining indicators is important, and at times it can be difficult. Here are some tips for success. Use the guidelines that exist; don't try to re-create them. At the end of this tutorial we provide references for guidelines and indicators specific to HIV care – don't re-invent the wheel, use what's already been done. In selecting and defining indicators, involve your staff and consumers; being measured makes people anxious, and one way to lessen this anxiety is to involve those being measured in the selection of the measurements. Be very clear about the patients who belong in the population, denominator and numerator: the clearer you can make these specifications, the easier your indicator will be to use, and the more accurate your results. Clarify the time frames as well – you want specificity, particularly around when certain tests should reasonably be performed.

37 Development of HAB Clinical Performance Measures
Underlying assumption: the development and use of performance measures is key to a solid QM program
An internal HAB working group identified potential indicators
Recommendations from the IOM report "Measuring What Matters" were used as a framework for developing measures
Significant input from key stakeholders was obtained

38 HAB Clinical Performance Measures
Measures focus on clinical services provided to adults and adolescents
Detail sheets outline specific information related to each measure
The calendar year is the measurement period

39 HAB Clinical Performance Measures: Group 1
Represents the 5 core clinical performance measures deemed critical for HIV programs to monitor.
Indicators*: Viral Load Suppression, Prescribed Antiretroviral Therapy, Medical Visits Frequency, Gap in Medical Visits, PCP Prophylaxis
Facilitator note: bring up the updated list of measures and make it downloadable from the WebEx.

40 HAB Clinical Performance Measures: Group 1
Viral Load Suppression: Percentage of patients, regardless of age, with a diagnosis of HIV with an HIV viral load less than 200 copies/mL at last HIV viral load test during the measurement year
Prescribed Antiretroviral Therapy: Percentage of patients, regardless of age, with a diagnosis of HIV prescribed antiretroviral therapy for the treatment of HIV infection during the measurement year
Medical Visits Frequency: Percentage of patients, regardless of age, with a diagnosis of HIV who had at least one medical visit in each 6-month period of the 24-month measurement period, with a minimum of 60 days between medical visits
Gap in Medical Visits: Percentage of patients, regardless of age, with a diagnosis of HIV who did not have a medical visit in the last 6 months of the measurement year
PCP Prophylaxis: Percentage of patients aged 6 weeks or older with a diagnosis of HIV/AIDS who were prescribed Pneumocystis jiroveci pneumonia (PCP) prophylaxis (use the numerator and denominator that reflect the patient population)
Facilitator note: bring up the updated list of measures and make it downloadable from the WebEx.
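Because the two visit-based measures hinge on date logic rather than a single lab value, here is a minimal sketch (not from the original materials) of how "Gap in Medical Visits" and "Medical Visits Frequency" could be checked for one patient. It assumes hypothetical visit dates and a simplified reading of the definitions; the official HAB detail sheets govern the exact specifications.

```python
from datetime import date

def has_gap_in_medical_visits(visit_dates, measurement_year):
    """Gap in Medical Visits: no medical visit in the last 6 months of the year."""
    window_start = date(measurement_year, 7, 1)
    window_end = date(measurement_year, 12, 31)
    return not any(window_start <= v <= window_end for v in visit_dates)

def meets_visit_frequency(visit_dates, windows):
    """Medical Visits Frequency (simplified): at least one visit in each 6-month
    window of a 24-month period, with at least 60 days between consecutive visits."""
    ordered = sorted(visit_dates)
    one_per_window = all(any(start <= v <= end for v in ordered) for start, end in windows)
    spaced_60_days = all((later - earlier).days >= 60
                         for earlier, later in zip(ordered, ordered[1:]))
    return one_per_window and spaced_60_days

# Hypothetical usage with a 24-month measurement period of 2023-2024
windows = [(date(2023, 1, 1), date(2023, 6, 30)), (date(2023, 7, 1), date(2023, 12, 31)),
           (date(2024, 1, 1), date(2024, 6, 30)), (date(2024, 7, 1), date(2024, 12, 31))]
visits = [date(2023, 2, 10), date(2023, 8, 5), date(2024, 3, 20), date(2024, 9, 1)]
print(has_gap_in_medical_visits(visits, 2024))  # False: there is a visit after July 1, 2024
print(meets_visit_frequency(visits, windows))   # True under this simplified reading
```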

41 HAB Clinical Performance Measures
Adolescent/Adult: Cervical Cancer Screening, Chlamydia Screening, Gonorrhea Screening, Hepatitis B Screening, Hepatitis B Vaccination, Hepatitis C Screening, HIV Risk Counseling, Oral Exam, Pneumococcal Vaccination, Preventive Care and Screening: Screening for Clinical Depression and Follow-Up Plan, Preventive Care and Screening: Tobacco Use: Screening and Cessation Intervention, Substance Use Screening, Syphilis Screening
Medical Case Management (MCM): Care Plan, Gap in Medical Visits, Medical Visit Frequency
Oral Health: Dental and Medical History, Dental Treatment Plan, Oral Health Education, Periodontal Screening or Examination, Phase I Treatment Plan Completion
ADAP: Application Determination, Eligibility Recertification, Formulary, Inappropriate Antiretroviral Regimen
Systems-Level: Waiting Time for Initial Access to Outpatient/Ambulatory Medical Care, HIV Test Results for People Living with HIV, HIV Positivity, Late HIV Diagnosis, Linkage to HIV Medical Care, Housing Status
Facilitator note: bring up the updated list of measures and make it downloadable from the WebEx.

42 NQC Publication
'Measuring Clinical Performance' is a guide for HIV providers to learn more about indicator development and data collection.
Facilitator notes: Since 1997, the National HIVQUAL Project has developed clear, comprehensive sets of quality indicators for HIV care and also provides software and other tools for data collection. The Project also authored a guide for HIV providers, called 'Measuring Clinical Performance,' with more detailed information about indicator development and data collection. Go to their web site, explore these indicator resources, and download their guide to measuring clinical performance.

43 Group Exercise
Pick one topic from the following list: housing, linkage to care, oral health education
Develop an indicator definition
Report back to the larger group

44 Options for Follow-up Activities
'Do nothing!' – if scores are within expected ranges and goals, repeat the measurement frequently
'Take immediate individual action' – follow up on individual patients (missed appointments, patients not on PCP prophylaxis, etc.) and/or providers
'Quick PDSA' – develop a quick pilot test
'Launch a QI project!' – set up a cross-functional team to address identified aspects of HIV care
Facilitator note: should this go before the PDSA cycle picture?

45 Barriers to Putting Data into Action
Don't even know where to get data/info
Paralysis by analysis
No one is interested in it
Defensiveness
Too complex to understand
Incorrect interpretation of data
Facilitator notes: What are some of the reasons you can't move from information to action? After gathering this feedback from the audience, we'll address each area individually.

46 Kübler-Ross Stages of Coping with Data
Denial: "The data are wrong…"
Anger: "The data are right, but it's not a problem…"
Bargaining: "The data are right, it's a problem, but it's not my problem…"
Acceptance: "The data are right, it's a problem, it's my problem…"
Facilitator notes: Stop feeling about it; think about it. This moves you from this reaction to the next slide.

47 Tips for Sharing Data
Involve stakeholders when reports are generated and disseminated
Share reports with staff promptly and listen to variance explanations
View performance improvement as a management tool
Anticipate defensiveness
Watch out for paralysis by analysis ("we need even better data before we act…")
Facilitator notes: It's not always easy to create data ownership, so here are some tips that will help you. Involve stakeholders in the process when reports are generated and disseminated; people will be more inclined to take ownership if they feel as though they helped in the process. Share reports with staff promptly and listen to variance explanations. View performance improvement as a management tool: do not use results for punishment, and include outcomes achieved when evaluating staff. People have a tendency to defend their turf, so communicate data results objectively to avoid defensiveness. Remember, our goal is to take action on the data to improve, so keep everyone focused on the required action and don't over-analyze the data.

48 Module 24 – Day 3 11:45 – 12:30pm (45 min) Study Groups

49 Framework for Coaching Quality Improvement

50 Collaboration Builder
Works collaboratively and helps providers build collaborative partnerships with individuals and groups of health care providers to achieve their improvement goals.
Facilitator notes: Sign in the office – "What is the #1 reason why people do not give donations? They have not been asked." Cross Part Collaborative; NYS Advisory Committee.

51 Agenda
More time for Study Groups to work on plans – 30 min
Return to your planning forms from yesterday's module
Finish the Study Group Planning Matrix
Be ready to present a 2-3 minute overview of your Study Group to the large group: What are your strengths? What are you planning to work on? What will your challenges be?

52 Study Group Presentations and Debriefing
What did you learn about your team compared to the others?
Were there any surprises or significant findings that will impact your team and that you need to mitigate?
Questions about how this plan will be used?

53 Lunchtime!!

54 Setting Your Coaching Agenda
Module 25 – Day 3 12:45 – 1:15 pm (30 min) Setting Your Coaching Agenda

55 Agenda
Review and prioritize your individual Personal Improvement Plan (PIP). Which 2-3 areas will you focus on first?
Present your individual PIP to the larger group – 25 min
Constructive feedback to presenters
…Who will go first?

56 Post-TCB Activities: Study Groups
Work Flow: Complete a work flow diagram – March 17; teams can be formed to complete this expectation. Results are shared with other Study Group participants for further feedback.
PDSA Cycle: Complete a PDSA cycle on an area for improvement in your work environment – February 10.
Case Study: Identify one past or current coaching barrier and share real-life strategies for how the participant has overcome this barrier – March 17.

57 Sharing of Aha! Moments and Brief Day 3 Evaluation
Module 26 – Day 3 2:00 – 3:00pm (60 min) Sharing of Aha! Moments and Brief Day 3 Evaluation

58 Framework for Coaching Quality Improvement

59 Objective Assessor Assesses individual and organizational performance, gives informative feedback and tracks progress over time.

60 Highlights & Aha! Moments
What have been some of your personal highlights or Aha! moments from today's session? Use the next 2 minutes to reflect on today and identify a few ideas.

61 The way the course was delivered today was an effective way for me to learn.
Strongly Disagree Disagree Somewhat agree Agree Strongly Agree

62 I had sufficient opportunity to participate today.
Strongly Disagree Disagree Somewhat agree Agree Strongly Agree

63 Materials were useful during the day.
Strongly Disagree Disagree Somewhat agree Agree Strongly Agree

64 The facilities, equipment, etc. were favorable to learning.
Strongly Disagree Disagree Somewhat agree Agree Strongly Agree

65 The agenda and content for today were logically organized.
Strongly Disagree Disagree Somewhat agree Agree Strongly Agree

66 Overall, I was satisfied with the session facilitator(s).
Strongly Disagree Disagree Somewhat agree Agree Strongly Agree

67 I will refer to or use the materials going forward.
Strongly Disagree Disagree Somewhat agree Agree Strongly Agree

68 My knowledge and/or skills increased as a result of today.
Strongly Disagree Disagree Somewhat agree Agree Strongly Agree

69 The workshop had the right balance of lectures and interactive activities.
Strongly Disagree Disagree Somewhat agree Agree Strongly Agree

70 Overall, I was satisfied with today.
Strongly Disagree Disagree Somewhat agree Agree Strongly Agree

71 How ready are you to coach an organization to improve its quality program?
Not Ready Somewhat Ready Mostly Ready Ready Very Ready

72 How complete is your knowledge of coaching roles and functions?
Not Complete Somewhat Complete Mostly Complete Complete Very Complete

73 Additional Comments about Today
What worked well today for you? What should we do differently tomorrow?

74 Thank You :-)

75 Closing: Closing Remarks, Certificates, Group Picture

76 Team Picture

77 National Quality Center (NQC)
NationalQualityCenter.org

