1
Data-based Decision Making: Understanding, Assessing, and Using Data
Slides for Module 10 Topic: Assessing Data Quality and Usefulness
2
Assessing Data Quality and Usefulness 1
Triangulating Data from Multiple Sources
Data Reports and Formats
3
Training Objectives
Following the training, participants will be able to:
Identify and describe 4 factors to consider in assessing the quality and usefulness of a study or data report
Describe the use of triangulation in comparing data from multiple sources and studies
Describe at least 5 characteristics of effective HIV data reports prepared for PC/PBs
4
Training Objectives (cont.)
Identify at least 4 ways to make data presentations understandable and useful for all PC/PB members
Describe at least 3 ways to make data charts understandable and useful for PC/PB members, including those who are not data experts
Describe sound practices for receiving and discussing data
5
Data Myths and Realities
Myth: Consumers don’t understand data.
Reality: Many people are uncomfortable using data – more people are innumerate than illiterate. People often think they won’t understand HIV-related data and charts because they don’t use them every day.
Myth: It takes a Ph.D. to understand RWHAP data.
Reality: People can learn to understand and use data. Moderate training is sufficient to understand most RWHAP data if it is well presented.
Myth: Data will give you the answer.
Reality: Data can be spun to give you many different answers. Data vary greatly in quality and need to be assessed.
6
Assessing Data Quality and Usefulness 2
All data are not “created equal” – so PC/PBs need to assess the quality of the data and reports they receive
7
Assessing Data Quality
PC/PB member roles:
Review data from multiple sources
Ask questions about how data were gathered, tabulated, and analyzed
Compare and weigh data from different sources and studies
Decide how much confidence to place in the data
Give the greatest weight in decision making to the “best data”
PC/PB support staff, consultants, and recipient staff:
Provide/present data from various sources
Understand and share information on data quality and limitations
8
Critical Factors for Reviewing or “Weighing” Data
Number of respondents/size of study
Representativeness/sampling
Content/questions
Quality control
9
Numbers and Representativeness: Who Was Included in the Survey or Study? 1
Numbers: the number of people surveyed, or sample size – give more weight to data from larger numbers of people.
Representativeness: have more confidence in data when the individuals sampled were chosen to represent the entire HIV population, subpopulation, or the targeted portion of the community – through:
Probability sampling
Purposive or representative sampling
10
Numbers and Representativeness: Who Was Included in the Survey or Study? 2
Probability sampling: using a random sampling method in which each member of the population has an equal probability of being included, so that findings can be assumed to reflect the entire population from which the sample was drawn.
Purposive or representative sampling: selecting people for the study so that they mirror the HIV population in your EMA or TGA, or the subpopulation you are targeting.
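A minimal sketch of the difference between the two approaches, for readers who find an example helpful; the client list, subpopulations, target shares, and sample sizes below are hypothetical, not taken from the training materials:

```python
import random

# Hypothetical client list: (client_id, subpopulation) pairs.
clients = [(i, random.choice(["MSM", "Latina", "Youth", "Other"])) for i in range(5000)]

# Probability sampling: every client has an equal chance of selection,
# so findings can be assumed to reflect the full list sampled from.
probability_sample = random.sample(clients, 400)

# Purposive/representative sampling: deliberately fill quotas so the sample
# mirrors the assumed makeup of the local HIV population (shares are invented).
target_shares = {"MSM": 0.55, "Latina": 0.15, "Youth": 0.10, "Other": 0.20}
purposive_sample = []
for group, share in target_shares.items():
    members = [c for c in clients if c[1] == group]
    purposive_sample.extend(random.sample(members, int(400 * share)))

print(len(probability_sample), len(purposive_sample))  # 400, 400
```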
11
Content/Questions and Quality Control
Content/questions: look at whether the questions asked were clear and understandable, so that they were likely to generate reliable data that really measure what they were supposed to measure.
Quality control: look for evidence that the data were collected using appropriate methods and by appropriately trained individuals:
The data collection process was carefully managed and monitored, and agreed-upon methods were implemented
Data were reviewed for completeness, nonduplication, and accurate data entry where relevant
12
Questions to Ask in Assessing and Interpreting Surveys and Studies
Who was responsible for the study? Were knowledgeable consumers and other PLWH involved in design?
Does the “tool” use good questions? Are they clear and understandable? Do they seem likely to generate reliable data that really measure what the study is supposed to be measuring? Was the tool pre-tested?
What was the sample size? Is it representative?
What evidence is there that the data were collected using appropriate methods and by trained individuals?
Was there “quality control” to be sure the stated data gathering and analysis process was followed?
13
Terms for Key Measures of Data Quality: Reliability and Validity
Reliability involves consistency and “repeatability” of findings – you would get the same results if:
You did the study a second time
You asked the same person the same question again
Validity involves the credibility or “believability” of your findings – they truly represent the phenomenon you are trying to measure:
The tools used measured what they were supposed to measure (internal validity)
The results can be “generalized,” or assumed to apply to people beyond the sample in the study (external validity)
Source: “Introduction: Reliability and Validity,” UC Davis
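One simple way to picture test-retest reliability is to compare answers from the same respondents at two points in time; the question and responses below are hypothetical, offered only as a sketch:

```python
# Hypothetical answers from the same 8 respondents to the same yes/no question,
# asked on two occasions.
first_pass  = ["yes", "no", "yes", "yes", "no", "yes", "no", "yes"]
second_pass = ["yes", "no", "no",  "yes", "no", "yes", "yes", "yes"]

# Percent of respondents giving the same answer both times.
agreement = sum(a == b for a, b in zip(first_pass, second_pass)) / len(first_pass)
print(f"Test-retest agreement: {agreement:.0%}")  # 75% - low agreement suggests low reliability
```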
14
Quick Questions A: Assessing Surveys and Studies
Your PC/PB had problems with data reliability in its last PLWH survey in 2016 – a lot of questions seem to have been misunderstood, and findings were inconsistent with epi data and HIV care continuum data. The survey was done in a hurry so that findings could be used in preparing your Integrated HIV Prevention and Care Plan. PC/PB staff and members have limited expertise with large-scale surveys, but you want to do a new survey next year.
What are some practical things you can do to improve data quality?
Who within the PC/PB should lead this effort?
15
Quick Questions B: Assessing Surveys and Studies
Your EMA/TGA has several large subrecipients, including a federally qualified health center (FQHC) and a clinic associated with a large teaching hospital. These subrecipients all receive research grants and seem to do a lot of internal assessment of client needs and service quality. You want to encourage them to share research findings with the PC/PB, but you also want to be able to review and assess the quality of their data, as you do for other data.
What might be a good process for inviting, receiving, and reviewing such studies?
What are some possible concerns?
Who within the PC/PB should lead this effort?
16
Common Causes of Unreliable Data (Low Reliability)
People don’t know or can’t remember the answer to the question
Example: People may not accurately recall their CD4 count, so survey data on CD4 counts are often not reliable
Example: Asked if they have been “out of care for more than a year at any point since diagnosis,” many PLWH don’t remember
Too much information is requested
Example: A survey is so long or intake form questions are so complicated that people responding get tired and answer later questions with little thought
17
Common Causes of Data Validity Issues (Low Validity)
Poorly stated questions
Example: A survey asks PLWH “whether you have been diagnosed with HIV disease/not AIDS or with AIDS (stage 3 HIV)”; confusion around “AIDS” and “diagnosed” leads people who currently have a high CD4 count to say “HIV/not AIDS” – responses are not valid
The sample is small or unrepresentative
Example: A focus group of 9 Latina PLWH provides new information about this population, which the PC/PB wants to use to redesign a service model – but the group members were all of similar age and all English speakers, and their needs are unlikely to represent the needs of all Latina PLWH in the EMA/TGA
18
Quick Questions C: Data Reliability and Validity Issues
The PC/PB wants to better understand issues facing recently incarcerated PLWH, who appear to have low retention, medication adherence, and viral suppression rates. When you look closely at RWHAP HIV care continuum data, PLWH survey data, and focus group data, you find that each one used a different definition of “recently” and “incarcerated.” You like the focus group definition best (“continuously in jail or prison for at least 6 months during the past 3 years”), but it included only 9 people, so you aren’t sure what weight to give the findings.
What kinds of data problems are these?
What might you do to avoid them?
19
Triangulating Data from Multiple Sources
A PC/PB can have additional confidence in findings that appear in more than one data set, and can use multiple methods to increase understanding of a topic or issue – that process is called triangulation
20
Triangulation of Data
The process of comparing data on the same topic from 2 or more sources or research studies to:
“Cross-check” or “cross-validate” the data – see whether the different sources report similar findings
Increase understanding of the topic
PC/PBs can have greater confidence in findings that are reported from several different studies or sources, or obtained through different methods
21
Approaches to Triangulation
Can involve data from several types of needs assessment activities (such as a PLWH survey, focus groups, or a provider survey)
Can also involve data from the recipient or other sources (such as epi data, RSR data on client utilization, or HIV care continuum data)
Often involves comparing quantitative and qualitative data
Comparisons should involve critical review of the methods used to obtain the data, to decide how much confidence to place in the data from each source
22
Using Triangulation
Identify an important question or issue
Look at multiple data sources to see what information they provide on the question or issue
Compare data to determine similarities and differences
Explore possible reasons for differences
Assess data sources to determine which should receive the most “weight”
Where findings are similar across sources, especially the “better quality” sources, use those findings in decision making
Use findings to inform additional data gathering
(A brief sketch of such a comparison follows.)
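A minimal sketch of what a triangulation comparison might look like in practice; the sources, figures, and quality weights below are hypothetical and would be set by the PC/PB after reviewing each source's methods:

```python
# Hypothetical estimates of unmet need for mental health services, by source,
# with a rough quality weight assigned after reviewing each source's methods.
sources = {
    "PLWH survey (n=486)":     {"unmet_need_pct": 18, "weight": 0.5},
    "Focus groups (3 groups)": {"unmet_need_pct": 30, "weight": 0.2},
    "RSR utilization data":    {"unmet_need_pct": 15, "weight": 0.3},
}

# Compare the sources side by side to see how similar the findings are.
for name, s in sources.items():
    print(f"{name}: {s['unmet_need_pct']}% unmet need (weight {s['weight']})")

# A weighted summary gives the "better quality" sources more influence.
weighted = sum(s["unmet_need_pct"] * s["weight"] for s in sources.values())
print(f"Weighted estimate of unmet need: {weighted:.1f}%")
```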
23
Examples of Triangulation 1
Issue: Treatment of chronic health problems
Data sources compared: PLWH survey, key informant interviews, and monitoring data
PLWH survey: A very high percentage of PLWH from outlying counties in your new survey report serious health problems, especially diabetes and high blood pressure, that are not being adequately treated
Interview and monitoring data: Key informant interviews and monitoring data confirm that these PLWH often have no primary care physician and infectious disease specialists are seriously overburdened
24
Examples of Triangulation 2
Issue: Barriers to care / linguistic access
Data sources compared: Focus group and provider survey
Focus group: A focus group of Latino immigrants reports that some providers have no Spanish-speaking clinical staff and are using receptionists as untrained interpreters
Provider survey: Provider surveys confirm a lack of language capacity and the need for funding for language services
25
Sum Up: Assessing Data Quality and Usefulness
All data are not created equal – PC/PB members need to be able to ask appropriate questions and “weigh” data quality
PC/PBs need to ask for and receive data in clear and useful formats, including reports and presentations that include user-friendly charts and explanations
Triangulation helps in comparing various data sets and identifying similar findings
PLWH play a valuable role in assessing data due to their familiarity with the system of care
26
Data Reports and Formats
For maximum value in planning and decision making, even the most reliable and accurate data need to be presented in clear, understandable formats
27
Importance of How Data are Presented
Written reports, oral presentations, and charts have a huge influence on the understanding and use of data
PC/PBs can become good users of data through:
Use of well-designed, clear reports and presentations
Sound basic training on data sources and uses
Mini-trainings provided along with presentations
Allocation of the time necessary to discuss and clarify data
PC/PB and recipient commitment to ensuring that all members feel comfortable asking questions and know they will receive thoughtful, useful answers
28
Preparing Data Reports for PC/PBs
A data report prepared for the PC/PB should:
Include narrative, tables, and charts
Provide information on data sources, sample size and methods, and limitations
Use plain language designed for your audience – a plain language document looks good, is organized logically, and is understandable the first time you read it
Use only necessary technical terms, defined and explained
Include an Executive Summary
Provide comparisons to epi and other data and studies, local and national
Be reviewed in draft by the responsible committee
29
Preparing Oral Presentations
Data presentations to the PC/PB should:
Use plain language
Define new or unfamiliar terms
Maximize the use of charts, with brief text to explain them
Avoid data overload
Include structured “mini-training” to maximize understanding of the data presented
Be previewed by the responsible committee and Executive Committee – and revised as needed
Be available electronically before the meeting, and copied and distributed at the meeting
30
Preparing Data Charts
Charts in reports or presentations to the PC/PB should:
Be clear and easy to read – color contrast, data labels, numbers large enough to read
Provide data in a logical order
Include brief text to highlight and explain content
Use graphics to highlight key data
Use consistent formats:
The same type of chart used for the same type of data throughout
The same colors used for a population or other variable
Specify the total number and percent of responses/clients
31
Making Data User-Friendly: Example
The next two slides illustrate how data can be made easier to understand quickly – they present the same data, but:
One uses a table, with data presented clearly but in no obvious order
The other uses a bar chart, with data presented in order based on the percent of people using each service
32
Sample Data Table: PLWH Survey
Services Received During the Past Year (Percent by Service Category) [N=486]
Service Category | % Receiving Service
Outpatient Ambulatory Health Services | 83%
Oral Health | 55%
Mental Health | 23%
Medical Case Management | 92%
Medical Nutrition Therapy | 43%
Housing Assistance | 19%
Emergency Financial Assistance | 27%
33
Sample Data Bar Chart: Same PLWH Survey Data
Services Received During the Past Year (Percent by Service Category) [N=486]
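A rough sketch of how the table on the previous slide could be turned into a bar chart sorted by percent, of the kind this slide describes; it assumes matplotlib is available and uses the sample survey figures shown above:

```python
import matplotlib.pyplot as plt

# Sample PLWH survey data from the previous slide (percent receiving each service, N=486).
services = {
    "Outpatient Ambulatory Health Services": 83,
    "Oral Health": 55,
    "Mental Health": 23,
    "Medical Case Management": 92,
    "Medical Nutrition Therapy": 43,
    "Housing Assistance": 19,
    "Emergency Financial Assistance": 27,
}

# Sort by percent so the chart reads from most- to least-used service.
ordered = sorted(services.items(), key=lambda item: item[1], reverse=True)
labels, values = zip(*ordered)

fig, ax = plt.subplots(figsize=(8, 4))
ax.barh(labels, values)
ax.invert_yaxis()                       # largest bar at the top
ax.set_xlabel("Percent receiving service")
ax.set_title("Services Received During the Past Year (N=486)")
for i, v in enumerate(values):
    ax.text(v + 1, i, f"{v}%", va="center")  # data labels, as recommended above
plt.tight_layout()
plt.show()
```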
34
Quick Questions D: Different Ways of Presenting Data
Review the previous two slides, which present the same survey data in two different formats: as a data table and as a bar chart.
Which one do you find easier to understand quickly? Why?
How much of the difference involves the format (table or bar chart), and how much involves other factors?
To what extent does your answer to Question 1 involve personal preferences? To what extent does it involve the level of data training you have had?
35
Presenting and Discussing Data
Summarize methods, numbers, sampling, and limitations as needed to assess data quality
Refine/focus what you present based on your audience and the expected use of the data
Highlight and explain key findings
If you “go beyond the data” for interpretation or conclusions, clearly state your assumptions
Highlight key issues or uncertainties
Allow immediate clarifying questions and discussion at several points and at the end
36
Discussing the Data
The presenter and/or PC/PB leadership should:
Allocate meaningful time for discussion whenever data are presented
Encourage questions about data sources and quality
Identify and highlight findings that would benefit from diverse input and interpretation
Be sure major points of discussion are summarized and included in the minutes
Document and follow up on data questions, and obtain/present answers
37
Optional Slides for Activities
38
Activity 10.5 Instructions
Work with a small group, choosing a facilitator, recorder, and reporter. Be prepared to share your work with the full group.
39
Activity 10.5: Using Triangulation
Assume you are a committee of the PC/PB that just carried out the mental health data triangulation and received the data described in the handout. Answer the following questions:
What do you see as the key findings?
How much weight would you give the information from each source? Why?
What, if any, additional data might you want to review or collect before deciding on possible action?
How might you eventually use these data in planning?
40
Activity 10.6: Assessing Data Charts and Tables
Work in a small group, choosing a facilitator, recorder, and reporter. Review the situation and the data tables, charts, and/or text summaries assigned to your group, and answer the related questions. Prepare your reporter to present the assigned materials and your review to the full group.
41
Situation A: Sample Formats for Epidemiologic Data
The formats presented here all involve tables and charts used to present data that come from Epidemiologic Profiles – though they can of course be used for other types of data presentations
42
Format #1: Race/Ethnicity of People Living with HIV in the EMA by Jurisdiction, 2017 (Percent), N = 12,370
Race/Ethnicity | Central City/County (N = 5,690) | Suburban County #1 (N = 3,831) | Suburban County #2 (N = 2,581) | 3 Rural Counties (N = 258)
Black, not Hispanic | 65% | 57% | 35% | 36%
White, not Hispanic | 16% | 22% | 37% | 61%
Hispanic | 7% | 23% | 2%
Asian/Pacific Islander | <1% | 1% | 4%
Mixed Race/Other/Unknown | 3% | 0%
43
Format #2: Proportion of Living HIV Cases, by Race/Ethnicity, Sex, and Mode of Transmission, EMA, 2017 (N=7,243)
44
Format #3: Race/Ethnicity of People Living with HIV in the TGA, 2017
Race/Ethnicity of People Living with HIV in the TGA, December 2017
45
Format #4: People Living with HIV Disease in the EMA in 2017
A total of 5,828 people are living with HIV disease:
47% Black
38% White
12% Hispanic/Latino
60% … years old
49% MSM
28% White MSM
Out of 10 people living with HIV disease, approximately:
5 are Black
4 are White
1 is Hispanic/Latino
8 are male
6 are … years old
5 are MSM
3 are White MSM
46
Format #5: Trends in New Diagnoses
Sample Epi Profile Chart: 5-Year Trends in Newly Diagnosed HIV Cases by Year and Gender Identity, Washington, DC
Trend chart with data labels showing percentages of newly diagnosed PLWH by year and by gender identity, plus the total number of PLWH newly diagnosed each year. It may seem more complicated, but it can be used each year.
Source: Annual Epidemiology and Surveillance Report: Data through 2016, Washington, DC
47
Situation B: Sample Formats for Needs Assessment Data
The text slides, charts, and tables presented here come from needs assessment data reports – based on PLWH and provider surveys, sometimes supplemented by focus groups.
48
Format #1: Mental Health & Psychosocial Services
Uses: Mental health services are psychological and psychiatric treatment and counseling services offered to individuals with a diagnosed mental illness, conducted by a licensed mental health professional; psychosocial services include support and counseling services, such as support groups, that may be provided by peers or other non-licensed personnel (a support service)
Need/gap: 45% of consumers responding to the survey reported “depression or other mental health issues” in the past year; PLWH indicated a similar level of need for mental health services and “peer support/support groups”:
63% of consumers surveyed reported a need for mental health services, 52% received such services over the prior year, and 11% of those who needed services did not receive them
60% reported a need for peer support/support groups, 46% received them, and 14% of those needing them did not; there is concern that some of these services may soon end
Satisfaction: 72% of mental health clients and 65% of peer support/support group clients were “very satisfied”; satisfaction with both services was somewhat higher in the central city than in other parts of the TGA
49
Format #2: Sample Needs Assessment Data: PLWH Survey
Co-Occurring Conditions Reported by PLWH, by Gender Identity (Percent)
The most frequently reported co-occurring condition PLWH dealt with over the past year is depression and other mental health issues – reported more often by transgender and female PLWH than by men
50
Format #3: Type of Medical Insurance, 2017 (N=3,657)
In 2014, new insurance categories were added to the Ryan White Services Report (RSR) as a result of the ACA. The majority of clients had Medicaid or no insurance at all.
51
Format #4: Medical-Related Services Needed & Needed but Not Received by PLWH (Percent), Past 12 Months
To determine the percent who received a service, subtract Needed/Not Received from Total Needed.
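A worked example of that subtraction, using hypothetical figures for a single service category:

```python
# Hypothetical figures for one service category, in percent of PLWH surveyed.
total_needed = 63          # % who reported needing the service
needed_not_received = 11   # % who needed it but did not receive it

# Percent who received the service = Total Needed minus Needed/Not Received.
percent_received = total_needed - needed_not_received
print(f"Percent who received the service: {percent_received}%")  # 52%
```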
52
Format #5: Most Important Services to Help PLWH Stay in Care, As Reported by PLWH & Medical Case Managers
Service | PLWH Ranking | PLWH [N=265] | MCM Ranking | MCM [N=20]
Case Management | 1 | 216 | 19
Oral Health Services | 2 | 144 | 9
EFA - Rental Assistance | 3 | 123 | 7 | 12
EFA - Utilities | 4 | 111 | 10 | 8
Medical Care | 5 | 104 | 18
Medications (ADAP) | 6 | 103 | 17
Support Groups | 73 | 13
Groceries/Hot Meals | 70 | 11
Transportation | 15
Mental Health Services | 60 | 14
53
Situation C: Service Expenditures and Utilization Data
The text slides, charts, and tables presented here all involve service expenditure, cost-per-client, and utilization data provided by the recipient.
54
Framework #1: Targeted and Actual Service Utilization by Service Category, 2017
Service Category | Targeted | Actual | % Attainment
OAHS | 1,437 | 1,608 | 112%
Medical Case Management | 1,064 | 2,098 | 197%
Oral Health | 386 | 302 | 78%
Drug Assistance | 339 | 486 | 143%
Substance Abuse Treatment | 62 | 188 | 303%
Mental Health | 326 | 906 | 278%
Medical Nutrition Therapy | 137 | 126 | 92%
EFA - Food Vouchers | 279 | 293 | 105%
Home Delivered Meals | 215 | 198 |
Medical Transportation | 250 | 364 | 146%
Linguistic Services | 210 | 212 | 101%
Professional Services - Legal | 60 | 34 | 57%
Health Insurance Assistance | 132 | 98 | 74%
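The % Attainment column appears to be actual clients served divided by the targeted number; a quick check against the OAHS row above confirms the figure:

```python
# Targeted and actual clients served, from the OAHS row in the table above.
targeted, actual = 1437, 1608

# % Attainment = actual clients served / targeted clients served.
attainment = actual / targeted
print(f"% Attainment: {attainment:.0%}")  # 112%, matching the table
```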
55
Framework #2: Most Used Service Categories, 2017
Rank | Service Category | Percentage
1 | Outpatient Ambulatory Medical Care | 79%
2 | Medical Case Management | 73%
3 | Oral Health Care | 29%
4 | Medical Transportation | 23%
5 | Health Insurance Assistance | 18%
6 | Mental Health | 17%
7 | Food Bank/Home Delivered Meals | 15%
8 | Emergency Financial Assistance | 14%
9 | Medical Nutrition Therapy | 8%
10 | Substance Abuse Treatment | 5%
56
Framework #3: Expenditures, Clients Served, and Costs per Client for Support Services, 2017
Support Service Category | Expended | # Clients Served | Part A Cost per Client
Child Care | $2,170 | 27 | $80
Emergency Financial Assistance | $168,197 | 323 | $521
Food Bank/Home Delivered Meals | $215,252 | 337 | $639
Legal Services | $68,066 | 30 | $2,269
Linguistic Services | $14,376 | 73 | $197
Medical Transportation Services | $38,676 | 805 | $48
Psychosocial Support Services | $57,494 | 26 | $2,211
Non-Medical Case Management | $286,328 | 348 | $823
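Part A cost per client in this table works out to expenditures divided by the number of clients served; for example, using the Legal Services row above:

```python
# Expenditures and clients served for Legal Services, from the table above.
expended, clients_served = 68066, 30

# Cost per client = expenditures / clients served.
cost_per_client = expended / clients_served
print(f"Part A cost per client: ${cost_per_client:,.0f}")  # about $2,269
```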
57
Framework #4: Most Used Services for Four Special Populations, 2017
MSM of Color (n=934) | # | %
Outpatient/Ambulatory Health Services | 652 | 70%
Medical Case Management | 594 | 64%
Early Intervention Services | 148 | 16%
Mental Health Services | 132 | 14%
Medical Transportation | 215 | 23%

Recently Diagnosed (n=397) | # | %
Outpatient/Ambulatory Health Services | 266 | 67%
Medical Case Management | 241 | 61%
Early Intervention Services | 82 | 21%
Mental Health Services | 60 | 15%
Medical Transportation | 87 | 22%

Receiving Substance Abuse Treatment Services (n=354) | # | %
Outpatient/Ambulatory Health Services | 326 | 92%
Medical Case Management | 259 | 73%
Mental Health Services | 195 | 55%
Oral Health Care | 99 | 28%
Medical Transportation

Latina (n=98) | # | %
Medical Case Management | 74 | 76%
Outpatient Ambulatory Health Services | 65 | 66%
Outreach | 18 | 18%
Food Bank/Home Del Meals | 17 | 17%
Medical Transportation | 16 | 16%
58
Framework #5: Percent of Funds Expended in Each Core Medical Services Category, 2017
One service category was overspent. Two were underspent by more than 20%.