Burden and Loss: The Role of Panel Survey Recordkeeping in Self-report Quality and Nonresponse
ITSEW 2010
Ryan Hubbard and Brad Edwards
Record Usage, Perceived Respondent Burden, and Measurement
- Record use may increase perceived respondent burden.
- Respondents experiencing higher levels of perceived burden may be more likely to drop out of the study.
- The regular use of records should improve event measurement and subsequent matching to administrative data.
- Record usage may disproportionately affect certain subsamples.
Background
Medicare Current Beneficiary Survey
- Longitudinal study for the Centers for Medicare and Medicaid Services (CMS), currently in Round 58
- 16,000 Medicare beneficiaries in a rotating panel design
- 12-round study, including baseline and exit interviews, over 4 years
- CAPI interview on health, health care utilization, and health care expenditures, designed to augment Medicare administrative data on events and payments
- Matches costs to health care events in an effort to enumerate payers and payments
- Relies on respondent collection and interviewer entry of medical insurance statements from multiple sources
- Asks respondents to voluntarily track health care utilization on a provided calendar
Modeling Error in MCBS Record Usage
Simplified Model
[Path diagram: Record Usage (Ratio of Event Costs Covered by a Statement; Event Calendar) → Perceived Burden → Non-response (Panel Attrition) and Measurement Error (Non-Reporting of Events in Admin Records; Event and Cost Data Differing from Admin Records); paths labeled Increase/Decrease]
Predicting Attrition in Later Waves

Main predictors:
- Ratio of costs covered by statement
- Ratio of rounds a calendar used
- MCBS panel

Other predictor variables in model:
- Average health care payments per round
- Any interviews conducted in Spanish
- Highest degree obtained
- SP ever on Medicaid
- Ever use a proxy
- Average experience of interviewer
- Same interviewer throughout study
- Household income < $25,000
- Married at some point during study
- Average interview length (min)
- Age when entered study (categorical)
- Eligible for VA benefits
- Average number of events per round
- Sex of sample person
- Average of self-reported health (1 = excellent; 5 = poor)

(A sketch of fitting such a model appears below.)
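A minimal sketch, not part of the presentation, of how a refusal-attrition model with these predictors might be fit as a logistic regression; the file name and column names (attrited, stmt_cost_ratio, calendar_ratio, mcbs_panel, and so on) are hypothetical stand-ins for the actual MCBS variables.

```python
# Hypothetical sketch of a refusal-attrition logistic regression using the
# record-usage predictors from the slide; file and column names are invented.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("mcbs_attrition_analysis.csv")  # hypothetical analysis file

model = smf.logit(
    "attrited ~ stmt_cost_ratio + calendar_ratio + C(mcbs_panel)"
    " + avg_self_reported_health + C(age_group) + spanish_interview",
    data=df,
)
result = model.fit()

print(result.summary())       # coefficients (B)
print(np.exp(result.params))  # odds ratios, Exp(B)
```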
Summary of Attrition Results
- Higher levels of statement usage reduce the likelihood of refusal-based attrition; this relationship is stronger than any other in the model.
- Greater use of the calendar, controlling for other factors, reduces the likelihood of refusal-based attrition.
- Statement usage (because it is relatively passive) and calendar usage (because it is voluntary and may improve interview flow) may actually discourage attrition through a reduction in perceived burden.
- Alternatively, the effects we see may represent unmeasured motivations such as topic salience and altruism: those invested in the study are more likely to comply with requests for record keeping and to complete all rounds of the study.
Simplified Model
[Path diagram, repeated: Record Usage (Ratio of Event Costs Covered by a Statement; Event Calendar) → Perceived Burden → Non-response (Panel Attrition) and Measurement Error (Non-Reporting of Events in Admin Records; Event and Cost Data Differing from Admin Records)]
Research Questions
- Does statement usage reduce event measurement error?
- Does statement usage reduce cost measurement error?
- Are there other factors related to the interview and the respondent that outweigh the effects of record usage?
Analytical Approach
- Analysis uses the two most recently completed panels in the study.
- Explores differences in reporting for respondents who provide insurance statement records vs. those who do not.
- Examines the role of other factors, such as demographics and paradata, in this process.
- Final goal is to examine match rates for the two groups with regard to:
  - Event reporting
  - Cost reporting

(A sketch of this group comparison follows below.)
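A minimal sketch, not from the presentation, of what that group comparison could look like with a per-event analysis file; the file name and columns (provided_statements, source) are invented for illustration.

```python
# Hypothetical sketch: compare administrative-match rates for respondents who
# provided insurance statements vs. those who did not.  Names are invented.
import pandas as pd

events = pd.read_csv("combined_events.csv")  # one row per reported event

# An event counts as matched if it appears in both survey and admin data.
events["matched_to_admin"] = events["source"].eq("both")

match_rates = (
    events.groupby("provided_statements")["matched_to_admin"]
    .mean()
    .rename("match_rate")
)
print(match_rates)
```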
Preliminary Analyses and Findings
Differences in Event Reporting: Full Sample

Measure | With Statements | No Statements
N | 4,597 | 2,964
Average Number of Events | 6.3 | 4.3
Total Number of Providers | 16.9 | 7.8
Total Number of Costs | 114.5 | 32.6
Total No Statement Costs | 60.8 | 32.6

OLS: Average # of Events (N = 5,690)
Predictor | B | Beta
Provided Statements | 4.13 | .26
Average Self Reported Health | 1.66 | .27
R² = .14

(This fitted model is written out as an equation below.)
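Written out, the B coefficients in the OLS panel correspond to a fitted equation of roughly the following form; the intercept is not reported on the slide, so it appears as b_0.

```latex
\widehat{\text{AvgEvents}} = b_0 + 4.13\,(\text{Provided Statements}) + 1.66\,(\text{Avg Self-Reported Health})
```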
Full Sample: Full Regression

OLS: Average # of Events (N = 5,266)
Predictor | B | Beta
Provided Statements | 2.62 | .17
Average Self Reported Health | 1.66 | .34
Used Calendar | 4.88 | .27
Highest Degree Obtained | 3.08 | .11
R² = .24

OLS: Average # of Events (N = 5,266)
Predictor | B | Beta
Ratio of Costs Covered by Statement | 111.37 | .44
Average Self Reported Health | 16.80 | .28
Used Calendar | 32.66 | .18
Highest Degree Obtained | 2.83 | .10
R² = .36
Differences in Event Reporting: High Event Sample (50+ Events)

Measure | With Statements | No Statements
N | 2,964 | 198
Average Number of Events | 10.7 | 8.2
Total Number of Providers | 21.0 | 13.9
Total Number of Costs | 152.7 | 77.2
Total No Statement Costs | 76.7 | 76.9

OLS: Average # of Events (N = 3,161)
Predictor | B | Beta
Provided Statements | 3.08 | .12
Average Self Reported Health | 1.58 | .25
R² = .07
High Event Sample: Full Regression

OLS: Average # of Events (N = 2,958)
Predictor | B | Beta
Provided Statements | 2.16 | .09
Average Self Reported Health | 2.01 | .33
Used Calendar | 3.80 | .18
Highest Degree Obtained | .273 | .10
R² = .12

OLS: Average # of Events (N = 2,958)
Predictor | B | Beta
Ratio of Costs Covered by Statement | 92.70 | .34
Average Self Reported Health | 18.97 | .31
Used Calendar | 26.31 | .13
Highest Degree Obtained | 3.33 | .12
R² = .21
Modeling the Match
Forthcoming Data
- Events reported through survey data are matched with CMS administrative data (when available).
- Events are combined to form a comprehensive reporting of medical events for each sample person.
- A complex algorithm is used for this process, resulting in a source code for each event (a simplified sketch follows this list).
- Source code (survey, admin, or both), event type, and event date can be matched back to the sample person to help assess measurement error.
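Purely as an illustration, and not the actual MCBS algorithm, a simplified source-code assignment could look like the following; the function and field names are hypothetical.

```python
# Hypothetical sketch of assigning a source code to each combined event.
# The real MCBS matching algorithm is far more complex than this.
def source_code(in_survey: bool, in_admin: bool) -> str:
    """Label an event by the data source(s) it appears in."""
    if in_survey and in_admin:
        return "both"
    if in_survey:
        return "survey"
    if in_admin:
        return "admin"
    raise ValueError("an event must appear in at least one source")

# Example: an event reported in the interview and also found in CMS claims.
print(source_code(in_survey=True, in_admin=True))  # -> both
```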
History of Matching on the MCBS
- Goal of match: produce data more complete than either survey or billing records (Eppig and Chulis, 1996).
- Administrative records are not the gold standard but provide good evidence:
  - That a health service occurred
  - That Medicare was a payer
  - The amount paid by Medicare
- Survey data provide:
  - A good record of out-of-pocket and third-party payments
  - A record of events/services not covered by or submitted to Medicare
Matching Details from an Earlier Panel
- 2008 match data not yet available.
- For all medical practitioner, inpatient, and outpatient events in 2006, a raw match indicates:
  - 30% of events come from survey data only
  - 43% of events come from administrative data only
  - 27% of events match (can be found in both)
- By event type:
  - Inpatient events: 55% match
  - Medical practitioner events: 26% match
  - Outpatient events: 26% match
  - Dental visits: 0.4% match
- Not a 1-to-1 correspondence

(A sketch of tabulating these shares follows below.)
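A minimal sketch, not from the presentation, of how raw match shares like these could be tabulated from per-event source codes; the records shown are invented for illustration.

```python
# Hypothetical sketch: tabulate the share of combined events that are
# survey-only, admin-only, or found in both sources.
from collections import Counter

# One dict per combined event; in practice these would come from the match file.
events = [
    {"event_type": "inpatient", "source": "both"},
    {"event_type": "dental", "source": "survey"},
    {"event_type": "outpatient", "source": "admin"},
]

counts = Counter(e["source"] for e in events)
total = sum(counts.values())
for source, n in sorted(counts.items()):
    print(f"{source}: {n / total:.0%}")
```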
Workshop Issues
- Given the rich administrative data that will be available, we are interested in advice regarding the best analytical approach for assessing measurement error in:
  - Event reporting
  - Event detail reporting
  - Cost reporting
- How do we best utilize administrative data that are a good source for matching only a subset of survey events?
Ryan A. Hubbard, RyanHubbard@westat.com
Brad Edwards, BradEdwards@westat.com
Logistic Regression Results: Full Model (N = 5,193)

Variable | Standardized Coeff. [B] | Odds Ratio [Exp(B)]
Ratio of costs covered by statement | -3.17 | .04
Ratio of rounds calendar used | -.39 | .68
Ever use a proxy | -.47 | .63
SP ever on Medicaid | -.62 | .54
Average of self reported health | .11 | 1.12
Eligible for VA benefits | -.57 | .57
Age when entered study (categorical) | .09 | 1.05
Any interviews conducted in Spanish | -.56 | .57
MCBS panel | .27 | 1.31
Same interviewer throughout study | |
Average interview length (min) | -.02 | .98
Average number of events per round | .17 | 1.18
Average years of interviewer experience | -.05 | .95

(The relationship between B and the odds ratio is shown below.)
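For reference (this note is not on the slide), each odds ratio is simply the exponentiated coefficient; for the statement-coverage ratio, for example:

```latex
\mathrm{OR} = e^{B}, \qquad e^{-3.17} \approx 0.04
```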