1
Is It Worth the Cost? The Use of a Survey Invitation Letter to Increase Response to an Email Survey. Brian Robertson, PhD, VP Research; John Charles, MS, Research Analyst; Mark Noyes, MPH, Research Analyst. AAPOR Conference, May 18, 2018, Denver, CO
2
Acknowledgement We would like to thank the Massachusetts Health Connector Authority for allowing us to design and implement this experiment and to present our findings.
3
The push-to-web design
Survey background
Research goals
Our experiment
Our results
Costs
Conclusions
4
The push-to-web methodology
Push-to-web involves contacting consumers by mail to request an online response to a survey. Push-to-web survey designs are used when email addresses are unavailable. Previous research found that including a cash incentive with a survey invitation letter had a significant impact on response rates (Messer and Dillman, 2011). But is it worth the cost when email addresses are readily available? There is little research on whether invitation letters are useful when a reliable list of email addresses is available.
5
MA Health Connector Survey background
The Massachusetts Health Connector Authority is the governing body for the state's health insurance exchange. Market Decisions Research (MDR) has worked with the Health Connector since 2010, surveying current enrollees as well as dis-enrollees. The survey asks about reasons for enrolling, experience with enrollment, any problems experienced, past insurance coverage, health care access and barriers, costs, and demographics. The prior surveys relied on two methodologies over the years: telephone and mail (2010–2014), and an email/web survey with telephone follow-up (2015–2016) that used an invitation letter without an incentive plus email invitations and reminders.
6
Changing the Data Collection Methodology
When data collection used the online/telephone methodology, we found that more than three-quarters of the surveys were completed online. Due to this previous success, data collection moved to a web-only methodology in 2017: Send an invitation letter to respondents as in prior administrations. New to the data collection: Offer a token incentive ($2) to respondents. Include the incentive in the invitation letter. Following the letter, send an email invitation along with two reminder emails.
7
Research goals An opportunity:
Would a letter and token incentive increase survey response and improve data quality? First, would it increase the response rate among all respondents? And would it increase the response rate among those groups typically less likely to respond (dis-enrollees, younger respondents, males)? Second, would it improve data quality? Would respondents more closely match the actual geodemographic profile of the target population? But also consider: What are the cost implications? Would improvements in response and data quality be worth the additional cost?
8
Experimental design The sample was separated into two experimental groups and two control groups. Both the experimental and control conditions included a group of current enrollees and a group of dis-enrollees (those who did not retain their insurance coverage). The two control groups were sent an email invitation and two reminder emails. The two experimental groups were first sent an invitation letter with a $2 token incentive, followed by an email invitation and two reminder emails.
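The slides do not describe how sample members were allocated to the two conditions. As an illustration only, the sketch below shows one way a stratified random split by enrollment status could be done; the function name, the member IDs, and the 60/40 split are assumptions made for this example, not the study's documented procedure.

```python
import random

# Illustration only: the presentation does not describe the assignment procedure
# or the split ratio. This sketches a random split of each enrollment stratum
# (current enrollees vs. dis-enrollees) into an experimental and a control group.
def assign_groups(members, experimental_share, seed=2017):
    """Randomly split one stratum into (experimental, control) lists."""
    rng = random.Random(seed)
    shuffled = list(members)
    rng.shuffle(shuffled)
    cut = int(len(shuffled) * experimental_share)
    return shuffled[:cut], shuffled[cut:]

# Hypothetical member IDs for each stratum
current_enrollees = [f"enrollee_{i}" for i in range(1000)]
dis_enrollees = [f"disenrollee_{i}" for i in range(400)]

exp_current, ctl_current = assign_groups(current_enrollees, experimental_share=0.6)
exp_dis, ctl_dis = assign_groups(dis_enrollees, experimental_share=0.6)
```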
9
Survey response

Group                                        Sample Size   Completed Surveys
Experimental: Mail invitation and 3 emails   6,768         923
Control: 3 emails                            4,200         273
Total                                        10,968        1,197

It required a median of 15 minutes for a respondent to complete the online survey.
10
Our results
The letter and token incentive did:
Increase overall survey response. Increase response rates among current enrollees and dis-enrollees. Increase response among both male respondents and younger respondents.
But…
The greatest increase in response rates was among female respondents and those 55 and older. These groups also had the highest response rates in the control group.
11
Survey response rates

                                             Total   Current Enrollees   Dis-enrollees
Experimental: Mail invitation and 3 emails   13.6%   16.2%               7.4%
Control: 3 emails                            6.5%    7.1%                4.6%
Percentage point increase                    7.1     9.1                 2.8
Percent increase                             110%    129%                60%
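As a worked check on the arithmetic, the sketch below reproduces the Total column of the table above from the counts on the earlier "Survey response" slide (enrollee-level counts are not shown there, so only the Total column is reproduced); Python is used purely for illustration.

```python
# Reproduce the "Total" column of the response-rate table from the raw counts.
exp_completes, exp_sample = 923, 6768
ctl_completes, ctl_sample = 273, 4200

exp_rate = exp_completes / exp_sample                 # ~13.6%
ctl_rate = ctl_completes / ctl_sample                 # 6.5%
point_increase = (exp_rate - ctl_rate) * 100          # ~7.1 percentage points
percent_increase = (exp_rate - ctl_rate) / ctl_rate   # ~110% relative increase

print(f"Experimental: {exp_rate:.1%}  Control: {ctl_rate:.1%}")
print(f"Increase: {point_increase:.1f} points ({percent_increase:.0%})")
```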
12
Response over time
13
By gender
14
By age group
15
By area of the state
16
By receipt of subsidy
17
Data quality
Did the experimental group do a better job of matching the geodemographic profile of the actual population? The experimental groups were more representative than the control group in terms of the age distribution and whether a subsidy was provided to help pay the cost of premiums. But they were less representative when looking at gender and geography.
18
By gender
19
By age group
20
By area of the state
21
By receipt of subsidy
22
Absolute difference in representativeness

          Experimental   Control
Age       32%            43%
Gender    10%            0%
Region    15%            11%
Subsidy   30%

The results show that age and subsidy status were better represented in the experimental group, while gender and region were better represented in the control group. Each number represents the absolute value of the difference between the group and the actual population percentage, summed across all categories. The smaller the number, the closer the group is to the distribution of the population.
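The metric just described can be made explicit in a short sketch: for each category, take the absolute difference between the respondent share and the population share, then sum across categories. The age categories and percentages below are hypothetical placeholders for illustration; only the formula reflects the slide.

```python
# Sum of absolute differences between a respondent distribution and the
# population distribution, as described above. The shares below are
# hypothetical placeholders, not the survey's actual figures.
def absolute_difference(sample_pct, population_pct):
    """Sum of |sample share - population share| across categories, in points."""
    return sum(abs(sample_pct[c] - population_pct[c]) for c in population_pct)

# Hypothetical age distributions (percent of respondents vs. percent of population)
sample_age = {"18-34": 15, "35-54": 35, "55+": 50}
population_age = {"18-34": 30, "35-54": 40, "55+": 30}

print(absolute_difference(sample_age, population_age))  # 40 -> larger = less representative
```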
23
Sending the letter and token incentive significantly increased the cost per completed survey. The cost components compared for the experimental and control groups were: online survey costs, sample management, printing, postage, and incentives. These are the costs directly associated with data collection.
24
Overall, the cost per completed survey was nearly eight times higher among those sent the invitation letter.
                                               Control     Experimental
Printing and mail preparation                  $0.00       $5,947.50
Postage charges                                            $2,630.74
Incentives                                                 $13,050.00
Total cost of invitation letter                            $21,628.24
Cost per letter mailed                                     $3.31
Invitation letter costs per completed survey               $23.43
Online survey cost per completed survey        $1.25       $1.25
Sample management costs                        $608.00     $992.00
Total cost per completed survey                $3.48       $25.76
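To make the arithmetic behind the bottom row explicit, the short sketch below rebuilds the experimental group's total cost per completed survey from the component costs in the table; all figures are taken from the table above.

```python
# Rebuild the experimental group's cost per completed survey from the
# component costs in the table above.
printing = 5947.50
postage = 2630.74
incentives = 13050.00
sample_management = 992.00
online_cost_per_complete = 1.25
completes = 923

letter_total = printing + postage + incentives              # $21,628.24
letter_per_complete = letter_total / completes              # ~$23.43
total_per_complete = (letter_per_complete
                      + online_cost_per_complete
                      + sample_management / completes)      # ~$25.76

print(f"Letter cost per complete: ${letter_per_complete:.2f}")
print(f"Total cost per complete:  ${total_per_complete:.2f}")
```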
25
Conclusions A survey invitation letter did significantly increase survey response rates overall and among groups traditionally less likely to respond. However, the largest increase in response rates was among those already most likely to respond, and the increase came at a notable expense. The survey invitation letter did not meaningfully improve how well respondents matched the geodemographic profile of the population. With an accurate email list, these results suggest that increasing the sample size and accepting a lower response rate may be a reasonable (and less expensive) alternative.
26
Is It Worth the Cost? The Use of a Survey Invitation Letter to Increase Response to an Email Survey. Brian Robertson, PhD, VP Research; John Charles, MS, Research Analyst; Mark Noyes, MPH, Research Analyst. AAPOR Conference, May 18, 2018, Denver, CO