1
General population surveys on the web: new findings from the UK
Joel Williams
2
Context for the experiment
Community Life Survey: covers volunteering, charity donation, civic participation, neighbourhood perspectives, attitudes to diversity, etc.
Revival of a classically designed face-to-face probability sample survey … but the client has a 'digital default' philosophy
Commissioned an experiment to gather evidence for an online survey alternative
Recruited through a postal address sample frame rather than an access panel
Similar to many of Dillman's experiments in the US but larger scale
Parallel face-to-face interview sample for comparison
3
What did we want to find out from the experiment?
We designed an experiment to test online data collection as a method for generating the Community Life population estimates. It was designed to answer these questions:
What response rate can be achieved with this method of recruitment?
What difference do incentives make to the response rate?
What is the data quality like?
What is the profile of an online sample compared to a face-to-face interview sample?
Do the two samples give similar estimates?
What value is added by the inclusion of a postal questionnaire with reminder letter?
4
What did the experiment look like?
[Diagram: 3 contacts with a conditional incentive of £0, £5 or £10, OR 2 contacts with an unconditional incentive of £0 or £5; +1 contact.]
The experiment is the largest ever test of this method. We drew a random sample of approximately 6,700 addresses. We sent an invitation to do an online survey plus two reminders to non-responders. A random subset of non-responders received a postal questionnaire with the second reminder; they could fill this out instead of going online. We tested a mix of incentives; the results are shown on the next slide.
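A minimal sketch of the allocation described above, in Python. The slides do not give the allocation proportions or the exact mechanism, so the equal-probability arm split and the 50:50 postal subset below are assumptions for illustration only.

```python
import random

# Incentive arms described on the slide: conditional incentives run to 3 contacts,
# the unconditional incentive to 2 contacts. Allocation probabilities are assumed.
ARMS = [
    ("conditional", 0, 3),    # (incentive type, amount in £, number of contacts)
    ("conditional", 5, 3),
    ("conditional", 10, 3),
    ("unconditional", 0, 2),
    ("unconditional", 5, 2),
]

def allocate(addresses, postal_share=0.5):
    """Assign each sampled address to an incentive arm and flag the random
    subset that will be sent a postal questionnaire with the second reminder
    if it has not responded online by then."""
    plan = {}
    for addr in addresses:
        plan[addr] = {
            "arm": random.choice(ARMS),
            "postal_with_2nd_reminder": random.random() < postal_share,
        }
    return plan

sample = [f"address_{i}" for i in range(6700)]  # ~6,700 sampled addresses
allocation = allocate(sample)
```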
5
Incentive increases response rate but not to the level achieved with face-to-face interviews
[Chart: response rate with margin of error for WEB ONLY no incentive, WEB ONLY C£5 incentive, WEB ONLY C£10 incentive, WEB ONLY U£5 incentive, and FACE-TO-FACE C£5 incentive.]
This chart shows the expected response rate for an online-only design compared to the standard face-to-face interview design. The little black arrows show the margin of error for each response rate estimate; the long-run response rate for each design ought to be within the bounds of the arrow. There are four web-only columns, one for each incentive type. The highest response rate is 25%, some way short of the face-to-face interview response rate of 60%. Offering an incentive increases the response rate from 16% to a maximum of 25%. The cost of the unconditional £5 incentive (equal to a conditional incentive of around £20) makes it unattractive compared to the £10 conditional incentive.
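To make the margin of error and the incentive cost comparison concrete, here is a minimal sketch. The per-arm sample size used below is an assumption (the slides only give the total of roughly 6,700 addresses).

```python
import math

def response_rate_ci(responses, sampled, z=1.96):
    """Normal-approximation 95% confidence interval for a response rate."""
    p = responses / sampled
    moe = z * math.sqrt(p * (1 - p) / sampled)
    return p, p - moe, p + moe

# Assumed arm size of ~1,300 addresses, for illustration only.
p, lo, hi = response_rate_ci(responses=325, sampled=1300)
print(f"response rate {p:.0%}, 95% CI {lo:.0%} to {hi:.0%}")

# Why a £5 unconditional incentive costs about as much per respondent as a £20
# conditional one: the unconditional £5 is sent to every sampled address, but
# only around 25% of addresses respond.
cost_per_respondent_unconditional = 5 / 0.25   # ≈ £20
cost_per_respondent_conditional_10 = 10        # paid only to respondents
print(cost_per_respondent_unconditional, cost_per_respondent_conditional_10)
```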
6
Adding a postal questionnaire to the 2nd reminder substantially increases the response rate (but shorter questionnaire)
[Chart: response rates for WEB+POSTAL no incentive, WEB+POSTAL C£5 incentive, WEB+POSTAL C£10 incentive, WEB+POSTAL U£5 incentive, and FACE-TO-FACE C£5 incentive; visible data labels include 31, 35 and 39.]
Adding a postal questionnaire with the second reminder increases the response rate substantially. Even if no incentive is offered, the response rate is higher than an online-only design with a £10 incentive offered. Nevertheless, the highest response rate is still lower than the face-to-face interview response rate. The postal questionnaire has to be an edited version of the full questionnaire to avoid appearing too onerous a task. It includes only key questions so, effectively, the response rate is lower for the non-key questions included in the online questionnaire but excluded from the postal questionnaire.
7
What is the quality of the data using web designs?
Some positive results:
Little evidence of respondents speeding through the online questionnaire (see the sketch after this list)
Same level of care taken to complete the questions
Similar number of items selected from multi-code questions
Similar differentiation between questions using the same response scale
No (additional) bias towards items at the top of the response list
Some less positive results:
One quarter of respondents appear to be the wrong individual
Higher dropout than with F2F interviews, especially at the first few screens
More 'don't know' and 'refusal' answers (partly due to inevitably imperfect adaptation of interview questions)
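The slides do not say how 'speeding' was assessed; a common check, and purely an assumption here, is to flag completions far quicker than the median duration:

```python
from statistics import median

def flag_speeders(durations_secs, threshold=0.5):
    """Flag respondents whose total completion time is below a fraction
    of the median duration (a common, if crude, speeding check)."""
    cutoff = threshold * median(durations_secs)
    return [d < cutoff for d in durations_secs]

# Hypothetical completion times in seconds, for illustration only.
durations = [1250, 980, 1430, 400, 1100, 1620]
print(flag_speeders(durations))  # [False, False, False, True, False, False]
```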
8
What do we know about the demographic profile?
No gender bias in the web samples
(Expected) bias towards individuals who are… middle-aged, white, degree-educated, working, home-owning, married
Addition of the postal questionnaire only slightly improved the sample profile, and not in all respects
9
Despite the increase in response rate when adding the postal questionnaire, the age profile gets worse
[Chart: age profile (16-24, 25-34, 35-49, 50-64, 65-74, 75+) of the population, the face-to-face sample, the web-only sample and the web+postal sample.]
This slide shows how the age profile of the sample differs by survey design. First, the population profile can be compared with the face-to-face interview sample profile; even with a response rate of 60%, there is some bias towards the older age groups. The next bar shows the online-only age profile if a £10 incentive is offered. It has a slightly older age profile than the face-to-face interview design, albeit with fewer people aged 75+. The last bar shows the age profile when a postal questionnaire is included with the second reminder. Adding the postal questionnaire only seems to bring in more middle-aged and older people. This is a case where increasing the response rate by adding a postal questionnaire option makes the sample less representative.
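One way to put a number on 'less representative' is a dissimilarity index between a sample's age distribution and the population's. This is not a measure quoted in the slides, and the proportions below are hypothetical, but the sketch shows the idea:

```python
def dissimilarity_index(sample_props, population_props):
    """Half the sum of absolute differences between two distributions:
    the share of the sample that would have to change category to
    match the population profile."""
    return 0.5 * sum(abs(s - p) for s, p in zip(sample_props, population_props))

age_bands = ["16-24", "25-34", "35-49", "50-64", "65-74", "75+"]

# Hypothetical proportions for illustration only (not the survey's figures).
population = [0.15, 0.17, 0.26, 0.23, 0.11, 0.08]
web_postal = [0.08, 0.12, 0.26, 0.30, 0.16, 0.08]

print(dissimilarity_index(web_postal, population))  # 0.12
```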
10
How do the findings compare with findings from F2F interviews?
Small sample sizes for each design, so large margins of error around findings, even when borrowing power through use of main effects rather than a saturated model
Inclusion of a postal questionnaire with the second reminder does not make estimates more aligned, despite a much higher response rate
Minimal impact of demographic weighting of the data suggests a fairly weak relationship between response propensity and key measures (see the weighting sketch after this list)
Differences may remain due to (i) different modes of data collection or (ii) insufficient accounting for sample differences
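For readers unfamiliar with demographic weighting, the sketch below shows simple cell weighting (post-stratification). The actual weighting variables, targets and method used in the experiment are not given in the slides, so the cells below are hypothetical:

```python
from collections import Counter

def poststratification_weights(respondent_cells, population_shares):
    """Cell weighting: each respondent's weight is the population share of
    their demographic cell divided by the sample share of that cell."""
    n = len(respondent_cells)
    sample_shares = {cell: count / n for cell, count in Counter(respondent_cells).items()}
    return [population_shares[cell] / sample_shares[cell] for cell in respondent_cells]

# Hypothetical cells (age group x sex) and population shares, for illustration only.
cells = ["16-34 F", "16-34 F", "35-64 M", "65+ F", "35-64 M", "65+ M"]
population_shares = {"16-34 F": 0.15, "16-34 M": 0.15, "35-64 F": 0.25,
                     "35-64 M": 0.25, "65+ F": 0.11, "65+ M": 0.09}

weights = poststratification_weights(cells, population_shares)
# A weighted estimate is then sum(w * y) / sum(w) over respondents.
print(weights)
```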
11
Incentive and demographic weighting improve alignment of estimates, but many differences remain statistically significant [mean t-scores]
[Chart: mean t-score (difference from the interview estimate) for WEB ONLY no incentive, WEB ONLY C£10 incentive, WEB ONLY no incentive with demographic weights, and WEB ONLY C£10 incentive with demographic weights.]
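The slide reports mean t-scores but not how they are calculated. Below is a minimal sketch of an approximate t-score for the difference between a web estimate and the face-to-face estimate of a proportion; the real analysis may well differ (for example in how it handles weighting and design effects), and the figures are hypothetical.

```python
import math

def t_score_proportions(p_web, n_web, p_f2f, n_f2f):
    """Approximate t/z score for the difference between two independent
    proportion estimates (unweighted, no design-effect adjustment)."""
    se = math.sqrt(p_web * (1 - p_web) / n_web + p_f2f * (1 - p_f2f) / n_f2f)
    return (p_web - p_f2f) / se

# Hypothetical estimates and sample sizes, for illustration only.
t = t_score_proportions(p_web=0.42, n_web=300, p_f2f=0.35, n_f2f=1000)
print(t)  # ~2.2; |t| > 1.96 would be flagged as statistically significant.
# The slide's "mean t-score" presumably averages such scores across the
# key estimates being compared.
```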
12
In summary...
What response rate can be achieved with online-only data collection? A little over 15% unless an incentive is used.
What difference do incentives make to the response rate? A conditional incentive of £10 raises the response rate to 22%; an unconditional incentive can achieve a slightly higher rate, but the cost is prohibitive.
What is the data quality like? Generally good, although some doubt over random selection within the household.
What is the profile of an online sample compared to a F2F interview sample? More likely to be middle-aged, white, well educated, working, home-owning and married.
Do the two samples give similar estimates? Expect approximately 40% to be significantly different (some large, most small).
What value is added by the inclusion of a postal questionnaire? Very little; a higher response rate, but estimates no more aligned with F2F estimates.
13
What now?
The client has commissioned a much larger test of the web-only method.
A number of questions still to answer:
What is the most effective weighting strategy for an online-only survey?
Can we get a better understanding of the impact of data collection mode on the results?
What can we do to ensure random sampling of individuals at sampled addresses?
How should we help data users account for a change in design?
14
Thank you