Project ADRCE_001: Experiences, reflections and using linked census–survey data to monitor survey non-response
Jamie C. Moore, Peter W.F. Smith, Gabrielle B. Durrant
ADRCE, University of Southampton
Our research focus
We are interested in the impacts of survey non-response on dataset quality, in particular non-response (NR) bias: the deviation of an estimate from its population value due to differences between respondents and non-respondents. How can NR biases be assessed? Survey response rates have been found to be an inaccurate indirect measure, and otherwise the lack of information on non-respondents is an issue.
Assessing NR biases using census data as a sample information source
We link 2011 Census attribute information on individuals in UK social survey samples to their survey responses. Bias risks are quantified in terms of the variation in response propensities estimated from the attribute covariates (sample representativeness), using the Coefficient of Variation of response propensities: CV = SD of estimated propensities / mean response propensity. Surveys involve repeated interview attempts, so we also monitor the CV over the call record to investigate whether reducing the number of attempts would affect dataset quality. Here, we present results from the Labour Force Survey (LFS).
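The CV measure above can be sketched in a few lines. This is a minimal illustration, not the study's implementation: the propensity values below are made up, whereas in the study they would come from a model of response on the linked census attributes.

```python
import numpy as np

# Hypothetical estimated response propensities for eight sample members
# (illustrative values only; in practice these come from a fitted
# response-propensity model on linked census covariates).
propensities = np.array([0.55, 0.70, 0.62, 0.81, 0.48, 0.73, 0.66, 0.59])

# Coefficient of Variation of response propensities:
# CV = SD of the estimated propensities / their mean (the response rate).
cv = propensities.std() / propensities.mean()
print(round(cv, 3))  # → 0.153
```

A lower CV indicates more even response propensities across the sample, i.e. a more representative respondent set and lower NR bias risk.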
CVs & phase capacity (PC) points
Final response rate = 68.3%. Response propensity model: Sex + Age + Highest Qualification + Activity last week + Tenure. The PC point is the first call at which the CV is within a set threshold of its call-record minimum; PC points occur earlier when the threshold is increased.
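The phase capacity rule can be sketched as follows. This is an assumption-laden illustration: the `pc_point` helper and the CV trajectory are invented for the sketch, not taken from the study.

```python
import numpy as np

def pc_point(cv_by_call, threshold):
    """Return the 1-indexed number of the first call at which the CV
    falls within `threshold` of the call-record minimum CV."""
    cv_by_call = np.asarray(cv_by_call)
    target = cv_by_call.min() + threshold
    return int(np.argmax(cv_by_call <= target)) + 1

# Hypothetical CV trajectory over repeated interview attempts:
# the CV falls steeply at first, then flattens out.
cvs = [0.40, 0.31, 0.26, 0.23, 0.215, 0.208, 0.205, 0.204]

print(pc_point(cvs, threshold=0.01))  # → 6
print(pc_point(cvs, threshold=0.03))  # → 4 (larger threshold, earlier point)
```

Calls after the PC point change the CV little, which is the basis for the argument that later interview attempts could be dropped with limited effect on dataset quality.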
Survey variables
To explain how the 'per cent biases' presented in this slide are calculated: each is ((proportion of respondents in a covariate category at call n − proportion in that category at the final call) / proportion in that category at the final call). Conceivably, late in the call record (after call 10 or so) one could infer from minimal bias values that a sample member in covariate category x was interviewed at a given call (at that point, call-to-call changes in the numbers of interviews per category are mostly fewer than 10). However, no raw proportions are presented, so such identification would not be exact; the individual in question would also need to know that there had been x − 1 failed interview attempts (recall that interviews are face to face); and call records are at the household level, so translating them to the individual level required assumptions (e.g. about interviews spread over multiple calls) that are probably often wrong. Moreover, without other information appended, knowing that a sample member in category x responded at the nth interview attempt is arguably not disclosive in itself. The CV PC point is call 6: conservative, but still a 7.6% call saving!
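The per cent bias formula above is simple enough to show directly. The numbers here are hypothetical, chosen only to illustrate the calculation.

```python
def percent_bias(prop_call_n, prop_final):
    """Relative difference (in %) between the proportion of respondents
    in a covariate category at call n and at the final call."""
    return 100 * (prop_call_n - prop_final) / prop_final

# E.g. if 30% of respondents so far fall in a category that ends at 25%,
# that category is over-represented by 20% at this point in the record.
print(round(percent_bias(0.30, 0.25), 1))  # → 20.0
```

Per cent biases shrink towards zero by construction as the call record approaches the final call, which is why they are monitored alongside the CV.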
Experiences and reflections…
The application process.
- Endless form filling, but relatively painless in our case.
Obtaining the data.
- Risk aversion / the wheels of government turn slowly.
- Mutual benefits and reciprocity are potential lubricants.
- Take the lead yourself, but find and cherish your experts.
Working with the data.
- Work will be in a secure research environment.
- Awareness of the environment's constraints and efficient research practice are necessary.
- Communication with your support officers is the key!
Acknowledgements This research was funded by the ESRC National Centre for Research Methods, Work Package 1 (grant reference number ES/L008351/1) and the ESRC Administrative Data Research Centre for England (ADRCE) (grant reference number ES/L007517/1). This work contains statistical data from the ONS, which is Crown Copyright. The use of the ONS statistical data in this work does not imply the endorsement of the ONS in relation to the interpretation or analysis of the statistical data. This work uses research datasets which may not exactly reproduce National Statistics aggregates.