The Challenge of Non-Response in Surveys

The Overall Response Rate
– The number of complete interviews divided by the number of eligible units in the sample (a worked example follows this slide)
– "Complete interviews" here counts complete plus partial interviews
– A measure of the results of all efforts to execute a study

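A minimal worked example of the calculation, with hypothetical counts:

```python
# Hypothetical counts from a survey's final disposition file.
complete = 812          # fully completed interviews
partial = 64            # partial interviews, counted with completes
eligible_units = 1200   # eligible units in the sample

# Overall response rate: (complete + partial) / eligible units.
response_rate = (complete + partial) / eligible_units
print(f"Overall response rate: {response_rate:.1%}")  # 73.0%
```
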
Eligibility of Sample Units
– Known eligibility
– Unknown cases: use the best available scientific information to estimate how many of them are eligible (a sketch follows this slide)
– Reasons a case's eligibility may be unknown
  – Change of address
  – Death
  – Sickness
  – Senility
  – Old age

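One widely used convention for this estimate (AAPOR's RR3 formula, assumed here as the approach) derives an eligibility rate e from the cases whose status is known and applies it to the unknown cases:

```python
# Hypothetical disposition counts.
known_eligible = 1100     # cases confirmed eligible
known_ineligible = 300    # cases confirmed ineligible
unknown = 200             # cases whose eligibility could not be determined
interviews = 800          # complete plus partial interviews

# Estimate the eligibility rate e from the known cases, then apply it
# to the unknown cases (the AAPOR RR3 convention).
e = known_eligible / (known_eligible + known_ineligible)
estimated_eligible = known_eligible + e * unknown
print(f"e = {e:.3f}, response rate = {interviews / estimated_eligible:.1%}")
```
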
Bias
– Significant nonresponse is associated with a higher likelihood of bias
– Formula for bias (reproduced below)

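The slide points to a formula without reproducing it. A standard textbook form for the bias of the respondent mean under unit nonresponse is:

```latex
% Bias of the respondent mean under unit nonresponse:
%   m/n = nonresponse rate (m nonrespondents out of n sampled units)
%   \bar{y}_r, \bar{y}_m = means for respondents and nonrespondents
\mathrm{Bias}(\bar{y}_r) = \frac{m}{n}\left(\bar{y}_r - \bar{y}_m\right)
```

Both factors matter: bias is large only when nonresponse is substantial and respondents actually differ from nonrespondents, which is why the combinations on the next slide are possible.
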
Sources of Bias
– Two possible sources of nonresponse
  – Item: individual questions left unanswered
  – Unit: the entire questionnaire is missing
– Item nonresponse can produce more bias than unit nonresponse
– Surveys with high nonresponse can have low bias, and surveys with low nonresponse can have high bias: per the formula above, bias depends on the respondent-nonrespondent difference, not on the nonresponse rate alone

Addressing Bias
– Differences between nonresponders and responders do not necessarily measure bias
  – Large enough sample sizes will produce statistically significant differences even at unimportant levels of bias
  – Need to decide how much bias makes a practical difference

Estimating Bias from Unit Nonresponse
– Conduct a follow-up survey with nonresponders
– Study call records for surrogates of nonresponse
  – Converted refusals
  – Difficult-to-contact cases
– Study the last 5% of respondents: with less effort, they would have been noncontacts (sketched after this list)
– Bias can be related to the amount of effort required
– Compare against administrative data
– Compare against other survey results

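A minimal sketch of the level-of-effort comparison, treating hard-to-reach (late, high-effort) respondents as surrogates for noncontacts; all numbers are hypothetical:

```python
# Hypothetical respondent records: (estimate of interest, contact attempts).
respondents = [(52.0, 1), (48.0, 2), (55.0, 1), (61.0, 7), (63.0, 9),
               (50.0, 2), (58.0, 8), (47.0, 1), (60.0, 6), (49.0, 3)]

# Split into early responders and the hard-to-reach tail (here, >5 attempts),
# who serve as surrogates for the noncontacts.
early = [y for y, tries in respondents if tries <= 5]
late = [y for y, tries in respondents if tries > 5]

mean = lambda xs: sum(xs) / len(xs)
print(f"early mean: {mean(early):.1f}, late (surrogate) mean: {mean(late):.1f}")
# A large gap suggests noncontacts may differ from responders, i.e.,
# that effort-related nonresponse bias is plausible.
```
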
Comparison to Other Survey Results
– Are the target populations actually the same?
– Were the questions and response options worded identically?
– Were the questions asked in similar contexts?
– Did the surveys use the same mode of data collection?
– Were the surveys conducted within the same time frame?

Attrition in Longitudinal Surveys
– Nonresponse has increased in U.S. government household surveys
  – Example: Survey of Income and Program Participation, whose dropout rate rose from 23% in 1992 to 29% in 2001
– Dropouts may result in biased survey estimates
– Evidence that this happens is mixed across several U.S. studies

Nonresponse Analysis
– Need to consider how well nonresponse can be modeled (a sketch follows this slide)
– And how well the estimate of interest can be predicted

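As a sketch of what modeling nonresponse can look like in practice, the following fits a response-propensity model on frame variables known for every sampled unit; the data and variables are hypothetical, and scikit-learn is assumed to be available:

```python
from sklearn.linear_model import LogisticRegression

# Hypothetical frame variables known for every sampled unit:
# [age_of_householder, urban (1/0)]; responded is the observed outcome.
X = [[34, 1], [71, 0], [45, 1], [29, 1], [62, 0], [55, 0], [38, 1], [68, 0]]
responded = [0, 1, 1, 0, 1, 1, 0, 1]

# Model the probability of response from the frame variables.
model = LogisticRegression().fit(X, responded)
propensities = model.predict_proba(X)[:, 1]

# Inverse-propensity weights for the responders; how well the model
# predicts response (and the estimate of interest) governs how much
# the adjustment can reduce bias.
weights = [1 / p for p, r in zip(propensities, responded) if r == 1]
print([round(w, 2) for w in weights])
```
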
Adjusting for Bias
– Collect variables related to nonresponse
  – Call history
– Make post-survey adjustments
  – Weighting
  – Raking: adjusting weights so the sample matches known population values (sketched below)

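A minimal sketch of raking as iterative proportional fitting over two margins, with hypothetical sample counts and known population totals:

```python
# Hypothetical 2x2 sample counts: rows = sex, columns = age group.
counts = [[120.0, 80.0],
          [60.0, 140.0]]
row_targets = [220.0, 180.0]   # known population totals for sex
col_targets = [190.0, 210.0]   # known population totals for age group

# Iterative proportional fitting: alternately scale rows and columns
# until the table matches both sets of known margins.
for _ in range(50):
    for i, target in enumerate(row_targets):
        s = sum(counts[i])
        counts[i] = [c * target / s for c in counts[i]]
    for j, target in enumerate(col_targets):
        s = sum(row[j] for row in counts)
        for row in counts:
            row[j] *= target / s

print([[round(c, 1) for c in row] for row in counts])
# Each cell's weight adjustment is the fitted count divided by the raw count.
```
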
Procedures to Maximize Response
– Panel maintenance
– Tracking
– Refusal conversion
– Fieldwork

Panel Maintenance
– Keep track of respondent locations between surveys
  – Contact names
  – Telephone numbers
  – Alternative addresses
  – Change-of-address cards

Tracking
– Attempts to locate movers when a forwarding address is not available

Refusal Conversion and Avoidance
– Enhanced interviewer training
– Use of monetary incentives

Fieldwork
– Call record history
  – Determine the best time of day and day of week to attempt contact
  – Special concerns: spouse present, works at night, translator needed
  – Gauge the level of respondent fatigue

Contact History Logs
– Record day, time, interim contact code, and comments
– Codes fall into contact and non-contact categories (listed on the next two slides; a sketch of a log record follows the lists)

Contact Categories
– Eligible household member not home
– Language problem
– Respondent too busy (appointment set)
– Respondent refused

Non-Contact Categories
– Did not answer door
– Unable to reach (gated community, locked door)
– No one home
– Telephoned (no answer)
– Telephoned (answering machine)

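A sketch of one way such a contact-history record could be represented; the field names and code values are hypothetical, mirroring the categories above:

```python
from dataclasses import dataclass
from datetime import datetime

# Hypothetical interim contact codes, mirroring the contact (C) and
# non-contact (N) categories listed above.
CODES = {
    "C1": "Eligible household member not home",
    "C2": "Language problem",
    "C3": "Respondent too busy (appointment set)",
    "C4": "Respondent refused",
    "N1": "Did not answer door",
    "N2": "Unable to reach (gated community, locked door)",
    "N3": "No one home",
    "N4": "Telephoned (no answer)",
    "N5": "Telephoned (answering machine)",
}

@dataclass
class ContactAttempt:
    when: datetime        # day and time of the attempt
    code: str             # interim contact code (see CODES)
    comments: str = ""

log = [ContactAttempt(datetime(2024, 3, 5, 18, 30), "C3", "callback Sat AM")]
```
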
Follow-up Attempt Strategy
– The first log records up to 10 attempted contacts
– A second log records up to 10 additional attempted contacts

Completed Cases
– Final disposition categories
  – Completed interview
  – Completed partial interview
  – Non-interview (no one home)
  – Non-interview (refused)
  – Non-interview (language barrier)

Item Non-Response in Self-Administered Questionnaires
– Branching instructions may not be seen, or may not be followed correctly
– As a result, respondents can skip applicable questions or answer questions that do not apply to them

Information on a Questionnaire
– Verbal (language)
– Numeric
– Symbolic
  – Check boxes
  – Arrows
  – Other symbols
– Graphic
  – Brightness
  – Color
  – Shape

Branching Instructions
– Variations in branching instructions affect reading and comprehension rates
– Respondents have a greater tendency to miss information placed to the right of an answer category
– When a branching instruction is printed in the same font and point size as the rest of the text, it is unlikely to be detected as something important to attend to

Respondent Perceptions
– Respondents' perception of the task can differ from what is really needed
– Respondents may focus on the primary reason for the survey or on the most interesting questions and answers
– There can be a mismatch between what the writer wants to write and what the reader wants to read
– Instructions are often ignored

Knowledge and Memory
– Stimulate the respondent's memory of the needed facts
– Use feedback to allow respondents to detect and correct errors in their responses

Question Complexity I
– Simple questions are easier to process than more complex or difficult ones
– A question that contains a single branching instruction should be easier for respondents to process than one that contains multiple branching instructions
– The location of a question on the page may make a difference

Question Complexity II
– Lengthy or complex questions may exceed a respondent's capacity to process them
– Questions with a large number of response categories place more demands on short-term memory
– Primacy effect: respondents tend to choose earlier response choices, and may overlook instructions attached to later choices

Question Complexity III
– Alternating branches (branches sandwiched between categories that do not branch) may lead respondents to follow the wrong instruction
– Respondents may not see branching instructions
– Switching between write-in and closed-ended questions may cause respondents to overlook instructions
– The write-in space may not be large enough for the answer
– Questions at the bottom of a page have greater item nonresponse
– Distance between the answer box and the branching instruction may lead to higher error rates
