Conducting High Quality Surveys: Frameworks and Strategies for Reducing Error and Improving Quality
Lija O. Greenseid, Ph.D., Senior Evaluator, Professional Data Analysts
Julie Rainey, Vice President, Professional Data Analysts
Demonstration session for the American Evaluation Association Conference, November 5, 2011
Today
1. Overview of the Survey Quality Framework and Total Survey Error
2. Discussion of four strategies for increasing the quality of surveys
3. Time to reflect on survey constructs, critique a survey, and hear case-study examples
Why is survey quality important?
– Poor quality surveys can lead to poor predictions or decisions (e.g., the polls in the 1936 presidential election that wrongly predicted Landon would beat FDR)
– Poor quality surveys waste time and resources
Survey Quality Framework
– Survey quality is a complex, multi-dimensional concept (Juran & Gryna, 1980)
– Producers of survey data (e.g., evaluators and survey researchers) and users of survey data (e.g., clients, stakeholders, the public) have different perspectives on survey quality
Evaluator Values in Survey Quality
Survey producers (evaluators and researchers) value data quality attributes:
– Large sample sizes
– High response rates
– Internally consistent responses
– Good coverage of target populations
Client Values in Survey Quality
Survey users (evaluation clients) take data accuracy for granted. Clients prioritize survey implementation factors:
– Timeliness
– Accessibility and interpretability
– Usability of data
– Cost
Balancing Act
High quality surveys use available resources to balance the values of data producers and users: rigorous survey methodology combined with responsiveness to clients' needs yields "fitness for use."
A Utilization-focused Approach to Methods Decisions
"The primary focus in making evaluation methods decisions should be on getting the best possible data to adequately answer primary users' evaluation questions given available time and resources." (Patton, 1997, p. 247)
Total Survey Error Model (Groves et al., 2004)
[Diagram: the Total Survey Error Model]
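The diagram itself is not reproduced in this transcript. As a rough summary in our own notation (an assumption about how the model is usually formalized, not content from the slide), total survey error is often expressed as the mean squared error of a survey estimate, combining squared bias from systematic errors (coverage, nonresponse, measurement) with variance from variable errors (notably sampling):

```latex
% Total survey error as the mean squared error of an estimate \hat{\theta}
% of the true population value \theta (our notation, hedged above).
\mathrm{MSE}(\hat{\theta})
  = \underbrace{\bigl(\mathbb{E}[\hat{\theta}] - \theta\bigr)^{2}}_{\text{squared bias}}
  \; + \;
  \underbrace{\mathbb{E}\!\bigl[(\hat{\theta} - \mathbb{E}[\hat{\theta}])^{2}\bigr]}_{\text{variance}}
```

Each of the four strategies below targets one or more of these error components.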
Today's presentation: Four strategies to improve survey quality, mapped onto the Total Survey Error Model (Groves et al., 2004)
1. Know what you really need to know and how to measure it
2. Create a respondent-friendly questionnaire
3. Reach your population of interest
4. Encourage your sample to respond
Strategy 1: Know what you really need to know and how to measure it
Construct validity = agreement between a theoretical concept and a specific measurement ("did it measure what it was supposed to measure?")
– Intelligence vs. an IQ test
– Program impact vs. the quit rate for a stop-smoking program
Survey question development process
1. Define your constructs of interest
2. Decide how to measure your constructs:
   – Review existing validated instruments
   – Modify existing items (with permission)
   – Create your own instrument
3. Craft good questions (see Dillman's work)
4. Conduct cognitive interviews (see Tourangeau's work)
Example: Program Satisfaction
"Overall, how satisfied were you with the service you received from the stop-smoking program?"
– Very satisfied
– Mostly satisfied
– Somewhat satisfied
– Not at all satisfied
Exercise: Think, Pair, Share
Let's think more deeply about "program satisfaction."
1. Think: What do you really need to know about "program satisfaction"? (Dig deep.)
2. Write: Write one good survey question, with appropriate response options, related to "program satisfaction."
3. Pair & Share: Find a partner. Exchange your questions, read, and respond as if you were taking the survey. Then discuss:
   – What do you believe the question is asking?
   – What do the specific words mean to you?
   – What information do you need to be able to recall to answer the question?
   – Do the response categories match your own internally generated answer?
Strategy 2: Create a respondent-friendly questionnaire
– Understand how participants interact with mail and online survey questionnaires
– Consider the impact of technology on the survey experience and responses
Visual processing and questionnaires
The science of how people perceive and attend to visual information suggests several interpretive heuristics (Tourangeau, Couper, & Conrad, 2003):
– Middle means typical
– Top means first (and left means first)
– Near means related
– Up means good
– Like means close
[Table: example response-scale layouts that are consistent with these heuristics (e.g., running from "Strongly agree" at the top to "Strongly disagree" at the bottom), mildly inconsistent (e.g., "It depends" displaced from the middle), or strongly inconsistent (options out of order)]
Solutions
– Help respondents organize information on the page quickly and accurately: navigational clues; graphic design principles (color, font, size, spacing)
– Pilot test, including on a variety of platforms for web surveys
Web surveys on mobile devices
– Completion of web surveys on mobile devices is already happening and will continue to increase
– Younger, more diverse, and lower-income individuals more frequently use mobile devices to answer web surveys (Stapleton, 2011)
[Image borrowed with permission from Stapleton (2011)]
Internet surveys: Promise and peril
– Web surveys allow inexpensive, quick access to survey data
– Tools like SurveyMonkey and Zoomerang allow anyone to conduct a professional-looking web survey
– Low barriers to entry can lead to "garbage in, garbage out"
Questionnaire critique
Review and then critique the sample survey in terms of design and layout. (Do not review it for content.)
– What do you see as the primary design and layout issues?
– What would be your top priorities for improving the survey?
Strategy 3: Reach your population of interest
Internet survey FAIL
Program evaluations & sampling
Often you are surveying a known population:
– Program participants
– Organizational staff
– Parents of school children, etc.
Sometimes it is necessary to sample from a population of interest (using a "sampling frame"):
– Community members
– Individuals with certain characteristics
– Program stakeholders
Sample frame selection
Possible population-based sampling frames:
– Landline telephone numbers (random-digit dialing, RDD)
– Cell phone numbers
– Postal addresses (the postal delivery sequence file)
Sample frame selection depends on time, resources, and the population of interest. (A minimal sampling sketch follows.)
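Once a frame is in hand, drawing the sample itself is straightforward. Below is a minimal sketch in Python, assuming a hypothetical address frame and sample size (none of these values come from the presentation):

```python
import random

# Hypothetical sampling frame: one entry per postal address covering the
# population of interest (in practice, purchased or built from a source
# such as an address-based sampling vendor file).
frame = [f"{n} Main St, Minneapolis, MN" for n in range(1, 10_001)]

random.seed(42)        # fixed seed so the draw is reproducible
sample_size = 500      # hypothetical target number of sampled addresses

# Simple random sample without replacement from the frame
sample = random.sample(frame, sample_size)
print(sample[:3])
```

The frame, not the draw, is where coverage problems enter: any address (or phone number) missing from `frame` has zero chance of selection, which is what the next slides quantify.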
Telephone use and coverage
– 26.6% of American households (and growing) are cell-phone only
– Cell-phone-only households tend to be younger and lower income (National Center for Health Statistics, 2011; Blumberg and Luke, 2010)
[Map: cell-phone-only households by state]
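Why this matters for a landline frame can be put in a standard formula from the survey methodology literature (it does not appear on the slide): the bias in a mean estimated from the frame equals the noncovered share of the population times the difference between covered and noncovered units:

```latex
% Coverage bias of the frame (covered) mean \bar{Y}_C relative to the
% full-population mean \bar{Y}. N_U = number of noncovered units (e.g.,
% cell-only households absent from a landline frame); N = population size.
\bar{Y}_C - \bar{Y} \;=\; \frac{N_U}{N}\,\bigl(\bar{Y}_C - \bar{Y}_U\bigr)
```

With more than a quarter of households cell-only, even a modest difference between covered and noncovered households produces noticeable bias.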
Internet use and coverage
– 21% of Americans (a shrinking share) are not internet users
– Non-internet users tend to be older, less educated, lower income, and more rural (Horrigan, 2009)
Postal address coverage
– Best coverage of the three frames; however, home addresses are not stable for all populations
– Lower coverage in high-density urban areas and in rural areas
– Mail survey administration takes longer than phone or web-based administration
Survey frames: New options
– Landline + cell phone samples
– Address-based sampling
– Opt-in online panels
– Online probability-based panels
How does this relate to your work?
Turn to a partner and discuss:
– What populations are you trying to reach?
– What options are best for reaching these populations (e.g., landline phone surveys, cell phone surveys, online panels, mailed surveys)?
– How will your choices about how to reach them bias who you actually reach?
Strategy 4: Encourage your sample to respond
Surveying as social exchange (Dillman, Smyth & Christian, 2009):
– Increase perceived rewards
– Reduce perceived costs
– Establish trust
Strategies to encourage response:
– Mixed-mode surveys
– Incentives
Mixed-mode surveys
Mixed-mode surveys use more than one survey mode to collect data. Examples of survey modes:
– Telephone
– Mail
– Internet
– Face-to-face
– Interactive voice response
– Text messaging
This is different from "mixed methods" studies, in which both qualitative and quantitative methods are used within one study.
Mixed-mode surveys and error
Using more than one survey mode can improve the validity and reliability of survey data (see the sketch after this list):
– Increase response rates (decrease nonresponse error)
– Improve survey coverage, especially among harder-to-reach populations (decrease coverage error)
– Shorten survey administration timeframes (protects against threats to external validity and provides timely data to clients)
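To make the response-rate mechanism concrete, here is a small arithmetic sketch with made-up rates (the presentation reports no specific numbers):

```python
# Hypothetical sequential mixed-mode design: web invitation first,
# then a mail follow-up to web nonrespondents. All rates are invented
# for illustration only.
n = 1_000                    # sample members invited
web_rate = 0.30              # share completing the initial web survey
mail_followup_rate = 0.25    # share of web nonrespondents completing by mail

web_completes = n * web_rate
mail_completes = (n - web_completes) * mail_followup_rate
overall_rate = (web_completes + mail_completes) / n
print(f"overall response rate: {overall_rate:.1%}")  # 47.5% vs. 30.0% web-only
```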
Data quality
– Measurement effects can be strong across modes: social desirability plays a strong role in interviewer-administered modes (telephone and in-person); primacy/recency effects are an issue with visual modes (paper and web)
– Unimode construction (same wording and construction across modes) is a best practice (Dillman, Smyth, & Christian, 2009)
Case study: Web plus phone vs. web plus mail
7-month follow-up surveys of participants in a web-based tobacco cessation program (mailed recruitment letter, web survey, either phone or mail follow-up to non-respondents, $10 incentive for completion)
Impact on outcomes
The percentage of respondents reporting no tobacco use in the last 30 days was significantly higher for web respondents than for phone respondents
Incentives
– Small pre-paid cash incentives ($1-$5) boost response rates
– Incentives are also cost-effective
– Incentives bring in people who are less interested in or less inclined to participate
– The effect of incentives on response bias is still being studied, but early results appear promising
Case study: Incentive experiment
– PDA tested the impact of a $2 pre-paid incentive vs. a $10 promised incentive
– Found no significant difference in response rates across the two conditions
– The $2 pre-paid incentive was significantly less expensive
[Chart: response rate by incentive type (difference not statistically significant); a sketch of how such a comparison can be tested follows]
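One way to test such a comparison is a two-proportion z-test. The sketch below uses invented counts (the slide does not report the underlying sample sizes), so treat it as an illustration of the method rather than PDA's actual analysis:

```python
from statsmodels.stats.proportion import proportions_ztest

# Hypothetical counts, for illustration only.
# Condition A: $2 pre-paid incentive; condition B: $10 promised incentive.
completes = [212, 205]   # completed surveys per condition
invited = [500, 500]     # sample members invited per condition

# Two-sided z-test for equality of the two response proportions
stat, p_value = proportions_ztest(count=completes, nobs=invited)
print(f"response rates: {completes[0]/invited[0]:.1%} vs. {completes[1]/invited[1]:.1%}")
print(f"z = {stat:.2f}, p = {p_value:.3f}")  # a large p-value means no detectable difference

# Cost per completed survey under each scheme: pre-paid incentives go to
# everyone invited; promised incentives are paid only to completers.
cost_prepaid = 2 * invited[0] / completes[0]   # $2 per invitee, spread over completes
cost_promised = 10.0                           # $10 paid per completed survey
print(f"cost per complete: ${cost_prepaid:.2f} pre-paid vs. ${cost_promised:.2f} promised")
```

Under these made-up numbers the pre-paid design costs under half as much per complete, which is the pattern the slide reports.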
Final thoughts: Understand and respect your respondents
– Respect the privilege of asking people to trust you with their honest answers
– Abuse of survey privileges hurts others with legitimate needs to gather data
– Always try to minimize respondent burden and maximize respondents' sense of contributing to a useful study
– Survey "karma": answer surveys if you want others to answer yours
General approach to improving quality
– Design survey studies to meet the client's needs
– Consider the impact of survey choices on sources of error
– Know your population and tailor your survey accordingly
– The survey field is changing rapidly: keep up on the literature; join the American Association for Public Opinion Research; read Survey Practice (surveypractice.org); contribute to knowledge in the field
AAPOR
American Association for Public Opinion Research, www.aapor.org
Memberships (all include the journal Public Opinion Quarterly):
– Student memberships: first year free; renewals $25 per year
– Individual memberships: $55-$130 per year, depending on income
– Employer paid: $150
AAPOR T-shirt slogan contest, past winners:
– "I lost my validity at AAPOR"
– "AAPOR: Freqs and Geeks"
– "Public Opinion Research: Fighting the war against error"
– "Trust us – we're 95% confident"
– "If you don't like the estimate, just weight"
References & Recommended Reading
American Association for Public Opinion Research. (2010, March). AAPOR report on online panels. Available from: http://www.aapor.org/AM/Template.cfm?Section=AAPOR_Committee_and_Task_Force_Reports&Template=/CM/ContentDisplay.cfm&ContentID=2223
American Association for Public Opinion Research. (2010, March). AAPOR cell phone task force report. Available from: http://www.aapor.org/Cell_Phone_Task_Force.htm
Biemer, P. P. (2010). Total survey error: Design, implementation, and evaluation. Public Opinion Quarterly, 74(5), 817-848.
Blumberg, S. J., et al. (2011, April). Wireless substitution: State-level estimates from the National Health Interview Survey. National Center for Health Statistics. Available from: http://www.cdc.gov/nchs/nhis.htm
Dillman, D., et al. (2009, March). Response rate and measurement differences in mixed mode surveys using mail, telephone, interactive voice response, and the internet. Social Science Research, 38(1), 1-18.
Dillman, D., Smyth, J. D., & Christian, L. M. (2009). Internet, mail, and mixed-mode surveys: The tailored design method (3rd ed.). Hoboken, NJ: Wiley.
Fahimi, M. (2010). Address-based sampling (ABS): Merits, design, and implementation. Presentation to the National Conference on Health Statistics. Available from: www.cdc.gov/nchs/ppt/nchs2010/17_Fahimi.pdf
Groves, R., et al. (2004). Survey methodology. New York: Wiley.
Horrigan, J. B. (2009, June). Home broadband adoption 2009. Pew Internet and American Life Project. Available from: http://www.pewinternet.org/Reports/2009/10-Home-Broadband-Adoption-2009.aspx
Juran, J., & Gryna, F. (1980). Quality planning and analysis (2nd ed.). New York: McGraw-Hill.
Stapleton, C. (2011, May). The smart(phone) way to collect data. Presentation to the American Association for Public Opinion Research, Phoenix, Arizona.
Tourangeau, R., Couper, M., & Conrad, F. (2003). Spacing, position, and order: Interpretive heuristics for visual features of survey questions. University of Michigan, Institute for Social Research. Available from: http://www.isr.umich.edu/src/smp/Electronic%20Copies/119.pdf
Tourangeau, R., Rips, L. J., & Rasinski, K. (2000). The psychology of survey response. New York: Cambridge University Press.
Weisberg, H. F., Krosnick, J., & Bowen, B. D. (1996). An introduction to survey research, polling, and data analysis (3rd ed.). Thousand Oaks, CA: Sage.
Yeager, D. S., Krosnick, J., Chang, L.-C., Javitz, H., Levendusky, M., Simpser, A., & Wang, R. (2009). Comparing the accuracy of RDD telephone surveys and Internet surveys conducted with probability and non-probability samples. Available from: http://www.knowledgenetworks.com/insights/docs/Mode-04_2.pdf
Contact Information
Lija Greenseid, Ph.D., Senior Evaluator, lija@pdastats.com
Julie Rainey, Vice President, julie@pdastats.com
612-623-9110 | www.PDAstats.com