Effects of Design in Web Surveys
Vera Toepoel, Tilburg University, The Netherlands (1/17/2019)
CentERdata: Two Online Panels
1. CentERpanel
- Has existed for 17 years
- 2,000 households
- Respondents fill out questionnaires every week
- Online interviews as method, but:
  - Probability sample drawn from the address sampling frame of Statistics Netherlands
  - Recruitment of new panel members is address-based
  - Includes households without internet access (less than 20%): equipment provided
CentERdata: Two Online Panels
2. LISS Panel
- Funded by a grant from The Netherlands Organisation for Scientific Research
- 5,000 households
- Established in 2007 (we fielded the 1st questionnaire!)
- Respondents fill out questionnaires every month
- Online interviews as method, but:
  - Probability sample drawn from the address sampling frame of Statistics Netherlands
  - Contacted by letter, telephone, or visit
  - Includes households without internet access (less than 20%): equipment provided
1 item per screen
4 items per screen
10 items per screen
Answer categories
Open-ended
Vertical: positive to negative
Horizontal
Numbers 1 to 5
Numbers 5 to 1
Numbers 2 to -2
Trained Respondents: Panel conditioning
Content (knowledge on topics):
- Prepare for future surveys
- Develop attitudes
Procedure (question-answering process):
- Learn how to interpret questions
- Answer strategically
- Speed through the survey
Procedure (answering process)
Differences between trained and fresh respondents with regard to web survey design choices:
- Items per screen
- Response category effects
- Question layout
Overall: the mean duration of the entire survey differed between panels: 436 seconds for the trained panel versus 576 seconds for the fresh panel.
Experiment 1: Items per screen
Social Desirability Scale (10 items), three different formats:
- 1 item per screen
- 5 items per screen
- 10 items per screen
Experiment 1: Items per screen
- Trained respondents had higher inter-item correlations for multiple-items-per-screen formats.
- No significant difference in item non-response.
- The mean score on the Social Desirability Scale showed no evidence of social desirability bias.
- The mean duration to complete the ten social desirability items did not differ significantly between panels.
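The inter-item correlations compared above can be summarized as the mean of the off-diagonal entries of the item correlation matrix. A minimal sketch of that computation; the response matrix below is invented for illustration and is not data from either panel:

```python
import numpy as np

# Hypothetical responses: 6 respondents x 3 scale items (5-point scale).
# Invented numbers for illustration only.
responses = np.array([
    [1, 2, 1],
    [2, 2, 3],
    [3, 3, 3],
    [4, 5, 4],
    [5, 4, 5],
    [2, 3, 2],
])

corr = np.corrcoef(responses, rowvar=False)        # 3 x 3 item correlation matrix
off_diagonal = corr[~np.eye(corr.shape[0], dtype=bool)]
mean_inter_item = float(off_diagonal.mean())
print(round(mean_inter_item, 2))
```

A higher mean inter-item correlation under multiple-items-per-screen formats is one of the signs of satisficing discussed later in the talk: respondents answer a grid of items more uniformly than they would item by item.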
Experiment 2: Answer Categories
- Category effect found.
- No difference in the category effect between trained and fresh respondents.
Experiment 3: Question Layout
Question: Overall, how would you rate the quality of education in the Netherlands?
Answer: 5-point scale
Six formats:
- Reference format (decremental)
- Reverse scale: incremental
- Horizontal layout
- Add numbers 1 to 5 to verbal labels
- Add numbers 5 to 1 to verbal labels
- Add numbers 2 to -2 to verbal labels
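A between-subjects split like this one needs respondents randomized over the six formats in (near-)equal groups. A minimal sketch of one way to do that; the `assign_formats` helper, the format labels, and the respondent IDs are illustrative, not the actual panel software:

```python
import random

# The six layout formats from the experiment (labels paraphrased).
FORMATS = [
    "reference (vertical, decremental)",
    "reverse scale (incremental)",
    "horizontal layout",
    "verbal labels + numbers 1 to 5",
    "verbal labels + numbers 5 to 1",
    "verbal labels + numbers 2 to -2",
]

def assign_formats(respondent_ids, formats, seed=42):
    """Shuffle respondents, then deal them round-robin over the
    formats, yielding (near-)equal group sizes."""
    rng = random.Random(seed)      # fixed seed: assignment is reproducible
    ids = list(respondent_ids)
    rng.shuffle(ids)
    return {rid: formats[i % len(formats)] for i, rid in enumerate(ids)}

assignment = assign_formats(range(60), FORMATS)
```

Shuffle-then-deal guarantees balanced cell sizes, unlike drawing a format independently per respondent, which only balances in expectation.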
Experiment 3: Question Layout
Pairwise comparisons (T = significant difference in the trained panel, F = significant difference in the fresh panel):
- Decremental vs. incremental: T + F
- Vertical vs. horizontal layout: -
- No numbers vs. numbers 1 to 5: -
- Numbers 1 to 5 vs. numbers 5 to 1: T + F
- Numbers 5 to 1 vs. numbers 2 to -2: T + F
Trained respondents more readily selected one of the first options.
Design Effects in Web Surveys: Comparing Trained and Fresh Respondents
- Overall, few differences between trained and fresh respondents.
- Trained respondents are somewhat more prone to satisficing:
  - Shorter completion times
  - Higher inter-item correlations for multiple-items-per-screen formats
  - Select first response options more often
Current and Future Research
- It has been little more than a decade since systematic research began on visual design effects in web surveys.
- In the last decade, dozens of studies have been conducted.
- It is now important that we begin to understand the importance of each of the visual effects.
- Can we reduce visual effects through effective question writing?
Effective Question Writing
Tourangeau, Couper, and Conrad (POQ 2007) suggest there may be a hierarchy of features that respondents attend to: verbal language > numbers > visual cues.
Question: Can the effects of visual layout be diminished through greater use of verbal language and numbers?
Experiment 1: Visual Heuristics (joint with Don Dillman)
Tourangeau, Couper, and Conrad (POQ 2004; 2007):
- Middle means typical: respondents will see the middle option as the most typical.
- Left and top means first: the leftmost or top option will be seen as the 'first' in a conceptual sense.
- Near means related: options that are physically near each other are expected to be related conceptually.
- Up means good: the top option will be seen as the most desirable.
- Like means close: visually similar options will be seen as conceptually closer.
Experimental conditions:
- Polar-point or fully labeled scale
- With or without numbers (1 to 5)
Middle Means Typical
- Fully labeled: even spacing
- Fully labeled: uneven spacing
Left and Top Means First
- Fully labeled with color: consistent ordering
- Fully labeled with color: inconsistent ordering
Near Means Related
- Polar point with numbers: separate screens
- Polar point with numbers: single screen
Up Means Good
- Polar point with numbers: incremental
- Polar point with numbers: decremental
Like Means Close
- Polar point
- Polar point with color
Like Means Close
- Polar point with numbers (1 to 5)
- Polar point with different numbers (-2 to 2)
Labels, numbers and visual heuristics: is there a hierarchy?
Heuristics compared:
1. Middle Means Typical
2. Left and Top Means First
3. Near Means Related
4. Up Means Good
5. Like Means Close
Questions assessed per heuristic: Effect of heuristic? (yes/no) · Did numbers reduce the effect? (color: yes; different numbers: no) · Effect of heuristic in fully labeled scales?
Experiment 2: Pictures in web surveys (joint with Mick Couper)
Replicates the study by Couper, Tourangeau, and Kenyon (POQ 2004):
1. No picture
2. Low-frequency picture
3. High-frequency picture
Adds verbal instructions:
A. No verbal instruction
B. Instruction to include both high- and low-frequency instances
C. Instruction to include only low-frequency instances
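Crossing the three picture conditions with the three instruction conditions gives a 3 x 3 factorial design with nine cells. A minimal sketch of that crossing (condition labels paraphrased from the slide; this is an illustration, not the fielding software):

```python
from itertools import product

# Picture factor (from the replicated study) and the added instruction factor.
pictures = ["no picture", "low-frequency picture", "high-frequency picture"]
instructions = [
    "no verbal instruction",
    "include both high- and low-frequency instances",
    "include only low-frequency instances",
]

# Full factorial crossing: every picture condition paired with every
# instruction condition.
conditions = list(product(pictures, instructions))
print(len(conditions))  # 9 cells
```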
Low- and high-frequency pictures
Can verbal instructions reduce the effects of pictures?
MANOVA results:
- Main effect of instructions: lambda = .597, p < .0001
- Main effect of pictures: lambda = .964, p < .0001
- Interaction instructions x pictures: lambda = .9691, p < .0001
This suggests that while both main effects and the interaction are significant, instructions explain more of the variation in the answers than pictures (a smaller Wilks' lambda indicates a stronger effect).
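Wilks' lambda, the statistic reported above, is the ratio det(W) / det(W + B) of the within-group to total sums-of-squares-and-cross-products (SSCP) matrices: values near 0 mean the factor explains much of the multivariate variation, values near 1 mean it explains little. A minimal one-way sketch with invented two-group data (not the survey data):

```python
import numpy as np

def wilks_lambda(groups):
    """Wilks' lambda for a one-way MANOVA: det(W) / det(W + B), where
    W is the pooled within-group SSCP matrix and B the between-group
    SSCP matrix."""
    all_obs = np.vstack(groups)
    grand_mean = all_obs.mean(axis=0)
    p = all_obs.shape[1]
    W = np.zeros((p, p))
    B = np.zeros((p, p))
    for g in groups:
        m = g.mean(axis=0)
        centered = g - m
        W += centered.T @ centered                 # within-group scatter
        d = (m - grand_mean).reshape(-1, 1)
        B += len(g) * (d @ d.T)                    # between-group scatter
    return float(np.linalg.det(W) / np.linalg.det(W + B))

# Invented two-group, two-outcome example: identical groups give
# lambda = 1 (no between-group variation); well-separated groups give
# lambda near 0.
g1 = np.array([[1.0, 2.0], [2.0, 1.0], [1.5, 1.5], [2.0, 2.0]])
print(wilks_lambda([g1, g1.copy()]))   # 1.0
print(wilks_lambda([g1, g1 + 5.0]))    # close to 0
```

This one-way sketch omits the two-way partitioning behind the interaction term reported on the slide, but the lambda statistic itself is the same ratio in both cases.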
Future Research
How to reduce visual design effects in web surveys
LISS data
- Every researcher (irrespective of nationality) who wants to collect data for scientific, policy, or societally relevant research can collect data via the LISS panel at no cost.
- Proposals can be submitted through
- Existing data freely available for academic use (longitudinal core studies, proposed studies), disseminated through