Web Design Issues in a Business Establishment Panel Survey
Third International Conference on Establishment Surveys (ICES-III)
June 18-21, 2007, Montréal, Québec, Canada
Kerry Levin & Jennifer O'Brien, Westat
Overview of Presentation
- A brief review of the web design system and its origins
- Design issues we encountered
- Opportunities for experimental investigation
Background
The Advanced Technology Program (ATP) at the National Institute of Standards and Technology (NIST) is a partnership between government and private industry to conduct high-risk research. Since 1990, ATP's Economic Assessment Office (EAO) has performed rigorous and multifaceted evaluations to assess the impact of the program and estimate the returns to the taxpayer. One key feature of ATP's evaluation program is the Business Reporting System (BRS).
General Description of the BRS
- A unique series of online reports that gather regular data on indicators of business progress and the future economic impact of ATP projects
- ATP awardees must complete four BRS reports per calendar year: three short quarterly reports and one long annual report
General Description of the BRS
There are several different types of instruments (each with a for-profit and a nonprofit version):
1. Baseline
2. Annual
3. Closeout
4. Quarterly
The BRS instruments are hybrid survey/progress reports that ask respondents attitudinal questions as well as items designed to gather information on project progress. The Baseline, Annual, and Closeout reports are between 70 and 100 pages in length. Given this length and complexity, web administration is the most logical data collection mode.
Design issues: Online logic checks vs. back-end logic checks
1. Examples of online logic checks (i.e., hard edits):
   - Sum checking
   - Range checks
2. Examples of back-end logic checks:
   - Frequency reviews
   - Evaluation of outliers
Online sum checking: Example
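The original slide shows a screenshot of the instrument's sum check, which is not reproduced here. As a stand-in, here is a minimal sketch of how such a hard edit might work; the field semantics, wording, and exact-match rule are illustrative assumptions, not details taken from the BRS.

```typescript
// Hypothetical sum check (hard edit): component entries must add up
// to the reported total before the respondent may continue.
// Field meanings and messages are assumptions, not the actual BRS.
function checkSum(components: number[], reportedTotal: number): string | null {
  const computed = components.reduce((acc, v) => acc + v, 0);
  if (computed !== reportedTotal) {
    return `Your entries sum to ${computed}, but you reported a total of ` +
           `${reportedTotal}. Please reconcile them before continuing.`;
  }
  return null; // null means the edit passes
}

// Example: a breakdown of project funding must match the stated total.
console.log(checkSum([120, 40, 15], 175)); // null (passes)
console.log(checkSum([120, 40, 15], 180)); // error message (fails)
```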
Online range checking: Example
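Again, the screenshot is not reproduced; a comparable sketch of a range check, with purely illustrative bounds, might look like this.

```typescript
// Hypothetical range check (hard edit): reject values outside an
// allowed interval. The field and its bounds are assumptions.
function checkRange(value: number, min: number, max: number): string | null {
  if (value < min || value > max) {
    return `Please enter a value between ${min} and ${max}.`;
  }
  return null; // null means the edit passes
}

// Example: project headcount constrained to a plausible interval.
console.log(checkRange(250, 0, 10_000)); // null (passes)
console.log(checkRange(-3, 0, 10_000));  // error message (fails)
```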
Back-end checking: Frequency reviews and outlier evaluations
- At the close of each cycle of data collection, the data for each instrument are carefully reviewed for anomalies.
- Frequency reviews are conducted to ensure that there were no errors in the skips in the online instrument.
- Although the BRS includes range checks for certain variables, the ranges are sometimes quite large; an evaluation of outliers is therefore a regular part of our data review procedures.
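The presentation does not specify how outliers are identified, so as one plausible approach, here is a sketch that flags values outside the common 1.5 × IQR fences; the rule and the sample data are assumptions.

```typescript
// Hypothetical back-end outlier review using the 1.5 * IQR rule.
// The BRS's actual criteria are not stated in the presentation.
function quantile(sorted: number[], q: number): number {
  const pos = (sorted.length - 1) * q;
  const lo = Math.floor(pos);
  const hi = Math.ceil(pos);
  return sorted[lo] + (sorted[hi] - sorted[lo]) * (pos - lo);
}

function flagOutliers(values: number[]): number[] {
  const sorted = [...values].sort((a, b) => a - b);
  const q1 = quantile(sorted, 0.25);
  const q3 = quantile(sorted, 0.75);
  const fence = 1.5 * (q3 - q1);
  return values.filter(v => v < q1 - fence || v > q3 + fence);
}

// Example: one reported value is wildly out of line with the rest.
console.log(flagOutliers([12, 15, 14, 13, 400, 16])); // [400]
```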
Use of pre-filled information in the BRS
The BRS instruments make use of two types of pre-filled information:
1. Pre-filled information from sources external to the instrument (i.e., information gathered in previous instruments or information provided by ATP, such as issued patents)
2. Pre-filled information from sources internal to the instrument (i.e., information provided by the respondent in earlier sections of the current report)
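A minimal sketch of the external/internal distinction, assuming invented field names and record shapes (the actual BRS data model is not described in the presentation):

```typescript
// Hypothetical pre-fill step. "External" values come from prior
// instruments or ATP records; "internal" values are copied forward
// from an earlier section of the same report. All names are invented.
interface ExternalRecord { companyName: string; patentsIssued: number; }
type SurveyState = Record<string, string | number | undefined>;

function prefill(state: SurveyState, external: ExternalRecord): SurveyState {
  return {
    ...state,
    companyName: external.companyName,                    // external source
    patentsIssued: external.patentsIssued,                // external source
    employeesThisSection: state.employeesEarlierSection,  // internal source
  };
}

const filled = prefill(
  { employeesEarlierSection: 42 },
  { companyName: "Acme Labs", patentsIssued: 3 },
);
console.log(filled.companyName, filled.employeesThisSection); // "Acme Labs" 42
```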
Pre-filled information: External source example
Pre-filled information: Internal source example
Required items
While most items in the BRS instruments are not required, the few that are fall into two categories:
1. Items required for accurate skips later in the instrument (see the sketch below)
2. Items deemed critical by ATP staff
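To illustrate the first category, here is a hedged sketch of why a routing item must be answered: without it, the instrument cannot resolve the skip. The question and section names are invented for illustration.

```typescript
// Hypothetical routing logic: an unanswered skip item leaves the
// instrument unable to choose the next section, which is why such
// items are marked required. IDs are invented.
function nextSection(hasCommercializedProduct: "yes" | "no" | undefined): string {
  if (hasCommercializedProduct === undefined) {
    // In the live instrument, a required-item prompt would fire here.
    throw new Error("An answer is required to determine the next section.");
  }
  return hasCommercializedProduct === "yes"
    ? "commercialization-details"
    : "research-progress";
}

console.log(nextSection("yes")); // "commercialization-details"
```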
Required items: Example item important for skip pattern
Required items: Example item critical to ATP
Unique Design: Financial items
Administration issues in the BRS: Multiple respondents
- Each ATP-funded project has multiple contacts associated with it.
- It is rarely the case that a single respondent can answer all items in the survey.
- Westat provides only one access ID per report, however, so respondents are responsible for managing who at their organization is given access to the BRS online system.
Experimental investigations using the BRS
Reducing item nonresponse: The Applicant Survey
- ATP's Applicant Survey is not one of the BRS instruments, but it is regularly administered via the web to companies and organizations that applied for ATP funding.
- In 2006, Westat embedded an experiment within the Applicant Survey to test which of two different types of nonresponse prompting would result in reduced item nonresponse.
Reducing item nonresponse: The Applicant Survey
904 respondents were randomly assigned to one of three conditions:
1) A prompt for item nonresponse appeared (if applicable) at the end of the survey
2) A prompt for item nonresponse appeared (if applicable) after each section
3) No prompt (control group)
Reducing item nonresponse: The Applicant Survey
[Screenshots of the two prompt conditions: one shown at the end of the survey, one shown after each section.]
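Since the screenshots are not reproduced, here is a hedged sketch of the two prompting conditions; the data shapes and message wording are assumptions.

```typescript
// Hypothetical sketch of the two experimental prompt conditions.
// Section/item IDs and wording are invented for illustration.
interface Section { id: string; itemIds: string[]; }
type Answers = Record<string, string | undefined>;

function missingIn(section: Section, answers: Answers): string[] {
  return section.itemIds.filter(id => answers[id] === undefined);
}

// Condition 2: prompt immediately after each section.
function promptPerSection(sections: Section[], answers: Answers): void {
  for (const s of sections) {
    const missing = missingIn(s, answers);
    if (missing.length > 0) {
      console.log(`You left ${missing.length} item(s) blank in section ${s.id}.`);
    }
  }
}

// Condition 1: one prompt at the end covering the whole survey.
function promptAtEnd(sections: Section[], answers: Answers): void {
  const missing = sections.flatMap(s => missingIn(s, answers));
  if (missing.length > 0) {
    console.log(`You left ${missing.length} item(s) blank: ${missing.join(", ")}.`);
  }
}
```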
Reducing item nonresponse: The Applicant Survey
Both prompts for item nonresponse appeared effective, and to an equal degree.

Group                       % of completed surveys    Mean missing items
                            containing missing data   per completed survey
Prompt at end of survey     23.0%                     0.9
Prompt after each section   23.3%                     1.1
No prompt (control)         39.8%                     2.2
                            p < .02                   p < .008
Boosting response rates: The days of the week experiment
- Literature suggests that there are optimal call times for telephone surveys. But are there also optimal days of the week to email survey communications?
- The optimal day to email was measured by:
  - The overall response rate
  - The time it takes to respond
Boosting response rates: The days of the week experiment
Three experimental conditions:
1) Monday cohort
2) Wednesday cohort
3) Friday cohort
The invitation email and up to three reminders were all sent on the same day of the week: Monday, Wednesday, or Friday, depending on the cohort.
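A minimal sketch of a cohort mailing schedule, under the assumption of one-week spacing between contacts (the presentation does not state the interval):

```typescript
// Hypothetical scheduler: the invitation and up to three reminders
// all go out on the cohort's assigned weekday. One-week spacing is
// an assumption, not a detail from the experiment.
function mailingDates(firstSend: Date, contacts = 4): Date[] {
  const dates: Date[] = [];
  for (let i = 0; i < contacts; i++) {
    const d = new Date(firstSend);
    d.setDate(d.getDate() + i * 7); // same weekday each time
    dates.push(d);
  }
  return dates;
}

// A Monday cohort would receive all four emails on Mondays:
console.log(mailingDates(new Date(2006, 5, 5))); // June 5, 2006 was a Monday
```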
Boosting response rates: The days of the week experiment

Experimental group   Total eligible   Total completes   Response rate (%)
Monday               161              141               87.6
Wednesday            159              140               88.1
Friday               152              141               92.8
Time to Complete the Survey
Cumulative response rates (%) after each email:

Cohort      Email 1   Email 2   Email 3   Email 4
Monday      14.9      32.3      65.2      87.6
Wednesday   15.7      37.7      67.3      88.1
Friday      18.4      38.8      67.1      92.8
Boosting response rates: The days of the week experiment
- The Friday cohort trends toward a higher response rate, but all cohorts required the same amount of effort to achieve their respective response rates.
- Overall, there is some evidence that the day of the week does matter.
Conclusion
- The BRS has presented us with various design and administration challenges.
- We have had the chance to fine-tune the system and address a variety of issues that have come to our attention.
- As researchers encounter new issues in the administration of web surveys, the BRS offers a place to study them.