Jessica Behm, Westat/Rockville Institute

Survey Package Design and Response Rates in an Outdoor Recreational Study Jessica Behm, Westat/Rockville Institute Cynthia Helba, Ph.D., Westat/Rockville Institute Mina Muller, Westat/Rockville Institute W. Sherman Edwards, Westat/Rockville Institute Regina Yudd, Ph.D., Westat/Rockville Institute

Background and Overview
- 50-State Survey of Fishing, Hunting and Wildlife-Associated Recreation
- Sponsored by the Association of Fish and Wildlife Agencies (AFWA)
- Funded by a Multistate Conservation Grant (#F15AP001640) from the Sport Fish and Wildlife Restoration Program of the US Fish and Wildlife Service and AFWA
- Conducted every 5 years since 1955

Survey Redesign
The Rockville Institute, a non-profit organization affiliated with Westat, began a redesign in January 2015. Objectives:
- Transition from interviewer administration to mail
- Address problems facing interview studies
- Work in tandem with the Census Bureau to enable a mode comparison

Survey Design
Both the pretest and the main study used an initial household screener followed by assignment to one of three survey types (Fishing, Hunting, or Wildlife-Associated Recreation).
- Pretest: household screener (postcard and follow-up mailing by USPS), then 2 waves of detailed data collection
- Main study: household screener (postcard and two follow-up contacts), then 2 or 3 waves of detailed data collection

The Problem...
Low response rate on the pretest screener:
- Expected: 42%
- Actual: 16%
Why?
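The percentages quoted here and on later slides are screener response rates (labeled RR1 further on, presumably AAPOR's Response Rate 1: complete interviews divided by all eligible and potentially eligible sampled units). A minimal sketch of that calculation; the disposition counts below are illustrative, not the study's actual case counts:

```python
def response_rate_rr1(completes, partials, refusals,
                      noncontacts, other, unknown_eligibility):
    """AAPOR Response Rate 1: complete interviews divided by the sum
    of all eligible and potentially eligible sampled units."""
    total_units = (completes + partials + refusals +
                   noncontacts + other + unknown_eligibility)
    return completes / total_units

# Hypothetical dispositions only (the slides do not report them):
rr1 = response_rate_rr1(completes=320, partials=40, refusals=400,
                        noncontacts=960, other=0, unknown_eligibility=200)
# 320 completes out of 1,920 units -> about 0.167, near the pretest's 16%
```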

Explanations?
- Does the package look unofficial or commercial?
- Does it look like we're asking for money?
- Is it not likeable or meaningful?

What we know about survey covers
Covers should be compelling, have a clear title, and identify the sponsor (Salant and Dillman, 1994; Dillman, Smyth, and Christian, 2014). But there is little empirical evidence to guide cover design:
- Nederhof (1988) found that a complex graphic cover led to an increase in response rates, but the result could not be replicated in several different settings (Dillman and Dillman, 1995; Gendall, 1996; Gendall, 1999; Leslie, 1996).
- Gendall (2002) found that likeability is key to increasing response rates, but the effect is likely to be very small.

What we know about survey covers—continued
Westat is working with the Bureau of Justice Statistics (BJS) on a companion survey to the National Crime Victimization Survey (NCVS). That work tested likely respondents' preference between a creative cover and a cover bearing the official BJS seal:
- Every respondent preferred the BJS seal
- The survey appeared credible and was immediately identified as originating with the U.S. government
- Participants said the inclusion of the seal made them more likely to respond quickly

What did we do?
- Conducted a second pretest in four states: Arizona, Kentucky, Missouri, and Pennsylvania, recruited by AFWA
- Used the NCVS experience and the literature to guide a change in the cover
- Made other package changes
- Changed the incentives

What did we do to the cover? 1. Replaced the old cover with two new survey covers

What did we do—continued? 2. Changed the sponsorship

What did we do—continued? 3. Modified the envelope

What happened?
Response rates doubled in pretest #2. Candidate explanations:
- Cash incentive for a portion of the sample?
- Third mailing?
- The four states selected for pretest #2?
- Differences in cover?

Response Rate (RR1):
  Pretest #1 Screener – Overall (N=1,913): 16.63%
    AZ, KY, MO, PA: 19.05%
    All other states: 16.38% [1]
  Pretest #2 Screener (N=1,913): 30.40%
    Picture cover: 31.70%
    Drawing cover: 29.14% [2]
[1] t=-0.843, p=.40
[2] t=1.1583, p=.2469
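The footnoted t statistics compare screener response rates between conditions. For samples this large, a common equivalent is a two-proportion z-test on the pooled variance. A stdlib-only sketch follows; the counts in the usage line are hypothetical, since the slides report rates rather than per-condition cell sizes:

```python
import math

def two_proportion_ztest(x1, n1, x2, n2):
    """Two-sided z-test for a difference between two sample proportions
    (pooled-variance form). Returns (z, p_value)."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)           # common proportion under H0
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # Two-sided p-value from the standard normal CDF, computed via erf.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical counts roughly matching the picture- vs drawing-cover rates:
z, p = two_proportion_ztest(303, 956, 279, 957)
```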

What happened—continued?
Did changes to the sponsorship of the survey package matter? State sponsorship provided a more "official" sense to the package.

Response Rate, Main Study Screener:
  State agency logos and letters (N=4,000): 24.93%
  AFWA logo and letters (N=288,044): 21.98% [3]
[3] t=3.9083, p=0.00

Who returned the survey?
Package changes, including sponsorship, slightly increased return rates among non-participants between Pretest #1 and Pretest #2, but the difference was not statistically significant.

Return rates, Household Screener:
  Pretest #1: Participant 86.4%, Non-participant 13.6%
  Pretest #2: Participant 82.7%, Non-participant 17.3% [4]
[4] t=1.4119, p=.1583

What does this mean?
These analyses provide one strong finding and some suggestions that might shape survey efforts moving forward:
- Strong finding: an official look and sponsorship were associated with increased response rates.
- Suggestive findings: the possible difference in response between the photograph and line-drawing covers bears consideration; package design and sponsorship may yield different rates of return for participants and non-participants.

Contact Information Jessica Behm (JessicaBehm@Westat.com) Cynthia Helba, PhD (CynthiaHelba@Westat.com) Mina Muller (MinaMuller@Westat.com) Sherm Edwards (ShermEdwards@Westat.com) Regina Yudd, PhD (ReginaYudd@Westat.com)