Improving Survey Response Rates: The Effect of Embedded Questions in Web Survey Email Invitations
Nick Inchausti, SurveyMonkey
Mingnan Liu, Facebook

Response Rates
Response rates are a critical indicator of data quality for surveys, and as survey researchers we are always striving to get as close to 100% response as possible.
For mixed-mode and web surveys, email invitations are widely used as the first point of contact with respondents.
We wanted to find out: what happens if data collection begins in the initial survey invitation itself?

Literature Review
Email personalization sometimes boosts response rates (Heerwegh, 2005), but sometimes has no effect (Porter & Whitcomb, 2003).
A white background and a simple header produced higher response rates than other design conditions (Whitcomb & Porter, 2004).
Mentioning the purpose of the email (a request for survey participation) and the survey sponsor in the subject line also affected participation (Porter & Whitcomb, 2005).
Several other factors, including email length, placement of the URL, and the estimated time to complete the survey, have also been tested in experiments aimed at improving participation and response rates (Kaplowitz, Lupi, Couper, & Thorp, 2012; Keusch, 2012; Trouteaud, 2004).

Research Question
New product feature: the ability to embed a survey question within an email invitation.
Research question: Do survey invitations that include an embedded question have higher response rates than standard invitations?
Outcomes examined:
Click rate
Completion rate
Data quality check: comparison of responses to the first question across conditions

Embedded Question
Screenshot: embedded survey invitation email
Screenshot: standard survey invitation email

Experiment Design
Sample frame: SurveyMonkey customers who had agreed to participate in research projects and had provided their email addresses.
Data collected July 27 – August 8, 2016.
Initial email invitation to complete a survey, with a follow-up reminder 4 days later.
Random assignment into one of two conditions: standard invitation or embedded first question (a sketch of such an assignment appears after this slide).
4,333 valid emails in the embedded condition; 4,347 valid emails in the standard condition.
13-question survey, identical in each condition, covering: experience with SurveyMonkey, satisfaction with the survey platform, and interest in additional features.
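The slides do not show how respondents were assigned to conditions; a minimal sketch of a simple 50/50 randomization of an email list, in Python, might look like the following (the email list and the even split are assumptions for illustration):

```python
import random

def assign_conditions(emails, seed=42):
    """Randomly split a list of email addresses into two invitation conditions.

    Returns a dict mapping each email to "embedded" or "standard". The 50/50
    split mirrors the roughly equal group sizes reported (4,333 vs. 4,347
    valid emails); the seed is only for reproducibility.
    """
    rng = random.Random(seed)
    shuffled = emails[:]            # copy so the caller's list is untouched
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    return {
        **{e: "embedded" for e in shuffled[:half]},
        **{e: "standard" for e in shuffled[half:]},
    }

if __name__ == "__main__":
    sample_frame = [f"user{i}@example.com" for i in range(10)]  # hypothetical frame
    print(assign_conditions(sample_frame))
```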

Results – Improved email click rate
Higher click rate for the embedded invitation than for the standard invitation (32.0% vs. 26.2%), statistically significant at p < .001 (a sketch of the corresponding significance test follows this slide).
In other words, respondents in the embedded condition were more likely to click on the embedded question and start the survey than respondents in the standard condition were to click on the "Begin survey" button.
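The slides report only the rates and the p-value, not the test used; a two-sample test of proportions on the click counts from the table later in the deck reproduces the result. Using statsmodels' proportions_ztest here is an assumption about tooling, not the authors' stated method:

```python
from statsmodels.stats.proportion import proportions_ztest

# Clicks and valid emails per condition (embedded, standard),
# using the counts reported in the drop-out table later in the deck.
clicks = [1388, 1141]
sent = [4333, 4347]

z_stat, p_value = proportions_ztest(count=clicks, nobs=sent)
print(f"click rates: {clicks[0] / sent[0]:.1%} vs. {clicks[1] / sent[1]:.1%}")
print(f"z = {z_stat:.2f}, p = {p_value:.2g}")  # p falls far below .001, consistent with the slide
```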

Results – Improved survey completion rate
Higher completion rate for the embedded survey than for the standard survey (29.1% vs. 24.4%), statistically significant at p < .001.
Completion rate = # completed / # sent

Results – Small negative effect on survey drop-out rate

Condition    Valid emails    Clicked    Completed    Completed / Clicked
Embedded     4,333           1,388      1,261        90.8%
Standard     4,347           1,141      1,059        92.8%

The proportion of respondents who completed the survey, out of those who clicked into it, was slightly higher in the standard condition, meaning the embedded version had a slightly higher drop-out rate. However, this difference is not statistically significant (p = .07); a sketch of the comparison follows below.
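The deck likewise does not name the test behind p = .07; a pooled two-proportion z-test on completions out of clicks, worked out by hand below, gives a p-value of about that size (the choice of test is an assumption):

```python
from math import sqrt
from scipy.stats import norm

# Completions out of clicks per condition, from the table above.
comp_e, click_e = 1261, 1388   # embedded
comp_s, click_s = 1059, 1141   # standard

p_e, p_s = comp_e / click_e, comp_s / click_s
p_pool = (comp_e + comp_s) / (click_e + click_s)                 # pooled proportion
se = sqrt(p_pool * (1 - p_pool) * (1 / click_e + 1 / click_s))   # pooled standard error
z = (p_s - p_e) / se
p_value = 2 * (1 - norm.cdf(abs(z)))

print(f"completed/clicked: {p_e:.1%} (embedded) vs. {p_s:.1%} (standard)")
print(f"z = {z:.2f}, two-sided p = {p_value:.3f}")  # roughly .07, in line with the slide
```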

Results – No effect on data quality
Do we get different responses when we ask a question embedded in an email vs. in the survey itself? No.
The ratio of the two NPS scores (embedded vs. standard invitations) was 0.98, suggesting responses to the first question were nearly identical across the two conditions (see the sketch below).
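The slides do not spell out the NPS calculation; it is conventionally computed from 0–10 ratings as the percentage of promoters (9–10) minus the percentage of detractors (0–6). A minimal sketch, with made-up ratings standing in for the unpublished response data:

```python
def net_promoter_score(ratings):
    """Compute NPS from 0-10 ratings: % promoters (9-10) minus % detractors (0-6)."""
    n = len(ratings)
    promoters = sum(r >= 9 for r in ratings)
    detractors = sum(r <= 6 for r in ratings)
    return 100 * (promoters - detractors) / n

# Hypothetical ratings for illustration only; the study's response data are not published.
embedded_ratings = [10, 9, 8, 7, 9, 10, 6, 9, 8, 10]
standard_ratings = [9, 10, 8, 7, 9, 9, 6, 10, 8, 9]

nps_embedded = net_promoter_score(embedded_ratings)
nps_standard = net_promoter_score(standard_ratings)
# With the real data, the slide reports a ratio of 0.98 between the two conditions.
print(f"NPS ratio (embedded / standard): {nps_embedded / nps_standard:.2f}")
```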

Summary of Results
Using an email survey invitation with the first question embedded:
Improves the email click rate
Improves the survey completion rate
Has only a small negative effect on the survey drop-out rate
Has no effect on data quality, in terms of responses to the first (embedded) question

Discussion
Overall, a successful test of the new feature.
Additional advantage: even if respondents drop out of the survey, their answer to the first question is still recorded in the embedded condition. In the standard condition, if respondents drop out before completing the first page, all of their data are lost.
Future research opportunities:
Embedding more than one question (or the whole survey?) in the email itself
Experimenting with different survey lengths and question types
Testing whether the effect holds with a different population of respondents

Thank you
Contact us at: Nicki@surveymonkey.com, Mingnanliu@fb.com