The measurement effect in PC, smartphone and tablet surveys
Valerija Kolbas, University of Essex ISER / Ipsos MORI
The European Survey Research Association Conference, July 2015, Reykjavik

Background and motivation
Limiting surveys to the PC mode affects the size and representativeness of the sample.
PCs, smartphones and tablets differ in:
- screen size
- input method
- connection speed
- processing power
These differences can affect measurement error differently in each mode.

Background and motivation
Previous findings are mixed.
Smartphone:
- more break-offs
- longer completion times
- more straightlining
Tablet:
- less primacy effect
- less straightlining
- shorter or comparable completion times
Other indicators studied: question order effects, primacy effects, open-ended answers, response distributions.
However, a non-optimized mobile design affects completion rates and satisfaction with the survey.

Background and motivation
PC and mobile response distributions are equally affected by response formats:
- Drop-boxes: preference for the first options
- Grids: preference for the visible options; straightlining
No conclusive evidence on which format is better for mobiles.

Research Question
How do the mode of administration and the response format affect survey responses?
Indicators of measurement error (a sketch of computing two of them follows):
- overall satisfaction rates
- straightlining
- response distribution
- length of answers to open-ended questions
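Two of these indicators, straightlining and open-answer length, are easy to derive from the raw response matrix. A minimal sketch in Python, assuming a pandas DataFrame with one column per Likert item and a free-text comments column (all column names are hypothetical):

```python
import pandas as pd

# Hypothetical layout: 22 Likert items scored 1-5, plus one open-ended field.
likert_cols = [f"q{i:02d}" for i in range(1, 23)]

def straightlining_rate(df: pd.DataFrame) -> float:
    """Share of respondents who gave the identical answer to every Likert item."""
    return (df[likert_cols].nunique(axis=1) == 1).mean()

def mean_open_answer_length(df: pd.DataFrame) -> float:
    """Average word count of the open-ended comments."""
    return df["comments"].fillna("").str.split().str.len().mean()
```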

Survey and Questionnaire design
- National Satisfaction Survey 2014, administered to final-year higher education students in the UK
- 22 core questions using a 5-point Likert scale
- 2 open-ended questions
- Mixed-mode: self-selected mail, phone, web
- 5 response formats, randomly allocated (sketched below)
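As an illustration only, randomly allocating respondents to the five formats might look like the following sketch; the format labels, the uniform allocation probabilities, and the respondent count are assumptions, not details from the slides:

```python
import numpy as np

FORMATS = [
    "radio-button",
    "drop-box (blank)",
    "drop-box (middle initial option)",
    "drop-box (negative initial option)",
    "drop-box (positive initial option)",
]

rng = np.random.default_rng(2014)  # fixed seed for reproducibility
# One uniform draw per smartphone/tablet respondent (N = 3196 + 551).
allocation = rng.choice(FORMATS, size=3747)
```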

Sample composition
Web survey sample: N = 9276
- PC (N = 5529): radio-button format
- Smartphone (N = 3196) and tablet (N = 551): radio-button; drop-box (blank); drop-box (0: middle initial option); drop-box (-: negative initial option); drop-box (+: positive initial option)

Screenshots: response design
Drop-box: once clicked, a list of options appears on a separate screen. Responses always appear in the same order.

Screenshots: response design
All questions are visible on the screen; requires horizontal scrolling.
Portrait or landscape viewing; requires vertical scrolling.

Measurement Effect between PC, smartphone and tablet responses
Comparisons made across all three modes, but within a single (radio-button) response format

Measurement Effect between PC, smartphone and tablet
MANOVA to test for differences: F = 2.3, p < .05
- Straightlining: F = 3.9, p < .05 (PC 6.9%, smartphone 10.3%, tablet 6.4%)
- Modal responses: F < 1, p > .05 ('Mostly Agree'; 'Definitely Agree' & 'Mostly Agree')
- Mean values, positive feedback: F = 1.3, p > .05
- Mean values, negative feedback: F = 1, p > .05
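The slides do not name the analysis software; purely as an illustration, a one-way MANOVA of several quality indicators on device type could be run with statsmodels. The data below are synthetic stand-ins, since the study data are not shown:

```python
import numpy as np
import pandas as pd
from statsmodels.multivariate.manova import MANOVA

# Synthetic stand-in data: one row per respondent.
rng = np.random.default_rng(0)
n = 300
df = pd.DataFrame({
    "device": rng.choice(["PC", "smartphone", "tablet"], size=n),
    "straightline": rng.integers(0, 2, size=n).astype(float),  # 1 = straightliner
    "pos_feedback": rng.normal(4.1, 0.5, size=n),
    "neg_feedback": rng.normal(3.9, 0.5, size=n),
})

mv = MANOVA.from_formula(
    "straightline + pos_feedback + neg_feedback ~ device", data=df
)
print(mv.mv_test())  # Wilks' lambda, Pillai's trace, F and p per effect
```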

ME between PC, smartphone and tablet: summary
- Smartphone straightlining significantly higher
- Tablet straightlining rate the lowest
- Signs of a visibility effect for smartphones
- Other quality indicators comparable across all three modes

Measurement Error between different response designs presented on a smartphone and a tablet
Comparisons made across five survey response formats, within the smartphone and tablet modes

MANOVA test of differences
- device: F = 2.7, p < .05
- format: F = 1.7, p < .05
- interaction: (values not shown)
Both device and format affected data quality.
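The crossed device-by-format design maps onto the same statsmodels API: the formula term device * format expands to both main effects plus their interaction, each of which gets its own multivariate test. Again a sketch on synthetic data, with hypothetical column names:

```python
import numpy as np
import pandas as pd
from statsmodels.multivariate.manova import MANOVA

rng = np.random.default_rng(1)
n = 600
df = pd.DataFrame({
    "device": rng.choice(["smartphone", "tablet"], size=n),
    "format": rng.choice(["radio", "db_blank", "db_mid", "db_neg", "db_pos"], size=n),
    "straightline": rng.integers(0, 2, size=n).astype(float),
    "pos_feedback": rng.normal(4.1, 0.5, size=n),
})

# 'device * format' = device + format + device:format (the interaction).
mv = MANOVA.from_formula("straightline + pos_feedback ~ device * format", data=df)
print(mv.mv_test())
```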

Responses in Drop-box with a positive initial option
Smartphones:
- higher selection of the initially suggested response
- fewer moderately positive responses
- 11.1% straightliners
- mean 4.3
Tablets:
- similar selection of extreme and moderately positive responses
- more negative responses
- 5.6% straightliners
- mean 4.1

Responses in Drop-box with a negative initial option
Smartphones:
- more extreme negative responses
- fewer positive responses
- 8.4% straightliners
- mean 4.0
Tablets:
- more positive responses
- extremely low 'Definitely Disagree' frequency
- 2.5% straightliners
- mean 4.1

Responses in Drop-box with a middle initial option
Smartphone and tablet results are comparable:
- weak evidence of selecting the middle option
- 'Mostly Agree/Disagree' the most frequent selection across formats
- 6.5% and 6.1% straightliners
- means 4.0 and 3.9
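Shifts in the response distribution like those described above can be checked with a chi-square test on the per-format category counts; scipy provides this directly. The counts below are made up for illustration:

```python
import numpy as np
from scipy.stats import chi2_contingency

# Made-up counts: rows = response format, columns = Likert categories 1..5.
counts = np.array([
    [30, 45, 120, 400, 250],  # radio-button
    [25, 40, 100, 350, 330],  # drop-box with a positive initial option
])
chi2, p, dof, expected = chi2_contingency(counts)
print(f"chi2 = {chi2:.1f}, df = {dof}, p = {p:.4f}")
```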

ME indicators: summary
- Smartphone: answers affected by response formats; the initially suggested response is selected more often
- Tablet: no strong effect of response formats
- No significant differences in the length of open answers between formats

Potential Limitations
- No reverse-coding
- Similar question wording
- Question and response order not counterbalanced
- Instructions universal across response formats
- Self-selected device condition
- Survey sample: highly educated, IT-literate, of similar age, highly motivated

Thank You