Qualitative Framework for Iterative Development of NCSES's Microbusiness Survey

Jennifer Crafts, Westat
Audrey Kindlon, National Science Foundation, National Center for Science and Engineering Statistics
Brad Chaney, Westat

QDET2, Miami, November 2016

Topics
- Survey sponsor
- Overview of the Microbusiness, Innovation, Science and Technology (MIST) survey
- NSF approach to survey development
- Qualitative framework for development and testing
- Research objectives
- Methods
- Recommendations

The National Center for Science and Engineering Statistics (NCSES*)

One of the few named "entities" in the National Science Foundation (NSF) with stated responsibilities "...to provide a central clearinghouse for the collection, interpretation, and analysis of data on scientific and engineering resources and to provide a source of information for policy formulation by other agencies of the Federal Government..."

*Created by the America COMPETES Reauthorization Act of 2010 (December 6, 2010) as successor to the Division of Science Resources Statistics (SRS)

Overview of the MIST Survey

Rationale: The National Academy of Sciences' Committee on National Statistics (CNSTAT) recommended developing a new survey to collect data on R&D and innovation

Population: Very small (i.e., micro), independent U.S. businesses with fewer than five employees

Planned modes: Web and paper

Planned content:
- Research & Development activities and expenditures
- Innovation, intellectual property, and technology transfer
- Entrepreneurial strategies
- Entrepreneur demographic and workforce characteristics

NCSES’s Approach to Survey Development Data user workshops to elicit data needs Input from expert panels Respondent-centered development methods Iterative test/revise cycles

Research Objectives for Development and Testing
- Assess, across respondents:
  - Motivation to respond
  - Relevance of questionnaire topics
  - Feasibility of response
  - Question comprehension
  - Ease and accuracy of response without record check
- Identify causes of reporting issues and errors
- Observe navigability

Survey Development and Testing Methods

Method                               # Interviews                              Format
Exploratory interviews               17 (in 3 rounds)                          In-person, online
Cognitive interviews                 21 (in 3 rounds; plus 4 re-interviews)
Debriefing interviews (Pretest)      20                                        Online
Usability test sessions              9 (2 rounds; plus 5 re-tests)
Debriefing interviews (Pilot Test)   26 (in 2 phases)
Total                                93 (plus 9 re-contacts)
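As a sanity check on the counts above, here is a minimal Python sketch (the variable names are illustrative, not from the deck) that encodes the table and verifies the totals:

```python
# Interview counts by method, as shown in the table above.
interviews = {
    "Exploratory interviews": 17,              # 3 rounds
    "Cognitive interviews": 21,                # 3 rounds, plus 4 re-interviews
    "Debriefing interviews (Pretest)": 20,
    "Usability test sessions": 9,              # 2 rounds, plus 5 re-tests
    "Debriefing interviews (Pilot Test)": 26,  # 2 phases
}
re_contacts = {"Cognitive interviews": 4, "Usability test sessions": 5}

total = sum(interviews.values())               # 17 + 21 + 20 + 9 + 26
total_recontacts = sum(re_contacts.values())   # 4 + 5
assert total == 93 and total_recontacts == 9
print(f"Total: {total} interviews (plus {total_recontacts} re-contacts)")
```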

Research Objectives by Method

[Matrix slide, built up over the following slides: rows are the five methods (exploratory interviews; cognitive interviews; debriefing interviews, Pretest; usability testing sessions; debriefing interviews, Pilot test); columns are the seven research objectives (motivation to respond; relevance of questionnaire topics; feasibility of response; question comprehension; ease and accuracy of response; reporting issues and errors; navigability).]

Exploratory Interviews

[Matrix excerpt: objectives in view are motivation to respond, relevance of questionnaire topics, and feasibility of response; XX = primary focus during interviews.]

Interview Procedure
- Interviewer and respondent explored topics and reviewed borrowed and new questions to address relevance and feasibility

Respondent Issues
- Topics were not relevant
- Key concepts (R&D, innovation, number of employees) were difficult to understand and answer about

Cognitive Interviews

[Matrix excerpt: objectives in view are motivation to respond, relevance of questionnaire topics, feasibility of response, question comprehension, and ease and accuracy of response; XX = primary focus.]

Interview Procedure
- Respondent: Completed the survey
- Interviewer: Observed, then administered a retrospective debriefing

Respondent Issues
- Although some changes worked, comprehension issues remained
- Respondents skipped instructions and definitions
- Respondents over-reported research and innovative activities

Original Approach for Asking About R&D

Revised Approach for Asking About R&D

Debriefing Interviews (Pretest)

[Matrix excerpt: objectives in view run from motivation to respond through reporting issues and errors; XX = primary focus, X = continue to monitor during interviews.]

Interview Procedure
- Interviewer: Shared the screen to show the completed survey; probed on motivation to respond, on sections/questions the respondent mentioned as difficult, and on inconsistent or atypical responses

Respondent Issues
- Reporting errors
- Inconsistent responses
- Item non-response

Usability Testing Sessions

[Matrix excerpt: objectives in view now include navigability; XX = primary focus, X = continue to monitor.]

Test Procedure
- Respondent: Completed the survey (face-to-face or via shared screen)
- Moderator: Observed; reviewed survey responses and administered a retrospective debriefing

Respondent Issues
- Labels for navigation options were misunderstood
- Error message text was misunderstood

Debriefing Interviews (Pilot Test)

[Matrix excerpt: XX = primary focus, X = continue to monitor; navigability assessed for web responders.]

Interview Procedure
- Interviewer: Shared the screen to show the completed survey; probed on motivation to respond, on difficult sections/questions, and on inconsistent or atypical responses

Respondent Issues
- The questionnaire itself was easy to understand and answer
- However, motivation to respond was problematic

Methods and Research Objectives

[Summary matrix: the five methods (rows) by the seven research objectives (columns); XX = primary focus, X = continue to monitor; for the pilot-test debriefing interviews, navigability applied to web responders.]
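A matrix like this can be kept machine-checkable during planning. The Python sketch below represents it as sets per method; the XX/X placements are illustrative reconstructions from the per-method slides above, not the deck's exact grid:

```python
# Methods mapped to research objectives. Primary-focus (XX) and
# monitored (X) placements are illustrative reconstructions from the
# per-method slides, not the exact marks in the original matrix.
OBJECTIVES = {
    "motivation", "relevance", "feasibility", "comprehension",
    "ease/accuracy", "reporting errors", "navigability",
}

primary = {  # XX: primary focus
    "exploratory interviews": {"relevance", "feasibility"},
    "cognitive interviews":   {"comprehension", "ease/accuracy"},
    "debriefing (pretest)":   {"reporting errors"},
    "usability testing":      {"navigability"},
    "debriefing (pilot)":     {"motivation"},
}
monitored = {  # X: continue to monitor
    "debriefing (pretest)":   {"motivation", "comprehension"},
    "usability testing":      {"comprehension", "ease/accuracy"},
    "debriefing (pilot)":     {"comprehension", "navigability"},
}

# Planning check: every objective should be the primary focus of at
# least one method in the design.
uncovered = OBJECTIVES - set().union(*primary.values())
print("Objectives without a primary-focus method:", uncovered or "none")
```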

Recommendations
- Plan for involvement of the respondent population throughout the development/redesign process
- Ensure breadth of representation
- Use multiple methods to elicit respondent input
- Define research objectives for each method
- Plan for iterative, small-scale testing
- Address feasibility early
- Continuously assess motivation to respond, comprehension, and reporting accuracy
- If possible, test until revisions work well for respondents (saturation) and no new significant issues surface (see the sketch below)
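The final recommendation amounts to a stopping rule. The hypothetical Python sketch below shows the iterate-until-saturation logic; the issue sets stand in for findings from successive testing rounds and are not from the deck:

```python
# Sketch of the "test until saturation" stopping rule: run small-scale
# rounds, revise after each, and stop once a round surfaces no new
# significant issues. The findings below are hypothetical stand-ins.
rounds_of_findings = [
    {"R&D definition unclear", "employee count ambiguous"},  # round 1
    {"R&D definition unclear", "instructions skipped"},      # round 2
    set(),                                                   # round 3
]

seen = set()
for i, findings in enumerate(rounds_of_findings, start=1):
    new = findings - seen
    print(f"Round {i}: {len(new)} new significant issue(s)")
    if not new:                     # saturation: nothing new surfaced
        print(f"Saturation reached after {i} round(s)")
        break
    seen |= new
    # ...revise the questionnaire to address `new` before the next round
```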

Questions?

Contacts:
Jennifer Crafts: jennifercrafts@westat.com
Audrey Kindlon: akindlon@nsf.gov
Brad Chaney: bradchaney@westat.com

www.nsf.gov/statistics/