Measuring and understanding response quality in the SBS questionnaires
Deirdre Giesen & Joep Burger, EESW, 9-11 September 2013

Outline
– Why this research project?
– Data used
– Quality indicators used
– Results
– Issues for discussion

Purpose of the research project
1. Develop quality indicators for the quality of the "raw" data.
2. Explore how this quality is related to:
– characteristics of the respondent
– paradata on the response process
– characteristics of the data collection design
– perceived response burden

Possible uses of such indicators
– Allow a quick response during data collection
– Patterns in measurement error may guide adjustments to the data collection design

Data used
– Raw data from the Structural Business Survey (SBS), Manufacturing & Commercial Services: about 28 thousand responses annually; new design since SBS 2006
– Data from the Customer Satisfaction Survey among SBS respondents: SBS 2006 (n = 1,262) & SBS 2007 (n = 1,468)

Quality indicator 1: item response
Restricted to 5 core variables for which empty fields are not plausible:
1. Number of persons working
2. Persons working in full-time equivalents (FTE)
3. Total income
4. Total costs
5. Result (income minus costs)
Assumption: higher item response = higher quality
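As an illustration only, a minimal sketch (in Python/pandas, not the authors' code) of how item response on these five core variables could be computed; the column names are hypothetical.

```python
import pandas as pd

# Hypothetical column names for the five core variables.
CORE_VARS = ["persons_working", "fte", "total_income", "total_costs", "result"]

def item_response_rate(responses: pd.DataFrame) -> pd.Series:
    """Share of non-empty fields per core variable, across all responses."""
    return responses[CORE_VARS].notna().mean()

# Toy example: one response misses FTE, another misses the result.
toy = pd.DataFrame({
    "persons_working": [12, 5, 300],
    "fte":             [10.5, None, 280.0],
    "total_income":    [900, 150, 52_000],
    "total_costs":     [700, 120, 48_000],
    "result":          [200, 30, None],
})
print(item_response_rate(toy))  # fte and result: filled in 2 of 3 rows; the others: 1.0
```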

Quality indicator 2: consistency
Twelve (partly related) consistency rules defined, e.g.:
1a. If there is a value for # persons employed, then there is a value for FTE
1b. If there is a value for FTE, then there is a value for # persons
2. If there are values for # persons employed & FTE employed, then # persons employed <= FTE employed
Assumption: higher consistency = higher quality
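A sketch, under the same hypothetical column names, of how rules 1a, 1b and 2 could be written as row-level checks and pooled into a consistency rate; only the three example rules quoted above are shown, not all twelve.

```python
import pandas as pd

def rule_1a(row) -> bool:
    # If # persons employed is filled, then FTE must also be filled.
    return pd.isna(row["persons_working"]) or pd.notna(row["fte"])

def rule_1b(row) -> bool:
    # If FTE is filled, then # persons must also be filled.
    return pd.isna(row["fte"]) or pd.notna(row["persons_working"])

def rule_2(row) -> bool:
    # If both are filled, they must satisfy the ordering stated on the slide.
    if pd.notna(row["persons_working"]) and pd.notna(row["fte"]):
        return row["persons_working"] <= row["fte"]
    return True  # rule does not apply when either field is empty

RULES = [rule_1a, rule_1b, rule_2]

def consistency_rate(responses: pd.DataFrame) -> float:
    """Share of rule evaluations that pass, pooled over rules and responses."""
    outcomes = [rule(row) for _, row in responses.iterrows() for rule in RULES]
    return sum(outcomes) / len(outcomes)
```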

Quality indicator 3: use of balancing item
– "Other costs" should only be a small part of total costs
– Electronic forms may increase use of "other costs"
Assumption: lower % of total costs assigned to "other costs" = higher quality
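A sketch of the balancing-item indicator as a simple ratio of "other costs" to total costs; again the column names are hypothetical.

```python
import pandas as pd

def other_costs_share(responses: pd.DataFrame) -> pd.Series:
    """Per response, the fraction of total costs assigned to 'other costs'."""
    return responses["other_costs"] / responses["total_costs"]

toy = pd.DataFrame({"other_costs": [50, 400], "total_costs": [700, 1_000]})
print(other_costs_share(toy).mean())  # a lower average share is read as higher quality
```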

Background characteristics
– Size class
– Type of industry
– Timeliness (in time, = 2 months late)
– New design
– Electronic or paper questionnaire
– Perceived burden (easy / difficult, little work / much work)
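A sketch of how a quality indicator such as item response could be broken down by these background characteristics, assuming both are available as columns of one response-level table; all variable names are hypothetical.

```python
import pandas as pd

def quality_by_background(responses: pd.DataFrame) -> pd.DataFrame:
    """Mean item response by size class and questionnaire mode."""
    # item_response: per-response share of the five core variables that are filled
    # size_class, mode: background characteristics stored as categorical columns
    return (
        responses
        .groupby(["size_class", "mode"])["item_response"]
        .mean()
        .unstack("mode")
    )
```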

Item response by time and size class

Rule consistency rate by time and size class

Use of balancing item by time and size class (left: % using the item; right: relative value)

First impressions of the findings
– Overall quality on the indicators studied is high
– Overall, not many systematic and large differences in quality for the characteristics studied (timeliness, new design, type of industry, mode)
– Smaller businesses show lower quality than larger businesses
– Electronic mode has (probably due to automatic calculation) some positive effects on item response and consistency; no effect on use of "other costs"
– The new design has decreased item response for FTE and increased item response for # working persons

Issues for discussion
– Any similar studies done? What were your findings?
– Possibly the quality indicators studied do not discriminate enough?
– Possibly our background variables are not so relevant for explaining differences in quality, but what else do we have readily available?
– Your ideas on our ideas for next steps: develop questionnaire-specific indicators; check validity of the indicators against a "gold standard"; construct an overall quality indicator

Old design: # PR, FTE PR, FTE WP, # WP

New design: # PR, FTE PR, # WP, FTE WP

E-questions on working persons, screen 1 of 3: # PR

E-questions on working persons, screen 2 of 3: # WP

E-questions on working persons, screen 3 of 3: FTE PR, FTE WP

% item response (IR), PS 2003–2005 (n = 81,000); PR = payroll, WP = working persons, FTE = full-time equivalent
# PR: 77%, FTE PR: 72%, FTE WP: 84%, # WP: 72%

% IR, 2006 & 2007 (n = 58,000); PR = payroll, WP = working persons, FTE = full-time equivalent
# PR: 76%, # WP: 88%, FTE WP: 69%, FTE PR: 71%