ESSnet on linking of micro-data on ICT usage

Presentation transcript:

ESSnet on linking of micro-data on ICT usage
Progress Report
Mark Franklin, UK Office for National Statistics
Cologne: 27 October 2011

Agenda
- Context: What is the project about? Where does the project sit in the statistical system?
- Brief overview of project: building on the Feasibility Study
- Some project issues
- Q & A

Purpose of project
Making better use of data existing in the statistical system:
- Produce new policy-relevant indicators without the need to collect more data and without increasing the burden on enterprises.
- Re-use data for purposes beyond the initial objectives for collecting such data.
- Focus on economic impacts of ICT usage, but the methodology can be generalised to a range of policy issues and data sources.

Where does the project sit in the statistical system?
- Project indicators are examples of distributed micro data, or "meso" data.
- Meso data sit between macro and micro forms of data.
- Illustrated by the data-generating processes that follow.

Macro: data process
[Diagram: surveys and admin data feed a "black box" compilation process (judgements, adding-up constraints etc) that produces macro indicators.]
- Macro indicators (national accounts, trade, inflation, public finances etc) cannot be reproduced purely from survey data.
- Macro indicators are contingent on national accounts conventions (SNA, ESA), e.g. GFCF asset classes.
- Macro indicators are rich in structure and consistency (with other indicators, and with other countries' data), but poor in detail.

Micro: data process
[Diagram: run survey → clean data, re-weight etc → published micro indicators; in some NSIs the micro dataset is also made available to researchers in a safe centre.]
- Micro indicators can in principle be reproduced purely from survey data.
- Micro indicators are contingent on survey design, e.g. the E-Commerce survey.
- Micro datasets are rich in detail, poor in consistency and structure. In particular, cross-country analysis of microdata is difficult.

Meso: data process
[Diagram: national surveys from Country #1, Country #2, Country #3 … feed common data-generating code run in each country, producing meso indicators for Country #1, Country #2, Country #3 ….]
- Meso indicators can be reproduced purely from (micro-data versions of) survey data.
- Design is contingent on survey design and informed by policy relevance, e.g. ICT usage characteristics of firms by quartile of productivity.
- Cut survey data by industry, size class, whether multinational, young/old etc.
- Exploits the richness of firm-level variation, yet is consistent between countries.
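To make the "common data-generating code" idea concrete, the following is a minimal sketch (in Python/pandas, purely for illustration; the project's shared code is not reproduced here) of the kind of routine each NSI could run on its own firm-level microdata, sharing only the aggregated meso output. The column names (industry, tfp, broadband) and the within-industry quartile grouping are assumptions, not the project's actual variable definitions.

```python
# Illustrative only: each NSI runs the same routine on its own firm-level
# microdata and shares only the aggregated (meso) output.
import pandas as pd

def build_meso_indicators(firms: pd.DataFrame, country: str) -> pd.DataFrame:
    """Aggregate firm-level data to industry x TFP-quartile cells."""
    firms = firms.copy()
    # Assign each firm to a quartile of the TFP distribution within its industry
    firms["tfp_quartile"] = (
        firms.groupby("industry")["tfp"]
             .transform(lambda s: pd.qcut(s, 4, labels=[1, 2, 3, 4]))
    )
    meso = (
        firms.groupby(["industry", "tfp_quartile"], observed=True)
             .agg(n_firms=("tfp", "size"),
                  mean_broadband=("broadband", "mean"))  # e.g. fast-internet penetration
             .reset_index()
    )
    meso.insert(0, "country", country)
    return meso

# Each country would then run something like:
# meso_uk = build_meso_indicators(uk_firm_data, "UK")
# meso_uk.to_csv("meso_indicators_UK.csv", index=False)  # sent to the shared platform
```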

Example: Should governments subsidise investment in broadband networks?
- Evidence-based policy making: we need evidence on the relationship between broadband access and firm performance across a group of countries.
- We could design a new survey to investigate the relationship (Q1: Do you have access to broadband? Q2: What is your growth of turnover/employment? …), but this would be costly, time consuming, difficult to co-ordinate across countries, and would add to the "red tape" burden on survey respondents.
- What's wrong with using "macro" indicators? Not the same firms!
- What's wrong with using "micro" indicators? Impacts of policy changes cannot be identified from a single-country study, and multi-country micro studies are rarer than hen's teeth.
Structural / "micro" policy is also referred to as "supply-side" economics or "Reaganomics": it is concerned with improving the performance of individual firms, with getting more output from scarce resources, with improving productivity. Why a group of countries? Because we need to control for other changes, like a control group in a drugs trial.

Meso: Indicators
MexElec = manufacturing excluding electricals. This slide shows data from the industry/country datasets: in this case the mean of fast-internet penetration (DSLPCT) by quartile of the TFP distribution in a single year (2004), ranked by average penetration. Interpretation: higher productivity quartiles of firms generally display higher DSLPCT, but this is not always the case, and there is substantial variation in the level of penetration across the sample.

Meso: analysis
Meso = micro plus re-allocation (including the dynamic process by which high-performing firms grow at the expense of low-performing firms). Each dot is an observation on an industry/country/year, showing productivity against the share of workers with access to broadband. Regressions show positive and significant coefficients between a range of ICT indicators and productivity at the industry level.
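A hedged sketch of the kind of regression the slide alludes to, assuming a meso panel with one row per industry/country/year and hypothetical column names (log_productivity, broadband_share); the project's actual specification and estimation software are not reproduced here.

```python
# Illustrative pooled regression of (log) productivity on broadband penetration
# at the industry/country/year level, with country, industry and year effects.
import pandas as pd
import statsmodels.formula.api as smf

def run_meso_regression(meso: pd.DataFrame):
    model = smf.ols(
        "log_productivity ~ broadband_share + C(country) + C(industry) + C(year)",
        data=meso,
    )
    return model.fit(cov_type="HC1")  # heteroskedasticity-robust standard errors

# print(run_meso_regression(meso_panel).summary())
```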

Project Overview
- 15 NSIs; a steering group of 5 NSIs makes recommendations to the whole group.
- 22 months: December 2010 – October 2012.
- 2 contracted academic partners, plus liaison with other research bodies.
- 7 workstreams:
  (a) Co-ordination and financial management (ONS)
  (b) Metadata review (ONS)
  (c) (Lessons for) survey strategies (Stats Norway)
  (d) Impact analysis (CBS)
  (e) Dissemination (ONS)
  (f) Technical infrastructure (Stats Sweden)
  (g) Data dissemination (CBS)
Outer circle – basics: metadata stage, data assembly and cleaning, mapping to code, code execution, QA of outputs. Middle circle – workstream leaders (data sharing), research leads and champions/mentors; draft sections of the project report. Inner circle – project steering group: decision-making, strategic guidance.

Builds on the 2006–08 Feasibility Study, with broader scope:
- More participants
- Longer runs of annual datasets
- New datasets, in particular the Community Innovation Survey
- Develop and generate meso indicators, and conduct some exploratory analysis of ICT impacts using these indicators
- Develop a schema for providing access to indicators
- Explore lessons for survey strategies.

Project Issues - 1: Choice of indicators and data boundaries (workstreams (b) and (d))
- E-commerce variables: a range of different views across the project group over which variables are most relevant.
- CIS variables: an initial set of indicators has been agreed by the analytical steering group, coded by the academic contractor, and is being tested by the steering group.
- Cycling through metadata, indicators and analysis is a time-consuming process.

Project Issues - 2: Data sharing (workstreams (f) and (g))
- Meso indicators are not micro-data, but they are derived from micro-data and are hence subject to disclosure control.
- Two dimensions to this issue:
  - Internal: a secure FTP platform on which cross-country meso indicators are compiled; access is restricted to the analytical steering group, subject to confidentiality agreements.
  - External: develop a protocol under which the cross-country meso indicators could be made available to outside researchers, and beyond the life of this project.
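Disclosure control for shared indicators could, for example, include a minimum cell-count rule of the kind sketched below. This is a generic illustration, not the project's actual rule: the threshold, the column names and the suppression method are assumptions, and real rules are set by each NSI.

```python
# Illustrative primary suppression: blank out meso cells based on too few firms
# before the indicators leave the NSI.
import pandas as pd

MIN_FIRMS = 10  # assumed threshold, not the project's actual rule

def apply_primary_suppression(meso: pd.DataFrame) -> pd.DataFrame:
    out = meso.copy()
    too_small = out["n_firms"] < MIN_FIRMS
    # Keep the cell but remove its indicator values wherever the count is too low
    value_cols = [c for c in out.columns
                  if c not in ("country", "industry", "tfp_quartile", "n_firms")]
    out.loc[too_small, value_cols] = float("nan")
    return out
```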

Any questions?

Mark Franklin
Economic Interpretation Division, Office for National Statistics
Mark.Franklin@ons.gov.uk
+44 (0)1633 455981
This work contains statistical data from ONS which is Crown copyright and reproduced with the permission of the controller of HMSO and Queen's Printer for Scotland. The use of the ONS statistical data in this work does not imply the endorsement of the ONS in relation to the interpretation or analysis of the statistical data. This work uses research datasets which may not exactly reproduce National Statistics aggregates.

Blank slide

Feasibility study on national survey strategies (Workstream C, led by Statistics Norway)
Objectives:
- Carry out a feasibility study on the redesign of national survey strategies, including a study of existing practices.
- Present strategies for improving data representativeness, including their cost-benefit analysis.
- The study will cover linked datasets provided by the participating NSIs.
Components:
- Analysis of existing surveys and practices to improve representativeness of linked data.
- Presentation of the main challenges to data linking.
- Ways to improve representativeness of the linked data.
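One simple representativeness check that such a study might use is to compare the linked dataset with the business-register population stratum by stratum. The sketch below assumes hypothetical register and linked-data tables with employment and size_class columns; it is not part of the workstream's actual methodology.

```python
# Illustrative coverage check: what share of firms and of employment in each
# size class does the linked dataset capture, relative to the register?
import pandas as pd

def coverage_by_stratum(register: pd.DataFrame, linked: pd.DataFrame) -> pd.DataFrame:
    pop = register.groupby("size_class")["employment"].agg(pop_firms="size", pop_emp="sum")
    sam = linked.groupby("size_class")["employment"].agg(linked_firms="size", linked_emp="sum")
    cov = pop.join(sam, how="left").fillna(0)
    cov["firm_coverage"] = cov["linked_firms"] / cov["pop_firms"]
    cov["employment_coverage"] = cov["linked_emp"] / cov["pop_emp"]
    return cov.reset_index()
```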

Project time line
The approach comes in three main stages.

1. Metadata checking. Metadata is simply "data about data". For example, the metadata collected about variables held by countries includes quotes of survey questions, the numerical range of the variable, the unit of the variable, and information on the source survey (frequency, sampling frame etc). This is a crucial process, because if the same piece of code is to be run in every country then it must know what the characteristics of the data are. More information later.

2. Code development and running. This is a discussion process with project members; the most productive development takes place at face-to-face meetings. The project has had a number of meetings where representatives from all of the countries and the project academic gather to make decisions on the content of the core code and themes: for example, what analysis should the project be focussing on? What variables should be merged? What are the key variables to be studied? The code is distributed to the project members, who then run it on their individual firm-level data, and results are reported to a central location via a secure server. Developing the code is a cyclical process, in that after the code is run there is a stage of fixing bugs and re-releasing the code (sometimes with further analytical capabilities); for example, we are now using version 2.3 of the code.

3. Analysis and reporting of data.
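Because the same code must run against every country's data, the metadata-checking stage can be partly automated. The following is a small sketch of what such a check might look like, assuming a simple dictionary of variable metadata (unit, allowed range); the structure is hypothetical and the project's actual metadata templates will differ.

```python
# Illustrative metadata check run before the common code: verify that the
# expected variables exist and that their values fall within the documented range.
import pandas as pd

METADATA = {  # hypothetical entries, not the project's metadata template
    "broadband": {"unit": "share of workers (0-1)", "min": 0.0, "max": 1.0},
    "turnover":  {"unit": "thousand EUR",           "min": 0.0, "max": None},
}

def check_against_metadata(firms: pd.DataFrame) -> list:
    """Return a list of problems found, so the common code can refuse to run."""
    problems = []
    for var, spec in METADATA.items():
        if var not in firms.columns:
            problems.append(f"missing variable: {var}")
            continue
        values = firms[var].dropna()
        if spec["min"] is not None and (values < spec["min"]).any():
            problems.append(f"{var}: values below {spec['min']} ({spec['unit']})")
        if spec["max"] is not None and (values > spec["max"]).any():
            problems.append(f"{var}: values above {spec['max']} ({spec['unit']})")
    return problems
```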