Quality of MIP indicators: Assessment of data and metadata


1 Quality of MIP indicators: Assessment of data and metadata
ESTP Course, Luxembourg, 9-11 December 2014. Ivana Jablonska & Julien Bollati, MIP TF

2 Outline
Quality framework of MIP Indicators
Work done so far
Future steps / Discussion

3 Quality framework of MIP Indicators

4 (Draft) MIP Regulation COM(2013) 342 Final
Article 5 The Commission (Eurostat) shall regularly assess the quality of the MIP relevant data (…). The quality assessments shall, as appropriate, make full use of the work carried out, and the results obtained, in the context of existing quality frameworks for MIP relevant data.

5 Quality principles
Eurostat mission: to be the leading provider of high-quality statistics on Europe
European Statistics Code of Practice (28 September 2011)

6 Quality principles Public commitment on European Statistics by the ESCB

7 What for? The big question:
What about the quality of the MIP indicators?
Our preliminary answer: to answer this question we need to look at the inventories/quality reports and run a risk assessment in the framework of a stocktaking exercise, per country and per indicator.

8 What for? The small question:
How safe are the MIP indicators?
Our preliminary answer: run an expert opinion poll among our team in order to produce a ranking, per country and per indicator.

9 Our safeness definitions
Information on sources and methods is clear in inventories/quality reports
Sources cover the necessary basic information
The compilation practices are in line with legal requirements and good/best practices

10 Our safeness definitions
Risk under control:
Information on sources and methods is generally available and mostly clear in inventories/quality reports
Sources cover most of the necessary basic information; estimation methods are only used to compensate for a small part of the basic information
Compilation practices are in line with legal requirements, but most other Member States use different practices

11 Our safeness definitions
Potential risk:
Information on sources and methods is partially available in inventories/quality reports, or fully available but suggesting an incomplete implementation of the methodology
The sources of basic information are incomplete
Compilation practices are not adequate compared to other Member States or not in line with legal requirements

12 Our safeness definitions
Not known:
Information on sources and methods is generally poor
It is not possible to quantify whether there is any risk of significant revisions to the data
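
The four safeness categories above are defined only verbally in the slides. As a minimal sketch of how they could be encoded for the later scoring step, the ordering of the numeric values and the label given to the first (unnamed) category are our own illustrative assumptions, not part of the presentation:

```python
from enum import IntEnum

# Illustrative encoding of the four safeness categories; the numeric ordering and the
# "SAFE" label for the top category are assumptions made for this sketch only.
class Safeness(IntEnum):
    NOT_KNOWN = 0           # metadata generally poor, risk cannot be quantified
    POTENTIAL_RISK = 1      # incomplete metadata or inadequate compilation practices
    RISK_UNDER_CONTROL = 2  # metadata mostly clear, limited use of estimation
    SAFE = 3                # clear metadata, complete sources, compliant practices

# Assessments are made per country and per indicator, e.g. (placeholder values only):
assessment = {("country X", "General Government Debt"): Safeness.RISK_UNDER_CONTROL}
print(assessment[("country X", "General Government Debt")] >= Safeness.RISK_UNDER_CONTROL)  # True
```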

13 Guiding principles
Factual assessment of data and metadata (vs perception)
Standardised approach towards different domains and countries

14 The outcomes
Based on the reading of data, metadata and quality reports
Footnotes and proposed text for the Statistical Annex to the AMR
Check for quality improvements with respect to the previous year
Clustering by MIP headlines according to risk profile

15 Work done so far

16 The questionnaire We have developed a structured template with over 30 questions

17 Topics covered
Institutional environment: Authority responsible; Legal and institutional environment; Sharing of responsibilities
(CoP / PC: Principle 1 Professional independence; Principle 2 Mandate for data collection; Principle 5 Statistical confidentiality; Principle 6 Impartiality and objectivity)
Resources: Adequacy of resources; Cost and burden
(CoP / PC: Principle 3 Adequacy of resources; Principle 10 Cost effectiveness)

18 Topics covered
Quality management: Completeness and timeliness of information provided in the existing inventories and quality reports; Quality control procedures; Clarity; Communication with Eurostat
(CoP / PC: Principle 4 Commitment to quality; Principle 8 Appropriate statistical procedures)

19 Topics covered
Methodological soundness: Reliability of the methodology; Implementation of regulations/guidelines/recommendations; Expiry of derogations; Unexplained breaks in the series
(CoP / PC: Principle 7 Sound methodology)

20 Topics covered
Revisions: Size of routine revisions; revision policies; Information on major revisions
Data analysis: Completeness; Timeliness
(CoP / PC: Principle 12 Accuracy and reliability; Principle 13 Timeliness and punctuality)

21 Topics covered
Internal coherence: Aggregation checks, outlier tests
External coherence: Consistency with similar or related data sets
Other risks: Others
(CoP / PC: Principle 14 Coherence and comparability)
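
The questionnaire behind the topic slides above is a structured template with over 30 questions mapped to Code of Practice / Public Commitment principles. As a rough sketch of how one row of such a template could be represented, the field names, the 0-4 score scale and the example question (taken from the "Methodological soundness" topic) are assumptions for illustration; the actual template is not reproduced in the presentation:

```python
from dataclasses import dataclass
from typing import Optional, Tuple

# One questionnaire item: a question on a given topic, mapped to Code of Practice /
# Public Commitment principles, with a weight and a score assigned during the assessment.
# The 0-4 scale and the default weight are illustrative assumptions.
@dataclass
class Question:
    text: str
    topic: str                       # e.g. "Methodological soundness", "Revisions"
    cop_principles: Tuple[int, ...]  # e.g. (7,) for Principle 7 "Sound methodology"
    weight: float = 1.0
    score: Optional[int] = None      # 0 (worst) .. 4 (best), filled in per country/indicator

q = Question(
    text="Are there unexplained breaks in the series?",
    topic="Methodological soundness",
    cop_principles=(7,),
    weight=2.0,
)
```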

22 Sources used: BoP Indicators
Eurostat database
BoP book 2007 (ECB)
SDDS (IMF)
Quality reports 2013 (Eurostat)
Assessment of the QR 2012 (Eurostat)
Quality Report on BoP of MS to the European Parliament 2011 (Eurostat)
Quality Report on euro area data 2013 (ECB)

23 Sources used: Financial Sector Indicators
Eurostat database
Manual on sources and methods for the compilation of ESA95 financial accounts, 2002 (Eurostat)
Manual on sources and methods for the compilation of ESA95 financial accounts, 2nd edition, 2011 (Eurostat)
Communication from the Commission to the European Parliament, the Council and the Eurogroup (Eurostat)
Other (websites, other documents)

24 Sources used: General Government Debt
Eurostat database
EDP inventories on sources and methods (MS to Eurostat)
EDP mission reports
Other (websites, other available documents)

25 Sources used: Share of world exports
Eurostat database
European Union balance of payments/international investment position statistical methods (ECB)
SDDS - Balance of Payments (IMF)
BoP Quality Reports (Eurostat)

26 Sources used: Nominal Unit Labour Cost Index
Databases (Eurostat, OECD, national institutes)
SDDS metadata (IMF)
Joint OECD/Eurostat questionnaire on NA employment and hours worked
Task Force Report on the Quality of the LFS 2009 (Eurostat)

27 Sources used: House Price Index
Eurostat database
HPI inventories mapped to ESMS (Eurostat)

28 Sources used: Unemployment rate
Eurostat database
Quality reports (Eurostat / MS)

29 Sources used: Real Effective Exchange Rate
"Self-assessment" made by Eurostat

30 Fitness Index
On the basis of the scores assigned to the individual questions, we have compiled a single Fitness-for-Purpose index. It is obtained by aggregating the quantitative scores using the weights assigned to the questions. The final index is normalised and ranges from 0 (maximum risk) to 100 (totally safe).
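
The slide does not give the exact aggregation formula. The sketch below shows one way a weighted, normalised 0-100 index of this kind could be computed; the 0-4 per-question scale, the weights and the example numbers are illustrative assumptions rather than the task force's actual values:

```python
# Sketch of a weighted, normalised fitness-for-purpose index on a 0-100 scale.
# The 0-4 per-question scale, the weights and the example values are assumptions;
# the actual scores and weights used in the exercise are not published here.

def fitness_index(scores, weights, max_score=4):
    """Weighted average of question scores, rescaled so 0 = maximum risk, 100 = totally safe."""
    if len(scores) != len(weights):
        raise ValueError("each question needs exactly one score and one weight")
    weighted_sum = sum(s * w for s, w in zip(scores, weights))
    return 100.0 * weighted_sum / (max_score * sum(weights))

# Example: five questions scored 0-4, with heavier weights on the methodology questions.
print(round(fitness_index([4, 3, 2, 4, 1], [2, 1, 1, 3, 1]), 1))  # 81.2
```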

31 Benchmarking
Our answers were checked by domain managers in Eurostat. More specifically, they were asked to:
Challenge our scores
Complement them with additional information (if any)
Special case for General Government Debt

32 (no transcript text available for this slide)

33 Assessment by Country
The information is already partially available but it has not been fully exploited yet. Example.

34 Evolution of the Safebook exercise
2013: Only internal MIP project; MIP team making the assessment; Only IDR Countries (+ Croatia)
2014: Endorsed by Eurostat management; Benchmarked by domain managers; All Member States; Improved questionnaire

35 Future
Self-assessment by domain (short term)
Assessment by Member States (long term)
Run the exercise for the MIP auxiliary indicators

36 What is your experience?
Brainstorming group work:
How would you imagine doing a similar quality assessment exercise?
Is a quality assessment done for your domain? If yes, how is it done? If not, would it be needed?

