Group Discussions - Summary


1 Group Discussions - Summary
Seminar on ESA 2010 Quality assessment
6 April 2016, Instituto Nacional de Estadística (INE), Madrid, Spain

2 Summary of Group Discussions
Group 1: Proposed quantitative indicators
Group 2: Supporting metadata for quality assessments
Group 3: Complementary in-depth analysis and overall process

3 Group 1: Proposed quantitative indicators
Can National Accounts data quality be expressed with quantitative indicators?
- The group agreed that the quality categories should be measured, but different opinions were expressed on the concrete indicators to be used.
- Perhaps not directly, but the aim is to identify good proxies and to recognise their limitations clearly.

4 Group 1: Proposed quantitative indicators
Main controversy: the number/magnitude of revisions as a proxy for accuracy and reliability.
- It might create the wrong incentive to revise less in order to score well in the quality report.
- The revision indicator is not perfect, but it is acceptable as a proxy; Eurostat will clarify how revisions are measured (an illustrative calculation follows this slide).
- Alternative proposal: no quantitative indicators in the quality reports, only metadata on revision policy and on major revisions.
- An overview of that metadata would be provided in the country quality report, with a high-level overview of revision practice in the Eurostat assessment report.
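As an illustration only, one common way to express the magnitude of revisions as a single figure is the mean absolute revision between two vintages of a series. The sketch below is a hypothetical example, not the measurement Eurostat will specify; the function name and the growth figures are invented.

```python
# A minimal sketch of one common revision indicator: the mean absolute
# revision between a first release and a later vintage of the same series.
# This is NOT the definition Eurostat will specify; it only illustrates
# how "magnitude of revisions" can be turned into a single number.

def mean_absolute_revision(first_release, later_vintage):
    """Average absolute revision between two vintages (same units as the data)."""
    revisions = [abs(later - first)
                 for first, later in zip(first_release, later_vintage)]
    return sum(revisions) / len(revisions)

# Illustrative quarterly GDP growth rates, in percent (invented numbers)
first = [0.4, 0.6, 0.3, 0.5]
later = [0.5, 0.6, 0.2, 0.7]
print(f"Mean absolute revision: {mean_absolute_revision(first, later):.2f} pp")
# -> Mean absolute revision: 0.10 pp
```

A lower value suggests more reliable first estimates, which is exactly why the group worried it could reward revising less rather than revising well.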

5 Group 1: Proposed quantitative indicators
The group largely agreed with the keep/drop proposals.
- Additionally suggested to be dropped: number of subsequent data transmissions → OK.
- The indicator "delivery date of validated data minus legal delivery date" was questioned; Eurostat will clarify (a sketch of the calculation follows this slide).
- Coherence: "coherently wrong versus incoherently right". The current proposal is the best proxy and a good starting point; because it articulates the information about the data that the user would actually see, the group considers the indicator relevant for data compliance.
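For the questioned timeliness indicator, the calculation itself is simple date arithmetic. The sketch below assumes one plausible reading (actual delivery date of validated data minus the legal deadline, in days, positive meaning late); the function name and the dates are illustrative, pending Eurostat's clarification.

```python
# A minimal sketch of the questioned timeliness indicator: delivery date of
# validated data minus the legal delivery date, in days. Positive values
# mean late delivery. The exact definition is still to be clarified by
# Eurostat; the dates below are invented for illustration.
from datetime import date

def punctuality_days(delivery: date, legal_deadline: date) -> int:
    """Days between actual delivery of validated data and the legal deadline."""
    return (delivery - legal_deadline).days

print(punctuality_days(date(2016, 4, 8), date(2016, 3, 31)))  # -> 8 (days late)
```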

6 Group 2: Supporting metadata for quality assessments
General remarks:
- Add the SIMS numbers to the table to show the clear link between the categories and SIMS, and double-check that only categories defined in SIMS 2.0 are used (an illustrative mapping follows this slide).
- Confirmed that the metadata fields (ESMS) are essentially a one-off exercise with an annual review, not an annual reporting exercise.
- The expected level of granularity is unclear: high level (ESA), sub-domain level, or existing inventories. Sub-domain level is preferred, but the impact on implementation needs to be reviewed.
- The implementation timetable is unclear: prepare a roadmap with milestones and a deadline for implementing the metadata. So far only a few Member States (DK, DE and ES) have implemented the metadata standard.
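A minimal sketch of the suggested category-to-SIMS mapping is shown below. The concept numbers follow the commonly cited ESMS/SIMS structure, but they are assumptions here and should be double-checked against the published SIMS 2.0 standard, as the group itself recommended.

```python
# An illustrative lookup from quality-report categories to SIMS concept
# numbers, as suggested for the table. The codes follow the commonly cited
# ESMS/SIMS structure but are assumptions and should be verified against
# the published SIMS 2.0 standard.
SIMS_CONCEPTS = {
    "Confidentiality": "S.7",
    "Accessibility and clarity": "S.10",
    "Accuracy and reliability": "S.13",
    "Timeliness and punctuality": "S.14",
    "Coherence and comparability": "S.15",
    "Cost and burden": "S.16",
    "Data revision": "S.17",
    "Statistical processing": "S.18",
}

def sims_number(category: str) -> str:
    """Return the SIMS concept number, or flag categories not defined in SIMS 2.0."""
    return SIMS_CONCEPTS.get(category, "not defined in SIMS 2.0 - review")

print(sims_number("Accuracy and reliability"))  # -> S.13
print(sims_number("Deviations from ESA"))       # -> not defined in SIMS 2.0 - review
```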

7 Group 2: Supporting metadata for quality assessments
Suggested to be added: confidentiality policy.
- Emphasis on the difference between "real" confidentiality (small number of enterprises) and "wrong" confidentiality (low reliability). In principle, low reliability should be flagged as such, but it is often flagged C (confidential).

8 Group 2: Supporting metadata for quality assessments
Suggested for removal:
- "Deviations between methodology and compilation": all countries apply ESA, and methods are explained in "sources and methods" → OK.
- Number of series breaks: breaks can be seen in the data and have no direct quality aspect → OK; document the reasons for breaks in the metadata.
- Cost and burden: difficult to measure, not comparable across countries, the definition is unclear (what to include or exclude), and there is a risk of double counting; there is also no clear link to quality. Take it out of the quality exercise and use another opportunity for detailed analysis.

9 Group 2: Supporting metadata for quality assessments
To be reviewed: data sources and compilation methods.
- In SIMS these are two separate fields, so they might be split. In the GNI inventories they are in the same chapter, so they might be kept together, but would that violate SIMS?
To be clarified: "changes between periods and series breaks".
- Is this metadata or quality? It might be close to the data itself (i.e. historic data versus current data).

10 Group 2: Supporting metadata for quality assessments
To be clarified: "Statistical processing".
- Source data and data compilation also appear under "Accessibility and clarity" → duplication?
Clarified: "Metadata availability and metadata completeness".
- Understood to describe the quality process itself (this exercise); it can be pre-filled and possibly extended nationally if additional information is available.

11 Group 3: Complementary in-depth analysis and overall process
- Suggested to carry out an annual exercise only (a template of roughly 12 pages) and no in-depth reviews.
- The LFS Quality Report process and model paved the way forward: a report of around 20 pages for Member States.
- The scope and content of the in-depth reviews is not clear.

12 Group 3: Complementary in-depth analysis and overall process
- The reduction of quality indicators was welcomed (the template went from 60+ pages to 45+ pages to around 12 pages).
- The annual in-depth analysis is not clear and overlaps with the annual report proposal; it needs to be reviewed and its content determined, and only then should its inclusion in the annual report be decided.
- In addition to regular annual quality reporting, countries would need to report every year on a different topic; with that extra task it was difficult to accept that the amount of work had really been reduced.

13 Group 3: Complementary in-depth analysis and overall process
- Eurostat needs to apply the same critical review to the in-depth analyses as to the annual report: assess whether they can be incorporated into the annual reporting, which items should be dropped, and whether the separate overall in-depth analysis can be deleted.
- Conclusion: no periodic in-depth analysis; instead, include a reasonable number of indicators in the annual quality report. Eurostat will make a proposal.

