1
The Data Quality Assessment Framework
OECD Meeting of National Accounts Experts, October 2001
2
Purpose of this Presentation
To describe:
- The IMF’s Data Quality Assessment Framework (DQAF), and
- Experience to date with the DQAF for Reports on the Observance of Standards and Codes (ROSCs) and beyond.
3
Plan for Presentation
In describing the work on data quality, I will touch upon:
- Origins of the DQAF
- The DQAF approach
  - Framework: what is it?
  - Process: how was it developed?
  - Draft framework: an overview
  - The DQAF suite of assessment tools
  - The work ahead
- Links to SDDS/GDDS
- Working with the DQAF
To date, this is staff work undertaken within the Statistics Department.
4
Origins of Recent Work
- SDDS and GDDS: broadening the scope of data standards to strengthen the link with data quality
- Provision of data by members to the IMF: a need to be clearer about what is called for
- ROSCs: a need for an even-handed approach to assessing data quality
5
Increased Interest in Data Quality
More widely, interest in quality follows from the explicit use of statistics in policy formulation and goal setting:
- Inflation targeting (spotlight on the CPI)
- The Stability Pact in the context of EMU (spotlight on debt and deficit ratios to GDP)
- UN Conferences on the Least Developed Countries (inclusion in and graduation from the list are based on specified economic indicators)
6
The IMF’s Approach
7
The IMF’s Approach
- The Data Quality Reference Site at the IMF’s Dissemination Standards Bulletin Board
- The site provides an introduction to the topic of data quality and includes a selection of reference materials and articles on data quality issues.
8
The DQAF: What is its Purpose?
Its potential uses:
- To guide data users, complementing the SDDS and GDDS
- To guide IMF staff in assessing data for IMF surveillance and operations, in preparing ROSCs, and in designing technical assistance
- To guide country efforts (self-assessment)
9
The DQAF: Requirements
Given these differing potential uses, the framework should be:
- Comprehensive
- Balanced between experts’ rigor and generalists’ bird’s-eye view
- Applicable across various stages of statistical development
- Applicable to the major macroeconomic datasets
- Designed to give transparent results
- Arrived at by drawing on national statisticians’ best practices
10
The DQAF: What Is It?
[Diagram: a generic framework alongside dataset-specific frameworks, e.g. for government finance statistics (GFS), balance of payments (BOP), national accounts (NA), and other datasets.]
11
How the DQAF Was Developed
- We engaged a national statistical office to help develop the generic framework
- In parallel, IMF staff worked on frameworks for several datasets
- The national accounts framework was reviewed in June 2000
- The revised national accounts framework and four other dataset-specific frameworks were circulated informally in the international statistical community for comment in August-September 2000
12
How the DQAF Was Developed
Drafts were discussed in topical or regional meetings, e.g.:
- East Asian Heads of NSOs
- ECB Working Group on Money and Banking Statistics
- IMF BOP Statistics Committee
- GFS Expert Group meeting
13
How the DQAF Was Developed
- IMF staff tested the frameworks in the field
- A paper for the Statistical Quality Seminar in December 2000 presented:
  - Revised generic framework
  - Revised BOP dataset-specific framework
  - Alternatives for a preview (“lite”) tool
  - Sample summary presentations of results
To access the paper:
14
DQAF: an Overview
The DQAF uses a cascading structure:
- Five dimensions of quality, and for each dimension,
- Elements that can be used in assessing quality, and for each element,
- Indicators that are more concrete and detailed, and for each indicator,
- Focal issues that are tailored to the dataset, and for each focal issue,
- Key points
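A minimal sketch of how this cascade could be represented in code (illustrative Python; the class names, fields, and example entries are assumptions, not part of the DQAF itself):

```python
from dataclasses import dataclass, field
from typing import List

# Illustrative representation of the DQAF cascade:
# dimension -> element -> indicator -> focal issue -> key points.

@dataclass
class FocalIssue:
    description: str
    key_points: List[str] = field(default_factory=list)

@dataclass
class Indicator:
    description: str
    focal_issues: List[FocalIssue] = field(default_factory=list)

@dataclass
class Element:
    name: str
    indicators: List[Indicator] = field(default_factory=list)

@dataclass
class Dimension:
    name: str
    elements: List[Element] = field(default_factory=list)

# Example entry: the serviceability dimension, following the slides below.
serviceability = Dimension(
    name="Serviceability",
    elements=[Element(
        name="Consistency",
        indicators=[Indicator(
            description="Internal consistency",
            focal_issues=[FocalIssue(
                description="Internal consistency of the annual accounts",
                key_points=["Discrepancies between approaches shown?",
                            "Do total supply and use match?"],
            )],
        )],
    )],
)
```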
15
DQAF: an Overview
The five dimensions of the IMF’s Data Quality Assessment Framework:
1. Integrity
2. Methodological soundness
3. Accuracy and reliability
4. Serviceability
5. Accessibility
16
DQAF: an Overview
Also, some elements/indicators are grouped as “prerequisites of quality”:
- Pointers that are relevant to more than one of the five dimensions
- Generally refer to the umbrella agency
- Example: quality awareness
17
Prerequisites for Quality
- Legal and institutional framework
  - Roles and responsibilities of statistical agencies
  - Data sharing and coordination between data-producing agencies
  - Access to administrative and other data for statistical purposes
  - Nature of reporting
- Resources
- Quality awareness
18
Elements of Integrity
- Professionalism
- Transparency
- Ethical standards
19
Elements of Methodological Soundness
- Concepts and definitions
- Scope
- Classifications
- Basis for recording: accounting rules and valuation principles
20
Elements of Accuracy
- Source data
- Statistical techniques: compilation procedures, statistical methods, and adjustments
- Assessment and validation
21
Elements of Serviceability
- Relevance of the national accounts program
- Timeliness and periodicity
- Consistency
- Revision policy and practice
22
Elements of Accessibility
- Data accessibility
- Metadata accessibility: documentation
- Assistance to users: service and support
23
Indicators of Consistency
- Temporal consistency
- Internal consistency
- Intersectoral consistency
24
Focal Issues for Internal Consistency
- Internal consistency of the annual accounts
- Internal consistency between quarterly and annual estimates
25
Key Points: Internal Consistency of the National Accounts
- Discrepancies between approaches shown?
- Size of discrepancies?
- Differences between growth rates?
- Supply and use framework applied?
- Do total supply and use match?
- Does net lending/borrowing match between sectors?
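As an illustration of the kind of checks these key points point to, here is a minimal Python sketch (all figures, thresholds, and sector entries are hypothetical and for demonstration only):

```python
# Hypothetical internal consistency checks for annual national accounts.
# All sample figures below are invented for illustration only.

def relative_discrepancy(a: float, b: float) -> float:
    """Discrepancy between two estimates, relative to their average."""
    return abs(a - b) / ((a + b) / 2)

# GDP estimated by different approaches (illustrative values, billions).
gdp_production = 1002.0
gdp_expenditure = 995.0
gdp_income = 998.0

# Size of the discrepancies between approaches.
print(f"Production vs expenditure: {relative_discrepancy(gdp_production, gdp_expenditure):.2%}")
print(f"Production vs income:      {relative_discrepancy(gdp_production, gdp_income):.2%}")

# Do total supply and use match (within a supply and use framework)?
total_supply = 1850.0  # output plus imports (illustrative)
total_use = 1850.0     # intermediate and final uses plus exports (illustrative)
assert abs(total_supply - total_use) < 1e-6, "Supply and use totals do not match"

# Does net lending/borrowing match between sectors?
# With the rest-of-the-world account included, sectoral balances should sum to zero.
net_lending_by_sector = {"households": 30.0, "corporations": -10.0,
                         "government": -25.0, "rest_of_world": 5.0}
assert abs(sum(net_lending_by_sector.values())) < 1e-6, \
    "Sectoral net lending/borrowing does not balance"
```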
26
General Reactions
- “Welcome initiative”
- “Fills important gap”
- “Is careful and thoughtful”
- “Provides basis for coherent and practical way forward in a complex field”
27
General Reactions: Some other points
- Is the framework really operational for small countries?
- Can it be used without giving a “black mark” for points that are irrelevant to a country?
- Is the framework able to identify “poor” statistics prepared within a developed statistical system?
28
General Reactions: Some other points (cont’d)
- Expand the range of datasets covered
- Coordination with other organizations working on data quality is important
- Continue working in a consultative manner
29
The DQAF Suite of Tools: DQAF “Lite”
- Background: interest in a version that might serve as a diagnostic preview or for a non-statistician’s assessment
- The IMF is field testing a “Lite” made up of 13 indicators.
30
The DQAF Suite of Tools: Summary Presentation of Results
- Background: interest in a presentation of results for, e.g., policy advisors
- The IMF is testing a summary presentation:
  - For each dataset, a one-page table
  - At the two-digit level (21 elements)
  - On a 4-point scale, from “practice observed” to “practice not observed”
  - With an “n.a.” column
  - With a “comments” column
31
Data Quality Assessment Framework: Summary for [dataset]
[Table: one row per element, with a rating column and a comments column.]
Note: O = Practice Observed; LO = Practice Largely Observed; MNO = Practice Materially Nonobserved; NO = Practice Nonobserved; NA = Not Applicable. A comment is entered only if the rating differs from O.
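A minimal sketch of how such a one-page summary could be assembled and printed (illustrative Python; the element names, ratings, and comments are placeholders, not an actual assessment):

```python
# Illustrative rendering of a DQAF summary table for one dataset.
# Ratings use the four-point scale plus NA defined in the note above;
# all element names, ratings, and comments below are placeholders.

RATINGS = {
    "O": "Practice Observed",
    "LO": "Practice Largely Observed",
    "MNO": "Practice Materially Nonobserved",
    "NO": "Practice Nonobserved",
    "NA": "Not Applicable",
}

# (element, rating, comment) -- a comment only where the rating is not O.
assessment = [
    ("Professionalism", "O", ""),
    ("Concepts and definitions", "LO", "Some departures from international guidelines"),
    ("Source data", "MNO", "Source data coverage is incomplete"),
    ("Revision policy and practice", "NA", "No regular revision cycle yet established"),
]

def print_summary(dataset: str, rows) -> None:
    print(f"DQAF Summary for {dataset}")
    print(f"{'Element':<32}{'Rating':<8}Comment")
    for element, rating, comment in rows:
        assert rating in RATINGS, f"Unknown rating: {rating}"
        print(f"{element:<32}{rating:<8}{comment}")

print_summary("[dataset]", assessment)
```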
32
The DQAF Suite of Tools
[Diagram: the suite comprises the generic framework (3-digit level), dataset-specific frameworks (5-digit level), and dataset assessments (6-digit level) for GFS, BOP, NA, and other datasets, together with the DQAF “Lite” and the summary presentation of results.]
33
Work ahead
- Test the suite
  - in a wider range of country situations
  - especially with non-statisticians
- Refine and revise the suite
- Complete supporting materials
  - A glossary
  - Supporting notes for specific datasets
  - A methodology (a how-to-do-it guide)
- Develop frameworks for other datasets
34
Links to the SDDS/GDDS
Summary: the DQAF complements the SDDS/GDDS
- All of the elements of the SDDS/GDDS are also found within the DQAF
35
Links to SDDS/GDDS
The purpose and scope of the SDDS/GDDS and the DQAF differ:
- In the SDDS/GDDS, as dissemination standards, quality is a dimension. That dimension takes an indirect approach to dealing with, e.g., accuracy: it calls for dissemination of relevant information.
- In the DQAF, as an assessment tool, quality is the umbrella concept. That concept covers the collection, processing, and dissemination of data.
36
Links to SDDS/GDDS
The DQAF definition of “quality” has been brought into line with the emerging consensus that quality is a multidimensional concept:
- Some aspects relate to the product
- Some aspects relate to the institution
37
Links to SDDS/GDDS
The DQAF is “more active” in dealing with, e.g., conformity with international guidelines, accuracy, and reliability:
- The SDDS, and to a lesser degree the GDDS, left users on their own to make judgments
- The DQAF guides users in making such judgments by providing two structured dimensions:
  - Methodological soundness
  - Accuracy and reliability
38
Working with the DQAF
The earlier list of potential uses of the DQAF included “to guide IMF staff”:
- Largely, this refers to staff of the IMF Statistics Department
- Interrelated uses: assessing data for the IMF’s use in surveillance and operations, preparing ROSCs, and designing technical assistance
39
Working with the DQAF
We are now using the DQAF in the field:
- In capacity-building advisory missions
- In ROSCs
40
Working with the DQAF
What do we see from these experiences?
Advantages:
- Provides more structure to technical assistance
- Promotes consistency across staff/experts
- Potentially provides input for a useful database
- Places data standards at the center of work on the international financial architecture
Challenges:
- Puts a premium on consistency
- Calls for explicit judgments