PERF project EGPA introduction to the working groups
Dr. Wouter Van Dooren, Assistant Professor, Research Group Public Administration & Management, University of Antwerp, Belgium. EUPAN-HRWG1-IPSG, Bruges.
Contents
- Where we came from, and where we stand
- Reflections on the general remarks
- Purposes of the working groups
- How we selected indicators
1. Where we came from, and where we stand
- PERF = performance of governance
- Public governance is mainly about public administration, but with an increasing awareness of the role of non-state actors
- Public governance is mainly about the machinery of government: an intermediary producer for other sectors that are the final producers (see figure below)
- Performance indicators should give an indication of the extent to which PA ‘works’
- Performance indicators should trigger dialogue for improvement; they will seldom provide a conclusive answer
[Figure: public administration as the machinery of government, an intermediary producer for sectors that are final producers]
Why this focus on governance/public administration, and not on policy sectors?
- Core business of EUPAN
- Added value of indicators in this field
- Significant use of public resources
- Remedy for the weak evidence base for PA policies
Why this focus on indicators and not on data?
- Building a shared understanding, which does not imply total agreement
- Discussions about indicators are discussions about concrete practices
- Learning in order to potentially improve national data collection efforts
- Compatibility with the OECD’s work (or at least agreeing on what to disagree about)
- Draft analytical table by EGPA / Belgian Presidency
- Discussed in Leuven
- Feedback:
  - Sent out to the EGPA network for academic input (13/15 responses)
  - Sent out to EUPAN correspondents for input; responses from 8 partners: Austria, Hungary, Luxembourg, Netherlands, Slovakia, Spain, Sweden, EFQM
- Extended table (base + EUPAN + EGPA + OECD)
- Selection of indicators for discussion in the WGs (see below)
2. Reflections on the general remarks
- Supportive comments
- Notification of national initiatives
- Close cooperation with OECD initiatives
- Macro and/or micro perspective?
  - Macro through aggregation of micro data (e.g. trust, budget deficit)
  - Micro as exemplary for a broad set of processes, e.g. time to fill out taxes as a micro process that represents the macro issue of administrative burdens
  - Composites
- How to deal with political choices and local preferences?
- Common challenges: sustainability
  - National initiatives in Sweden; benchmarks with DK, FI, NO, UK on perceptions/quality
  - Social and Cultural Planning Office in NL: new study, but sectoral
Building block 1: whole of government
- Subjective/objective?
- Are these indicators actionable? What to do, and what is the norm (e.g. trust data)?
- Can we rely on existing measurement efforts (WB, IMD, ...)?
Building block 2: policy capacity
- Is it possible? Is it comparable?
- What is capacity: having, doing, and impact?
- On the subjective indicators: “In the case of the subjective ones, there are no official surveys on the subject, so they are not very reliable.” (ES)
- Internal use (initiatives in some countries); comparative use potentially by looking at increments, change?
Building block 3: transparency and integrity
- National initiatives are under consideration
- Interpretation of volume indicators (e.g. number of breaches) is difficult: “Indicators such as ‘the number of breaches’, ‘the number of appeals’ etc. pose the important question of whether the occurrence of such events should be interpreted as a sign of a large extent of integrity or of a low extent of integrity. For example, if there are many breach/corruption etc. cases – I emphasize: by ‘cases’ one always means ‘known cases’ – this can be interpreted either as (i) corruption is widespread or (ii) anti-corruption measures are effective, so that real cases are revealed.”
Building block 4: staffing
- Fewer comments; the best covered field (see also the OECD/EUPAN HR Learning Team)
- Sweden: model to follow up PM
Building block 5: budgeting
- Right label? Should cover the whole financial cycle
- Relevance of the indicators?
Building block 6: service delivery
- Service/organisational level versus central-level aggregation?
- How to measure quality besides quantity of services?
- ICT, administrative burdens, process management
Building block 7: organisation and modernisation
- What should be the main issues here?
3. Purpose of the working groups
How we propose to organise the work:
- Each WG discusses indicators from different building blocks:
  - WG1: whole-of-government block (1) and modernising block (7)
  - WG2: policy capacity block (2) and integrity block (3)
  - WG3: staffing (4), budgeting (5), and service delivery (6) blocks
- Each participant is asked to score the feasibility and utility of the indicators in their national context on a 5-point scale
- The scoring is anonymous – we are interested in professional judgement
Methodology:
- Scoring of the selected indicators
- Discussion of the indicators
- Possibility to revise the initial scoring and to add indicators
- Scoring and discussions are analysed for reporting in Genval
4. How we selected the indicators
We preferred:
- More precise indicators over general indicators
- More novel indicators over established indicators
- Indicators that are not tied to a specific policy sector
- Indicators that more evidently may tell us what works
- Indicators on PA rather than on politics
- Indicators covering a broad range of facets (though not all)
And, undoubtedly, a certain amount of randomness and prejudice. Participants can propose to discuss indicators that have been left out.