PERF project EGPA introduction to the working groups


PERF project EGPA introduction to the working groups
Dr. Wouter Van Dooren, Assistant Professor, Research Group Public Administration & Management, University of Antwerp, Belgium
EUPAN-HRWG1-IPSG1, 25-11-2010, Bruges

Contents
1. Where we came from, and where we stand
2. Reflections on the general remarks
3. Purpose of the working groups
4. How we selected the indicators

1. Where we came from, and where we stand
PERF = performance of governance
- Public governance is mainly about public administration, but with an increasing awareness of the role of non-state actors
- Public governance is mainly about the machinery of government: an intermediary producer for other sectors that are the final producers (see figure on next slide)
- Performance indicators should give an indication of the extent to which PA 'works'
- Performance indicators should trigger dialogue for improvement; they will seldom provide a conclusive answer

Where we came from, and where we stand
[Figure: the machinery of government as an intermediary producer for final-producer sectors]

Where we came from, and where we stand
Why this focus on governance/public administration, and not on policy sectors?
- Core business of EUPAN
- Added value of indicators in this field
- Significant use of public resources
- Remedies the weak evidence base for PA policies
Why this focus on indicators and not data?
- Builds a shared understanding, which does not imply total agreement
- Discussions about indicators are discussions about concrete practices
- Learning in order to potentially improve national data collection efforts
- Compatibility with the OECD's Government at a Glance (or at least agreeing on what we disagree about)

Where we came from, and where we stand
- Draft analytical table by EGPA / Belgian Presidency, discussed in Leuven
- Feedback: sent to the EGPA network for academic input (13/15 responses) and to EUPAN correspondents for input
- Responses from 8 partners: Austria, Hungary, Luxembourg, Netherlands, Slovakia, Spain, Sweden, EFQM
- Extended table (base + EUPAN + EGPA + OECD Government at a Glance)
- Selection of indicators for discussion in the WGs (see below)

2. Reflections on the general remarks
- Supportive comments
- Notification of national initiatives
- Close cooperation with OECD initiatives
- Macro and/or micro perspective?
  - Macro through aggregation of micro data (e.g. trust, budget deficit)
  - Micro as exemplary for a broad set of processes, e.g. time to fill out taxes as a micro process that represents the macro issue of administrative burdens
  - Composites
- How to deal with political choices and local preferences?
- Common challenges: sustainability
Notes: national initiatives in Sweden (benchmark of perceptions/quality with DK, FI, NO, UK); Social and Cultural Planning Office in NL, new study, but sectoral
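The macro-through-aggregation idea above can be made concrete. The sketch below is illustrative only and not from the presentation: it shows one common way to build a composite "macro" indicator from micro data, using min-max normalisation followed by a weighted average. All country values, indicator choices, and weights are invented placeholders.

```python
# Illustrative sketch (assumption: equal weights, min-max normalisation);
# not a method proposed in the slides.

def minmax(values):
    """Rescale a list of raw scores to the 0-1 range."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) for v in values]

def composite(columns, weights):
    """Weighted average of already-normalised indicator columns."""
    n = len(columns[0])
    return [sum(w * col[i] for w, col in zip(weights, columns)) for i in range(n)]

# Three hypothetical countries, two micro indicators:
# hours to file taxes (lower is better, so invert) and a trust survey (%).
tax_hours = [10, 25, 40]
trust_pct = [60, 45, 30]
inverted_tax = [1 - x for x in minmax(tax_hours)]
scores = composite([inverted_tax, minmax(trust_pct)], [0.5, 0.5])
# scores -> [1.0, 0.5, 0.0]
```

The choice of weights is exactly the kind of political choice and local preference the slide flags: a composite always embeds a judgement about what matters most.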

2. Reflections on the general remarks
Building block 1: whole of government
- Subjective/objective?
- Are these indicators actionable? What to do, and what is the norm (e.g. trust data)?
- Can we rely on existing measurement efforts (WB, IMD, ...)?
Building block 2: policy capacity
- Is it possible? Is it comparable?
- What is capacity: having, doing, and impact?
Notes: for the subjective indicators there are no official surveys on the subject, so they are not very reliable (ES); internal use (initiatives in some countries); comparative use potentially by looking at increments/change

2. Reflections on the general remarks
Building block 3: transparency and integrity
- National initiatives are under consideration
- Interpretation of volume indicators is difficult
Building block 4: staffing
- Fewer comments; the best covered field (see also the OECD/EUPAN HR Learning Team)
- Sweden: a model to follow up PM volume indicators
Note on volume indicators such as the number of breaches: "Indicators such as 'the number of breaches', 'the number of appeals' etc. pose the important question of whether the occurrence of such events should be interpreted as a sign of a large extent of integrity or of a low extent of integrity. For example, if there are many breach/corruption etc. cases (by 'cases' one always means 'known cases'), this can be interpreted either as (i) corruption is widespread or (ii) anti-corruption measures are effective, so that real cases are revealed."

2. Reflections on the general remarks
Building block 5: budgeting
- Right label? Should cover the whole financial cycle
- Relevance of the indicators?
Building block 6: service delivery
- Service/organisational level versus central level: aggregation?
- How to measure quality besides quantity of services? (ICT, administrative burdens, process management)
Building block 7: organisation and modernisation
- What should be the main issues here?

3. Purpose of the working groups
How we propose to organise the work:
- Each WG discusses indicators from different building blocks:
  - WG1: whole of government (block 1) and modernising (block 7)
  - WG2: policy capacity (block 2) and integrity (block 3)
  - WG3: staffing (block 4), budgeting (block 5) and service delivery (block 6)
- Each participant is asked to score the feasibility and utility of the indicators in their national context on a 5-point scale
- The scoring is anonymous; we are interested in professional judgement

3. Purpose of the working groups
Methodology:
- Scoring of the selected indicators
- Discussion on the indicators
- Possibility to revise the initial scoring and to add indicators
- Scoring and discussions are analysed for reporting in Genval
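The anonymous 5-point scores collected in the workshops could be summarised along these lines. This is a hypothetical sketch, not the project's actual analysis: the indicator names and ratings are invented, and the mean/spread summary is just one plausible way to prepare the Genval reporting.

```python
# Illustrative sketch (assumed data shapes; not the project's real pipeline).
from statistics import mean, stdev

# indicator -> one (feasibility, utility) pair per anonymous participant
ratings = {
    "time to fill out taxes": [(4, 5), (3, 4), (5, 4)],
    "number of breaches":     [(2, 3), (1, 2), (3, 3)],
}

def summarise(pairs):
    """Mean scores plus spread; high spread flags indicators worth discussing."""
    feas = [f for f, _ in pairs]
    util = [u for _, u in pairs]
    return {
        "feasibility_mean": round(mean(feas), 2),
        "utility_mean": round(mean(util), 2),
        "feasibility_spread": round(stdev(feas), 2),
    }

report = {name: summarise(p) for name, p in ratings.items()}
```

Because participants may revise their initial scores after discussion, a run of this summary before and after the discussion would also show how far the conversation moved the group.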

4. How we selected the indicators
We preferred ...
- More precise indicators over general indicators
- More novel indicators over established indicators
- Indicators that are not tied to a specific policy sector
- Indicators that more evidently may tell us what works
- Indicators that are about PA, rather than politics
- Indicators covering a broad range of facets (though not all)
- And, undoubtedly, a certain amount of randomness and prejudice
Participants can propose to discuss indicators that have been left out.