1
Budgetary Processes and Public Expenditure Management Core Course
Program Evaluation and Outcome Measures
May 24, 2000
David Shand, World Bank
2
An Introductory Schema
The public sector production function: Inputs → Activities → Outputs → Outcomes
The corresponding measures: Costs → Process measures → Efficiency → Effectiveness (costs attach to inputs, process measures track activities, efficiency relates outputs to inputs, and effectiveness relates outcomes to objectives)
We need to examine both methodological and performance management issues. A small worked sketch of the measures follows below.
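A minimal sketch, in Python, of how these measures might be computed. The program, figures, and variable names are hypothetical assumptions for illustration, not material from the presentation:

    # Hypothetical immunization program: illustrative figures only.
    inputs_cost = 500_000      # budget spent (inputs, measured as costs)
    activities = 1_200         # clinic sessions held (process measure)
    outputs = 60_000           # inoculations delivered (outputs)
    outcomes = 54_000          # children effectively immunized (outcomes)

    # Efficiency relates outputs to the inputs that produced them.
    cost_per_output = inputs_cost / outputs

    # Effectiveness relates outcomes to outputs (or to the stated objective).
    effectiveness = outcomes / outputs

    print(f"Cost per inoculation: {cost_per_output:.2f}")
    print(f"Effectiveness rate:   {effectiveness:.1%}")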
3
A Three-Part (?) Inter-related Approach
- Outcome indicators
- Project analysis (cost-benefit studies of capital projects); a discounted cost-benefit sketch follows below
- Program evaluation (benefits may be expressed in dollar terms)
- How are they inter-related? Indicators suggest a result, but not the reasons; program evaluation attempts to establish cause-and-effect relationships.
- A possible fourth element: customer satisfaction measures/surveys.
- A possible fifth element: process measures.
- Relationship with "social indicators": these may indicate broad societal needs and guide spending priorities.
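As a hedged illustration of the project-analysis element, here is a sketch of the discounted cost-benefit arithmetic for a capital project. The cash flows and discount rate are hypothetical assumptions, not figures from the course:

    # Year-by-year (benefit, cost) pairs for a hypothetical road project,
    # starting in year 0; the discount rate is assumed at 10%.
    discount_rate = 0.10
    flows = [(0, 1_000_000), (300_000, 50_000), (400_000, 50_000),
             (400_000, 50_000), (400_000, 50_000)]

    npv = sum((b - c) / (1 + discount_rate) ** t for t, (b, c) in enumerate(flows))
    pv_benefits = sum(b / (1 + discount_rate) ** t for t, (b, _) in enumerate(flows))
    pv_costs = sum(c / (1 + discount_rate) ** t for t, (_, c) in enumerate(flows))

    print(f"Net present value:  {npv:,.0f}")
    print(f"Benefit-cost ratio: {pv_benefits / pv_costs:.2f}")

A positive net present value (equivalently, a benefit-cost ratio above 1) indicates that the project's discounted benefits exceed its discounted costs.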
4
Why Outcome Indicators May be Problematical
- Difficulty in defining objectives: they must be measurable, not general statements.
- How to determine cause-and-effect relationships: what has caused the outcome?
- Outcomes may not emerge for some years.
5
Performance Measures - General Comments
- Full diagnosis of performance requires an understanding of the public sector production function.
- Allow adequate time and recognize complexity (easy to do badly; there are inherent limitations).
- Ensure comprehensiveness, but don't over-complicate.
- Understand the inter-relationships between the different measures.
6
Performance Measures - General Comments
- Measure the right thing: how to legitimize performance measures (planning processes, consultation with stakeholders).
- In principle this applies to all public sector activities, but it is difficult in some areas (e.g. research, policy advice, foreign affairs).
- Setting targets: at what level? Attainable, but only with effort?
7
Performance Measures - General Comments
- Reviewing performance: what is the basis of comparison?
  - previous performance
  - targets (what is reasonably achievable, with effort)
  - comparable organizations (benchmarking, including international comparisons; see the comparison sketch below)
- Using the information:
  - for performance improvement
  - for accountability (including contracting)
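A small sketch of such a comparison against previous performance, the target, and comparable organizations. The indicator, agencies, and figures are hypothetical assumptions:

    indicator = "immunization coverage (%)"
    current, previous, target = 86.0, 82.5, 90.0
    comparators = {"Agency B": 84.0, "Agency C": 91.5, "International median": 88.0}

    print(f"{indicator}: {current} (previous {previous}, target {target})")
    print(f"Change on previous period: {current - previous:+.1f} points")
    print(f"Gap to target:             {current - target:+.1f} points")
    for name, value in comparators.items():
        print(f"  vs {name}: {current - value:+.1f} points")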
8
Some Illustrations Health
- Social indicators: life expectancy, infant mortality.
- Outcome indicators: successful cure rates? (calculated in the sketch below)
- Outputs: number of hospital beds, number of inoculations.
- What are the alternatives or choices within health programs?
- Issues that might be addressed by a program evaluation, e.g. road safety.
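A sketch of how the health output and outcome measures might be calculated, with hypothetical figures only:

    live_births = 20_000
    infant_deaths = 560            # feeds the infant mortality social indicator
    patients_treated = 8_000       # output of a treatment program
    patients_cured = 6_400         # outcome

    infant_mortality = infant_deaths / live_births * 1_000   # per 1,000 live births
    cure_rate = patients_cured / patients_treated             # outcome indicator

    print(f"Infant mortality: {infant_mortality:.1f} per 1,000 live births")
    print(f"Cure rate:        {cure_rate:.1%}")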
9
Some Illustrations (cont’d)
Education
- Social indicators: literacy rates.
- Outcome indicators: educational attainment, exam results? (see the sketch below)
- Outputs: level of enrolments, pupil-teacher ratios.
- Parent/student/employer satisfaction surveys.
- What are the alternatives or choices within education programs?
- Issues that might be addressed by a program evaluation.
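A parallel sketch for the education measures, again with purely hypothetical figures:

    school_age_children = 150_000
    pupils_enrolled = 126_000
    teachers = 3_600
    exam_candidates, exam_passes = 21_000, 15_750

    net_enrolment_rate = pupils_enrolled / school_age_children   # output coverage
    pupil_teacher_ratio = pupils_enrolled / teachers              # output measure
    pass_rate = exam_passes / exam_candidates                     # a possible outcome indicator

    print(f"Net enrolment rate:  {net_enrolment_rate:.1%}")
    print(f"Pupil-teacher ratio: {pupil_teacher_ratio:.1f}")
    print(f"Exam pass rate:      {pass_rate:.1%}")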
10
Some Illustrations (cont’d)
Police
- Social indicators: crime rates.
- Outcome indicators? Prevention of crime, or apprehension? Crime clearance rates, response times, citizen surveys? (see the sketch below)
Highway construction
- A capital project example.
Road safety
- Consideration of alternative programs, which may cut across organizational boundaries.
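A sketch of two of the police measures mentioned above, crime clearance and response times, using hypothetical figures:

    crimes_recorded = 12_000
    crimes_cleared = 4_800             # e.g. resulting in a charge or other resolution
    emergency_calls = 9_000
    responses_within_target = 7_400    # answered within the target response time

    clearance_rate = crimes_cleared / crimes_recorded
    on_time_response_share = responses_within_target / emergency_calls

    print(f"Crime clearance rate:         {clearance_rate:.1%}")
    print(f"Responses within target time: {on_time_response_share:.1%}")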
11
Some General Comments on Program Evaluation
- Program evaluation is more an art than a science and may have significant subjective elements.
- It is therefore an input to decision-making, rather than making the decision itself.
- We need to be able to explain why a result is as it is: what is the connection between program outputs and outcomes?
- It may be less practicable, and therefore have less influence, in some expenditure areas than others, e.g. foreign affairs.
12
Some General Comments on Program Evaluation
- How should we allocate scarce evaluation resources? We cannot evaluate everything at the same time.
- Ex ante or ex post?
- Remember that every program has an alternative, e.g. no program, or a different scale or level of funding.
- Defining target groups/intended beneficiaries is often an important aspect of determining objectives.
13
Some General Comments on Program Evaluation
- Evaluation must be demand driven, not just supply driven.
- Is the necessary information available, or does it have to be created?
- Decisions must generally be made with imperfect information (avoid "paralysis by analysis").
- Don't overlook the input (cost) side.
- Few programs are clearly totally ineffective; most have their supporters/clients.