Seminar on Performance Budgeting and Fiscal Transparency, Tangier, Morocco, April 21-23, 2009 Session 8. Monitoring and Evaluation: Challenges and Issues.


1 Seminar on Performance Budgeting and Fiscal Transparency, Tangier, Morocco, April 21-23, 2009
Session 8. Monitoring and Evaluation: Challenges and Issues
Nowook Park (npark@kipf.re.kr)
Center for Performance Evaluation and Management, Korea Institute of Public Finance

2 Contents
1. Impact of performance budgeting in Korea
2. Roles of the MOF, line ministries, parliament, and audit office
3. Performance Information: Indicators and Evaluations
4. How to motivate performance? How to engage politicians?

3 1. Impact of Performance Budgeting in Korea

4 Observations from the Program Review Process
 Evaluation results
- The quality of performance information has not improved much
- Programs are showing better results
 Link between evaluation results and the budget
- Evaluation results are utilized at every stage of the budget process
 Moving away from incremental budgeting
- Evaluated programs are subject to bigger budget changes than other programs

5 Evaluation Results by Total Score

6 Evaluation Results by Ratings

| Year |        | Total   | Effective | Moderately Effective | Adequate | Ineffective |
| 2005 | Number | 555     | 28        | 100                  | 340      | 87          |
|      | (%)    | (100.0) | (5.0)     | (18.0)               | (61.3)   | (15.7)      |
| 2006 | Number | 577     | 30        | 94                   | 388      | 65          |
|      | (%)    | (100.0) | (5.2)     | (16.3)               | (67.2)   | (11.3)      |
| 2007 | Number | 584     | 66        | 139                  | 348      | 31          |
|      | (%)    | (100.0) | (11.3)    | (23.8)               | (59.6)   | (5.3)       |
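The "(%)" rows in the ratings table above are simply each rating's share of all evaluated programs. A minimal sketch of that arithmetic, using the 2005 counts from the slide:

```python
# 2005 counts from the ratings table above.
counts_2005 = {
    "Effective": 28,
    "Moderately Effective": 100,
    "Adequate": 340,
    "Ineffective": 87,
}

total = sum(counts_2005.values())  # 555, as in the Total column
# Each rating's share of all evaluated programs, rounded to one decimal.
shares = {rating: round(n / total * 100, 1)
          for rating, n in counts_2005.items()}
```

This reproduces the slide's percentages: 5.0, 18.0, 61.3, and 15.7.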

7 Evaluation Results by Section

| Year | Total (100) | Planning: subtotal (30) | Planning: Design (15) | Planning: Performance Planning (15) | Management (20) | Results (50) |
| 2005 | 60.1        | 23.1                    | 13.8                  | 9.3                                 | 15.1            | 21.9         |
| 2006 | 59.9        | 22.9                    | 14.3                  | 8.6                                 | 14.7            | 22.3         |
| 2007 | 66.0        | 23.4                    | 14.2                  | 9.2                                 | 15.5            | 27.1         |

8 Utilization of Evaluation Results
 MOSF encouraged ministries/agencies to use the results in reshuffling budget allocations
 MOSF announced that, in principle, a budget cut of at least 10% would apply to "ineffective" programs
 MOSF submitted evaluation results to the National Assembly upon its request
 Evaluation results have been open to the public since 2006

9 Use of Performance Information by Agencies (2005)

10 Use of Performance Information by MOSF (2005)

11 Use of Performance Information by Legislature (2005)

12 Link between Evaluation and Budgeting (2005) (Unit: 100,000 won, %)

| Rating         | '05 Budget (A) | '06 Budget: Agency (B) | '06 Budget: MPB (C) | '06 Budget: Final (D) | B-A    | C-A     | D-A    | Ratio_1 | Ratio_2 | Ratio_3 |
| Effective      | 15,600         | 24,948                 | 18,955              | 22,489                | 9,348  | 3,355   | 6,889  | 0.52    | 0.38    | 0.50    |
| Mod. Effective | 92,994         | 104,335                | 107,055             | 105,762               | 11,342 | 14,062  | 12,768 | 0.32    | 0.33    | 0.28    |
| Adequate       | 208,066        | 204,473                | 195,625             | 201,214               | -3,593 | -12,441 | -6,852 | 0.10    | 0.05    |         |
| Ineffective    | 33,081         | 28,644                 | 29,007              | 28,505                | -4,437 | -4,074  | -4,576 | -0.15   | -0.25   | -0.19   |
| Total          | 349,740        | 362,400                | 350,642             | 357,970               |        |         |        |         |         |         |

13 Link between Evaluation and Budgeting (2006) (Unit: 100,000 won, %)

| Rating         | '06 Budget (A) | '07 Budget: Agency (B) | '07 Budget: MPB (C) | '07 Budget: Final (D) | B-A    | C-A    | D-A    | Ratio_1 | Ratio_2 | Ratio_3 |
| Effective      | 8,891          | 9,467                  | 9,337               | 8,872                 | 575    | 446    | -19    | 0.11    | 0.10    | 0.06    |
| Mod. Effective | 33,156         | 35,701                 | 35,364              | 35,654                | 2,545  | 2,208  | 2,498  | 0.12    | 0.08    | 0.09    |
| Adequate       | 297,180        | 296,769                | 290,481             | 289,969               | -411   | -6,699 | -7,211 | 0.10    | 0.04    | 0.03    |
| Ineffective    | 11,431         | 6,039                  | 5,400               | 5,380                 | -5,392 | -6,031 | -6,051 | -0.15   | -0.24   | -0.25   |
| Total          | 350,658        | 347,975                | 340,582             | 339,875               |        |        |        |         |         |         |

14 Link between Evaluation and Budgeting (2007) (Unit: 100,000 won, %)

| Rating         | '07 Budget (A) | '08 Budget: Agency (B) | '08 Budget: MPB (C) | B-A    | C-A    | Ratio_1 | Ratio_2 |
| Effective      | 17,112         | 18,211                 | 17,503              | 1,099  | 391    | 0.18    | 0.12    |
| Mod. Effective | 266,051        | 295,121                | 291,319             | 29,070 | 25,268 | 0.20    | 0.13    |
| Adequate       | 146,034        | 153,026                | 142,044             | 6,992  | -3,990 | 0.28    | 0.15    |
| Ineffective    | 3,870          | 3,457                  | 3,066               | -414   | -805   | -0.001  | -0.15   |
| Total          | 433,067        | 469,815                | 453,932             |        |        |         |         |
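The Difference columns in these tables are plain subtractions from the pre-evaluation budget (the slides do not state how the Ratio columns are derived, so only the differences are reproduced here). A quick sketch using two rows of the 2007 table:

```python
# Two rows from the 2007 table: the '07 budget (A) and the '08 budget as
# requested by the agency (B) and allocated by MPB (C); unit: 100,000 won.
rows = {
    "Effective":      {"A": 17_112,  "B": 18_211,  "C": 17_503},
    "Mod. Effective": {"A": 266_051, "B": 295_121, "C": 291_319},
}
# Difference columns: B-A (agency request vs. last year's budget)
# and C-A (MPB allocation vs. last year's budget).
diffs = {rating: (v["B"] - v["A"], v["C"] - v["A"])
         for rating, v in rows.items()}
```

This reproduces the slide's B-A and C-A values: 1,099 and 391 for "Effective", 29,070 and 25,268 for "Mod. Effective".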

15 Moving Away from Incremental Budgeting
Programs have been subject to larger budget changes after evaluation.

Coefficient of variation in funding change, by evaluation-year cohort (excluding programs whose funding change is greater than 200%):

| Eval. year | (B04-B03)/B03 | (B05-B04)/B04 | (B06-B05)/B05 | (B07-B06)/B06 | (B08-B07)/B07 |
| 2005       | 3.1           | 2.7           | 9.2           |               |               |
| 2006       |               |               | 2.7           | -14.3         |               |
| 2007       |               |               | 2.5           | 3.1           | 3.9           |
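The slide does not spell out the computation, but the coefficient of variation is conventionally the standard deviation divided by the mean. A sketch of how such a CV over program-level funding-change rates might be computed, with the 200% exclusion applied; the sample rates are illustrative, not from the Korean data:

```python
from statistics import mean, stdev

def funding_change_cv(rates, cap=2.0):
    """CV of year-over-year funding-change rates across programs,
    dropping programs whose funding changed by more than `cap` (200%),
    as the slide's exclusion rule does."""
    kept = [r for r in rates if abs(r) <= cap]
    return stdev(kept) / mean(kept)

# Illustrative rates; the 350% change is excluded by the cap.
cv = funding_change_cv([0.05, 0.20, -0.10, 0.50, 3.5])
```

A high CV means funding changes are widely dispersed relative to their average, i.e., budgets are not moving in uniform increments.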

16 Cultural Changes among Line Ministries
 Budget allocation based on performance is spreading among line ministries
- Utilization of SABP results in budget requests
 Changing practices of program management
- Evaluation activities are becoming more active among line ministries
- The need for program evaluation is recognized and accepted among line ministries

17 2. Roles of the MOF, line ministries, parliament, and audit office

18 Ministry of Strategy and Finance (MOSF)
 Evaluating line ministries' programs
- Issuing guidelines on performance-based budgeting to line ministries/agencies
- Evaluating the performance of their programs
 Using performance information in budget formulation
- Encouraging line ministries/agencies to use performance evaluation results in preparing their budget requests
- Incorporating performance information into its decisions during budget formulation

19 Line ministries/agencies
 Producing performance information in compliance with the central budget authority's initiative
- Submitting strategic plans, annual performance plans, and performance reports
- Evaluating their programs based on the SABP checklist
- Conducting program evaluations
 Using performance information in budget requests

20 The National Assembly
 Before the enactment of the National Finance Act, the National Assembly had no official role
- It had requested evaluation results for budget deliberation on an ad hoc basis
 The National Assembly receives the annual performance plan (from 2008) and report (from 2009) along with budget documents
 The National Assembly Budget Office intends to analyze the budget in connection with performance information

21 The National Audit Office
 Before the enactment of the National Finance Act, the National Audit Office had no official role
- As an independent audit office within the Administration, it had not played any official role
- However, it examined the operation of the performance budgeting system and produced a report
 The National Audit Office has been assigned the role of verifying annual performance reports from 2009

22 3. Performance Information : Indicators and Evaluations

23 Performance Indicators: Status
 A slight increase in outcome measures, but there is much room for further improvement
- Outcome measures increased by 3.2 percentage points, while output measures decreased by 4.1 percentage points
 More than 40% of programs do not have relevant performance indicators in SABP

|         | Total | Input     | Process   | Output     | Outcome      |
| FY 2007 | 2,016 | 73 (3.6)  | 140 (7.0) | 837 (41.5) | 966 (47.9)   |
| FY 2008 | 2,037 | 111 (5.4) | 125 (6.1) | 761 (37.4) | 1,040 (51.1) |
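The percentage-point claims above can be checked directly against the "(%)" values in the table; a small sketch:

```python
# Shares of indicator types (%), taken from the table above.
shares = {
    "Output":  {"FY 2007": 41.5, "FY 2008": 37.4},
    "Outcome": {"FY 2007": 47.9, "FY 2008": 51.1},
}
# Percentage-point change between the two fiscal years.
pp_change = {kind: round(v["FY 2008"] - v["FY 2007"], 1)
             for kind, v in shares.items()}
```

The result matches the bullet: outcome measures are up 3.2 percentage points, output measures down 4.1.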

24 Performance Indicators: Examples (the Good, the Bad, and the Ugly)
 Good: Support for IT businesses going abroad
- Customer satisfaction
- Increase in the amount of exports ($)
 Bad: Student Loans Program
- The amount of loans; the number of loans
- Recommendation: use an outcome measure, e.g., enrollment rates ("The postsecondary enrollment gap between low- and high-income high school graduates will decrease each year.")
 Ugly: Percentage of attainment
- Rate of planned goal achievement for employment (x); rate of employment (o)
- Recommendation: use the 'actual value of attainment' instead of the 'percentage of achievement' (FY 2008 KPART manual)

25 Performance Indicators: Problems
 Difficulty in developing outcome measures for some programs
 Some programs produce results only after a long period
- Using milestone indicators may help
- However, using a different evaluation cycle needs to be considered
 Successful management of a program may not change its outcome measures
- Reduce the scope of outcome measures
- Such a program may not be worth evaluating
 Data do not exist
- Investment in data is necessary
- Compare the costs and benefits of producing new data

26 Performance Indicators: Lessons
 It is desirable to develop outcome measures, but it is not possible for every program
 Be aware of the exceptions and apply the requirement flexibly according to program type

27 Program Evaluations: Status
 Self-Assessment of Budgetary Programs (SABP)
- Review based on the checklist
- The checklist includes a question on whether a program evaluation has been conducted
- This encourages line ministries to conduct a program evaluation at least once every three years
 In-depth Program Evaluation
- In-depth evaluation of selected programs by the central budget authority

28 Program Evaluations: Status (KPART)
Q. 3-1. Did you conduct a comprehensive program evaluation objectively?
- Has your program been evaluated in depth using verifiable data?
- Has your program evaluation been conducted by an independent organization? (e.g., external institution, audit office, internal evaluation expert)
- Does the evaluation cover the important issues of the program?
Note: Answer 'yes' if an evaluation is in progress or was conducted within the last three years.

29 Program Evaluations: Problems
 Little experience with program evaluation among program managers
 Lack of data prohibits meaningful program evaluation
 Lack of funds to conduct program evaluations
 Difficulty in maintaining the independence of evaluators
- Trade-off between independence and expertise in a society where social ties are strong and dense

30 Example: A Bad Program Evaluation
Program: Promotion of intellectual property (IP)
Purpose: Measuring the impact of program performance; realigning the way of investment; monitoring the performance management system
Issue: Does the number of IP centers affect the number of industrial property registrations?
Methodology: Estimation based on a fixed-effects model
Obstacles:
- Insufficient data: the data needed to conduct the evaluation were absent, so customer satisfaction (or the number of counseling sessions) was used instead of the number of intellectual properties in each locality
- Lack of understanding of program evaluation among program managers
Result: Could not find evidence for the regional impact of the IP centers

31 Example: A Good Program Evaluation
Program: Rural development program
Purpose: Evaluating the impact of the program; monitoring the program in progress
Issue: Does the amount of funding improve conditions in undeveloped areas?
Methodology: Data analysis and a survey
Result on effectiveness: Found evidence for the regional impact
- There was a positive relation between the program's spending and infrastructure
- In addition, the infrastructure created value-added businesses
Result on management: Found problems in the management system
- The management system is underdeveloped and each manager's autonomy is limited

32 Program Evaluations: Lessons
 Plan ahead for data collection
- Data collection should go hand in hand with program management
- Data management is a part of program management
 Be creative in securing funds for evaluation
- Evaluation should be part of program management
- In budget negotiations, an evaluation plan should be presented

33 4. How to motivate performance? How to engage politicians?

34 Institutionalized Incentives for Line Ministries
 How performance information will be used in budget preparation is stipulated in the guidelines for budget preparation
- In principle, a 10% budget cut for ineffective programs is encouraged for line ministries
- This draws attention from program managers but not from high-ranking decision makers
 Results from SABP are used as part of the evaluation of the whole department
- Now high-ranking decision makers pay attention
- But their attention is focused more on getting good scores than on improving program performance

35 The Form of Information Matters
 Performance information at the monitoring level (performance indicators) has not been able to draw attention from decision makers
- It is internally useful, but not utilized by the central budget authority
- It remains to be seen how it will be utilized by the involved parties
- Performance indicators are a good starting point for communication rather than for decision making
 Performance information from program reviews (program ratings) is actively used by the central budget authority
- It summarizes various information into simple ratings
- The power of simplicity!

36 Politicians' Engagement
 Although there is no formal mechanism for politicians' engagement, politicians show interest in performance information
- They request performance information on an ad hoc basis
- There has been little suspicion about the Administration's evaluation results so far
 It remains to be seen whether politicians will use performance information more systematically with the submission of the annual performance plan and report to the National Assembly
 Publicizing performance information may help to force politicians to pay attention to it


