The political side of social program evaluation

Presentation transcript:

The political side of social program evaluation
Ministry of Social Development, MEXICO
Gonzalo Hernández Licona

Objective
Analyse the institutional challenges facing Mexico, and specifically SEDESOL, in constructing a Monitoring and Evaluation (M&E) system.
- How can we institutionalize an M&E system? The political constraints and challenges.
- How can we go beyond the impact evaluation of Oportunidades?
  - More programs evaluated on a regular basis
  - Constructing a results-based management system
  - Inserting M&E within the social policy process

Outline
- Analytical framework
  - What type of M&E system are we aiming for?
  - The need to create and design institutions in order to build an M&E system
- What have we done and what's next?
- Conclusions

Social Policy
- Identification of social problems and objectives
- Analysis
- Program design
- Program operation and resources
- Budget
- Monitoring and Evaluation

Evaluation: why and how
- Helps redesign and improve programs
- Supports efficient use of public funds
- Adds objective and technical elements to the social policy debate
- Promotes transparency and social accountability
- Emphasis on results: we need to measure indicators, but also consider qualitative evaluations
- (Long-run) impact evaluation and (frequent) monitoring of everyday operation
- External (and good) evaluators
- Objective and useful evaluations: seek the cooperation and participation of the program's stakeholders

Decision making: Firm vs. Social Program
- Process / outcome of interest. Firm: profits. Social program: well-being (poverty, health, infrastructure, income, nutrition, perception, education, social capital, satisfaction).
- I. Identifying benefits. Firm: in general precise. Social program: many dimensions of well-being.
- II. Measuring impact. Firm: income minus costs (I-C), IRR. Social program: what would have happened without the program? The counterfactual (formalized below).
- III. Information. Firm: lots of information, and the firm pays for it. Social program: who pays for it?
- IV. Who should evaluate? Firm: the firm itself, external auditors. Social program: the program, external evaluators.
- V. Monitoring. Firm: in order to know the whole process. Social program: it usually does not work; looking for results.
- VI. Who demands evaluations? Firm: owners and shareholders. Social program: not clear; public resources.
- VII. What do we do with the results? Firm: efficient use of information. Social program: results, what for?
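To make the counterfactual in row II concrete (standard evaluation notation, not taken from the slide), the impact of a program on its participants can be written as

\[ \text{Impact on participants} = E[\,Y_1 \mid T = 1\,] - E[\,Y_0 \mid T = 1\,], \]

where Y_1 is the outcome with the program, Y_0 the outcome without it, and T = 1 marks participants. The second term is the unobservable counterfactual ("what would have happened without the program") that every evaluation design tries to approximate.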

I. Identifying benefits
- Precise Rules of Operation: better rules now
- Still problems due to the fear of auditors
- We still have social programs that have only political objectives: Opciones productivas, PET, Acuerdos para el Campo (Vivienda rural, adultos mayores del campo)

II. Measuring impact: how?
- Guidelines for the annual Evaluation of Programs, issued in 2002 by the Ministry of Finance and the Ministry of Audit (Contraloría)
- The guidelines indicate the need to measure impact in every program, every year: very ambitious, but helpful in the short run to accelerate the creation of a culture of evaluation
- Pressure from donors: Progresa, CIMO, Probecat

II. Measuring impact: promoting long-run impact evaluations
- External support, technical and financial (WB, IDB, Conacyt, international academics)
- At least 9 impact evaluations in SEDESOL:
  - Progresa-Oportunidades 1997-2004
  - Liconsa fortified milk
  - Microsimulation: Oportunidades, Liconsa, Diconsa
  - Food program
  - Micro-regions Strategy
  - Habitat
  - Housing program Tu Casa
  - Coahuila State's Piso Firme
  - Jóvenes con Oportunidades

II. Measuring impact: the politics of the evaluation design
- Progresa: a centralized program with relatively little participation from beneficiaries or local authorities, which favoured randomization (see the note below)
- Decentralized programs sometimes should seek other methodologies
- An experimental design requires political support
- We tried to include in the SDL the possibility of randomization, when feasible
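Why randomization is so attractive (a standard result, not stated on the slide): when eligible units are randomly assigned to the program, the control group provides a valid counterfactual, so impact can be estimated by a simple difference in means,

\[ E[\,Y_0 \mid T = 1\,] = E[\,Y_0 \mid T = 0\,] \quad\Rightarrow\quad \widehat{\text{Impact}} = \bar{Y}_{\text{treated}} - \bar{Y}_{\text{control}}. \]

Without randomization, as in many decentralized programs, that equality generally fails, and other methodologies (matching, difference-in-differences, regression discontinuity) are needed to reconstruct the counterfactual.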

III. Information
- From the institutional point of view it is not clear who should pay for the information: the program? Sedesol? Hacienda? Donors?
- Sometimes programs hide information from evaluators
- Next step: la Contraloría

IV. Who should evaluate?
- Presupuesto de Egresos de la Federación: external evaluators
  - National evaluators; international evaluators are sometimes not allowed
- Creation in 2002 of the Under Secretariat of Planning, Prospective and Evaluation
- Social Development Law: external evaluators
  - The SDL explicitly bans consulting firms from evaluating social programs

IV. Who should evaluate? The human capital of external evaluators
- Not enough (good) evaluators, for either impact evaluation or monitoring
- A good evaluator should be able to evaluate a program with the existing ingredients: experiments are rare
- Seminars (impact evaluation, monitoring, qualitative evaluation, power calculations, etc.; see the example below)
- We're promoting partnerships between national and international evaluators
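As an illustration of what the power-calculation seminars cover (a textbook approximation, not a formula from the presentation): the sample size needed per arm to detect a difference \delta in an outcome with variance \sigma^2, at significance level \alpha and power 1 - \beta, is roughly

\[ n \approx \frac{2\,(z_{1-\alpha/2} + z_{1-\beta})^2\,\sigma^2}{\delta^2}, \]

so with \alpha = 0.05 and 80% power this is about 15.7\,\sigma^2/\delta^2 per arm, before any adjustment for clustered assignment.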

IV. Who should evaluate? The bidding process
- The bidding process favours the cheapest proposal
- Public universities don't have to go through the bidding process: monopolies
- Excess transparency
- Changing the external institution every year; annual contracts
- Incentives to present good results in order to evaluate again in the future

V. Monitoring
- There is no obligation to do this
- Indicators demanded by Hacienda, Función Pública, Presidencia, and Congress, with little management purpose: highly inefficient
- International support: WB, IDB
- Create a true results-based management system for every program
- New Dirección General de Evaluación y Monitoreo de Programas Sociales, to promote internally the construction of a monitoring system

VI. Who demands evaluations?
1. International donors
   - IDB support: the need for evaluation
   - Receptive authorities: Levy, Gómez de León
   - Internal battle (técnicos vs. rudos)
2. The opposition in Congress
   - Since 2000, Congress demands annual external evaluations for every public program: Presupuesto de Egresos de la Federación (PEF)
   - In 1999 the PRI was not a majority in Congress
   - The opposition feared the use of social programs for the 2000 election
   - There were good and reasonable academics in Government

VI. Who demands evaluations?
- The Social Development Law (2003-2004) institutionalizes the evaluation process:
  - National Council for the Evaluation of Social Policy
  - Evaluation of programs not only in Sedesol
  - Poverty measurement
- We're including in the Social Development Law (SDL) the obligation to evaluate every new federal program
- Law of Transparency and Public Access to Information
- Democracy

VII. What do we do with the results?
- In the past, almost nothing
- The results were useful to Progresa in order to survive, but they were not useful for everyday management
- A "small" institutional change: our internal indicators
  - % of programs evaluated every year
  - % of external recommendations attended to by programs
- The weaknesses and recommendations (summary) are officially sent to every program manager
- The programs have to give an official answer on what actions they will implement
- The office of internal affairs (contraloría interna) demands to see proof of the actions taken every year

VII. What do we do with the results?
- Evaluations are taken more into consideration
- Better reports by evaluators
- Programs make changes
- Evaluations became this year a tool for the budget process within Sedesol
- We still need to link evaluations more closely (and formally) with the budget process

VIII. The politics of the evaluation process: very important
- If we aim for an objective yet useful M&E system, we need to take stakeholders into account
- We need the participation of the stakeholders involved with evaluations
- The DGEM published internal rules for the evaluation process:
  - Program operators should participate in the TORs, and in analysis and review sessions of the work as it develops
  - Continuous dialogue with external agents
  - DGEM runs the party

Decision making revisited: the Social Program column, with progress marks
- Process: well-being (poverty, health, infrastructure, income, nutrition, perception, education, social capital, satisfaction)
- I. Identifying benefits: still an open question (?)
- II. Measuring impact: what would have happened without the program? The counterfactual.
- III. Information: lots of information; who pays for it?
- IV. Who should evaluate?: the program, external evaluators
- V. Monitoring: it usually does not work; looking for results (?)
- VI. Who demands evaluations?: not clear; public resources
- VII. What do we do with the results?: results, what for? (?)

Conclusions
- Evaluation must be part of the social policy process
- Building a Monitoring and Evaluation system is a political task that requires technical elements
- It is important to institutionalize the process and to take the program's stakeholders into consideration in the evaluation process
- In Mexico, it is crucial to build up the technical abilities of evaluators, policy-makers, and congressional officials

Too many programs, not many evaluations
- Between 1990 and 2002 Mexico spent almost 550 billion dollars on social policy
- There were very few evaluations: Probecat, Liconsa, Diconsa, Progresa
- The market itself doesn't solve the need for evaluation in social programs