MONITORING & EVALUATION FOR CDD OPERATIONS: BEYOND THE RHETORIC & MYTHS
Susan Wong, EASSD/EACIQ
Favorite Top 5 Myths:
1. "CDD will solve all our development problems."
2. "One size fits all, one instrument fits all…"
3. "M + E = PR"
4. "Just push a button and all your information problems will be solved."
5. "Fix the format and that will solve the problem."
Monitoring & Evaluation
Monitoring:
- Project managers learn what works, what doesn't, and why
- Measure progress against workplans
- Provide feedback for real-time decision-making
Evaluation:
- What impact are we having?
- Have we reached the project's stated goals over the long term?
State of M&E
- A Bank review found that only 5-10% of Bank projects had sound evaluation plans, including defined impact indicators and comparison groups (Ezemenari et al., 2000).
- A document review of 34 ongoing EAP/CDD projects found that only 7 projects (roughly 20%) had baselines, comparison groups, and qualitative work.
What Do We Normally Monitor?
Progress against the workplan (inputs, outputs). Examples:
- Are funds being used as planned?
- Are the poor, women, and vulnerable groups participating in the process?
What Do We Usually Measure in CDD for Impact?
Poverty/Welfare Dimensions:
- Has CDD been effective at reducing poverty?
- Does it reach the poor effectively?
Infrastructure:
- Has CDD improved access to services, quality, and utilization?
- Are CDD projects cost-effective compared to other service delivery mechanisms?
Local Governance/Empowerment:
- Do CDD projects promote changes in local governance and empowerment? (transparency, participation, inclusion especially of women and vulnerable groups, accountability, greater demand for improved service delivery, satisfaction with services)
Social Dynamics:
- Increases in social capital, conflict resolution
Issues & Constraints Faced
- Measuring multiple results in CDD (multi-sectoral, measuring empowerment…)
- Lack of in-country specialized skills and capacity, especially for impact evaluations
- Limited government commitment to evaluation
- Timing
- Costs: who foots the bill? Many evaluations are paid through trust funds
- Need to do a better job of using information from M&E; keep the information relevant and flowing
These constraints can be overcome… but it takes effort, some ingenuity… and money!
How Are We Monitoring?
Examples of monitoring mechanisms in some EAP/CDD projects:
- Internal project monitoring, information systems
- Independent monitoring by civil society groups, village committees
- Grievance & complaint resolution mechanisms
- Case studies
- Supervision
Community Participatory Monitoring in Aceh
Training for Independent NGO Monitors
How Do We Evaluate?
Key Guiding Principles in Impact Evaluation:
- Comparison/control groups: the counterfactual
- Baseline data
- Quantitative & qualitative methods
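To make the counterfactual idea concrete, here is a minimal sketch (not part of the presentation; all figures are invented) of how baseline data and a comparison group combine into a difference-in-differences impact estimate:

```python
# Hypothetical difference-in-differences sketch: the impact estimate is the
# change over time in treatment areas minus the change in comparison areas.
# All numbers below are invented for illustration only.

# Mean per-capita expenditure from baseline and follow-up household surveys
baseline = {"treatment": 100.0, "comparison": 98.0}
followup = {"treatment": 120.0, "comparison": 108.0}

change_treatment = followup["treatment"] - baseline["treatment"]      # 20.0
change_comparison = followup["comparison"] - baseline["comparison"]   # 10.0

# The comparison group's change stands in for the counterfactual:
# what would likely have happened in treatment areas without the project.
impact_estimate = change_treatment - change_comparison
print(f"Estimated impact: {impact_estimate:.1f}")  # 10.0
```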
Evaluation Tools
Some tools used for evaluation in EAP/CDD projects:
- Household surveys
- Qualitative case study work
- Cost-effectiveness & EIRR analyses
- External thematic evaluations (procurement, microfinance, quality of infrastructure, etc.)
- Audits
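EIRR analysis finds the discount rate at which the net present value of a project's economic cost-benefit stream is zero. A minimal sketch, assuming invented cash flows rather than any actual project figures:

```python
# Hypothetical EIRR sketch: the economic internal rate of return is the
# discount rate r at which the net present value (NPV) of a project's
# cost-benefit stream equals zero. Cash flows below are invented.

def npv(rate, cash_flows):
    """Net present value of a stream of annual cash flows (year 0 first)."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

def eirr(cash_flows, low=0.0, high=1.0, tol=1e-6):
    """Find the rate where NPV crosses zero, via simple bisection."""
    for _ in range(200):
        mid = (low + high) / 2.0
        if npv(mid, cash_flows) > 0:
            low = mid   # NPV still positive, so the rate can go higher
        else:
            high = mid
        if high - low < tol:
            break
    return (low + high) / 2.0

# Year 0: investment cost; years 1-5: net economic benefits (illustrative only)
flows = [-100.0, 30.0, 40.0, 45.0, 45.0, 40.0]
print(f"EIRR ≈ {eirr(flows):.1%}")
```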
Examples from Indonesia: Kecamatan Development Project (KDP)
- For KDP1, spent $2.4 million from the loan (<1%) over 4 years on M&E; for KDP2, $2.9 million.
- Monitoring: field monitoring, community participatory monitoring, case studies, NGO/journalist independent monitoring, grievance/complaints handling, financial reviews, supervision
- Evaluation: impact survey, audits, special thematic studies (infrastructure, loans, cost-effectiveness, corruption, communications)
- Studies "shine a light on dark areas"
Some KDP Evaluation Findings
- Cost effectiveness: on average 56% less expensive than equivalent works under the Ministry of Public Works
- High rates of return: average EIRR of 39% to 68%
- High-quality infrastructure: a 2004 independent evaluation found that 94% of 108 sampled projects were ranked good or very good in technical quality
- A wider evaluation for KDP2 is due out in May 2005
- Quick disbursing: over the last 4 fiscal years, KDP had on average a 25% higher disbursement ratio than RD projects and a 34% higher disbursement ratio than HD projects
KDP Evaluation Findings (cont.)
- Poverty/welfare impacts: first-stage targeting indicates pro-poor targeting; insufficient data available on poverty targeting within kecamatans. Per capita expenditures in KDP vs. non-KDP areas show increases (preliminary data, Alatas forthcoming).
- Participation: participation in KDP activities shows increases over the years, especially for women in decision-making meetings.
- Corruption: audits show leakage of <2% of project costs.
- High buy-in & satisfaction among communities and central and local governments.
Examples of Using the Information
Example 1: Economic Loan Portfolio
- Evaluations showing poor performance in KDP1 led to a complete redesign for KDP2
Example 2: 2004 Corruption Study
- Randomized interventions for: (a) increasing community participation in monitoring; (b) increasing the probability of external audits
- 600 villages in East/Central Java; road projects
- Methodology and findings to be incorporated into the next project cycle, e.g. written invitations & accounts (a sketch of this kind of random assignment follows below)
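As a rough illustration of what cross-randomizing the two corruption-study interventions across villages involves (the village list, assignment shares, and code are hypothetical, not the study's actual protocol):

```python
# Hypothetical sketch of cross-randomizing two interventions across villages:
# (a) announced external audits, (b) extra written invitations to monitoring
# meetings. Village names and assignment shares are invented for illustration.
import random

random.seed(42)  # fixed seed so the assignment is reproducible

villages = [f"village_{i:03d}" for i in range(600)]
random.shuffle(villages)

assignments = {}
for i, v in enumerate(villages):
    assignments[v] = {
        # Every other village is told in advance it will be audited
        "audit": i % 2 == 0,
        # One third of villages get extra written invitations to meetings
        "invitations": i % 3 == 0,
    }

n_audit = sum(a["audit"] for a in assignments.values())
n_invite = sum(a["invitations"] for a in assignments.values())
print(f"{n_audit} villages assigned to audits, {n_invite} to extra invitations")
```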
Corruption Study (photos courtesy of Ben Olken)
Philippines KALAHI Baseline (courtesy of Rob Chase, SDV)
Phasing:
- 3rd-phase treatment municipalities
- Before the project starts in those municipalities
Sampling:
- 4 survey provinces
- Using existing data, match 2 treatment municipalities and 2 control municipalities in each province
- Visited 2,400 households
- Household survey and barangay official survey
Measuring impact on:
- Poverty, access, empowerment & governance
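A minimal sketch of matching treatment and control municipalities on existing data (the municipalities, the single matching variable, and the nearest-neighbor rule are all invented for illustration, not KALAHI's actual procedure):

```python
# Hypothetical nearest-neighbor matching sketch: for each treatment municipality,
# pick the unused control candidate whose poverty rate is closest. The data are
# invented; an actual design would match on richer existing administrative data.

treatment = {"muni_A": 0.42, "muni_B": 0.35}           # poverty incidence
candidates = {"muni_X": 0.44, "muni_Y": 0.30, "muni_Z": 0.36}

matches = {}
available = dict(candidates)
for t_name, t_poverty in treatment.items():
    # Choose the closest remaining control on the matching variable
    best = min(available, key=lambda c: abs(available[c] - t_poverty))
    matches[t_name] = best
    del available[best]  # each control is used at most once

print(matches)  # {'muni_A': 'muni_X', 'muni_B': 'muni_Z'}
```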
Take-Aways (& what I would have done differently)
- Tap into trust funds! M&E takes an enormous amount of time and different levels of expertise. Hire specialized assistance for certain areas (e.g. impact surveys).
- On MIS, start with the basics and add the bells and whistles later. Keep it as simple as possible.
- Ensure that information flows in multiple directions. Think through the levels of information needs; take the time to find out what those needs are.
- Embed results and M&E findings into management decision-making.