Evaluation and Monitoring Methodologies Strengthening the Legislature – Challenges and Techniques K. Scott Hubli, NDI.

Overview
--General Comments on Monitoring and Evaluation
--Special Considerations in Monitoring and Evaluating Legislative Strengthening Programs
--Practical Tips and Considerations

General Comments on Monitoring and Evaluation
Evaluation (and Baseline Assessments)
--Use to develop program design; use for major course corrections
--More costly and less frequent than monitoring (every two to three years)
--Typically done at the beginning and the end of a program, but often also after a major change in the political landscape (e.g., regime change, settlement of an ethnic conflict)
--Used for accountability to partners, donors, and stakeholders, not for ongoing project management

General Comments on Monitoring and Evaluation
Performance Monitoring
--Ongoing monitoring; used to manage the performance of implementation
--Tracks changes (but with less analysis)
--Informed by the baseline assessment and, if well designed, can reduce future evaluation costs
--May indicate a need for an evaluation or an updated baseline
--Focuses on low-cost, regular data collection (workshop evaluations, information available from parliament, regular focus groups, etc.)

General Comments on Monitoring and Evaluation
Always distinguish among:
--Inputs (e.g., consultants, computers, etc.)
--Outputs (e.g., 40 people trained in a workshop on oversight techniques)
--Outcomes (e.g., increased knowledge of oversight investigation techniques)
--Objectives (e.g., increased oversight hearings)
--Goals (e.g., increased government accountability)
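To make the distinction concrete, here is a minimal sketch (in Python; not an NDI tool) of a results chain in which every indicator is tagged with its level in the hierarchy. The indicator names, baselines, and targets are hypothetical illustrations of the examples on this slide.

```python
# A minimal sketch, not NDI's tooling: tag each indicator with its level
# so a monitoring plan can be checked against the results hierarchy.
from dataclasses import dataclass
from enum import Enum

class Level(Enum):
    INPUT = 1       # e.g., consultants, computers
    OUTPUT = 2      # e.g., 40 people trained in an oversight workshop
    OUTCOME = 3     # e.g., increased knowledge of oversight techniques
    OBJECTIVE = 4   # e.g., increased oversight hearings
    GOAL = 5        # e.g., increased government accountability

@dataclass
class Indicator:
    name: str
    level: Level
    baseline: float
    target: float

# Hypothetical indicators for a legislative strengthening program.
indicators = [
    Indicator("MPs trained in oversight techniques", Level.OUTPUT, 0, 40),
    Indicator("Oversight hearings held per session", Level.OBJECTIVE, 2, 6),
]

# Flag the common pitfall of monitoring outputs only.
if all(i.level == Level.OUTPUT for i in indicators):
    print("Warning: plan measures outputs only, not outcomes or goals")
```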

How are legislative strengthening programs different from other programs with respect to monitoring and evaluation?

Special Considerations in Monitoring and Evaluating Legislative Strengthening Programs
Legislatures are highly complex institutions
--They involve multiple actors seeking to achieve multiple goals simultaneously
--Where possible, disaggregate data (by gender, party, region, etc.; see the sketch below)
--Identify clear goals and targeted groups; watch for unintended consequences
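A minimal sketch of disaggregation, assuming indicator data sits in a pandas DataFrame; the column names ("gender", "party", "region", "score") and the values are hypothetical:

```python
# A minimal sketch: report the same indicator along each dimension
# before quoting a single average, so group differences are not masked.
import pandas as pd

survey = pd.DataFrame({
    "gender": ["F", "M", "F", "M"],
    "party":  ["A", "A", "B", "B"],
    "region": ["North", "South", "North", "South"],
    "score":  [4, 3, 5, 2],   # e.g., self-assessed oversight knowledge, 1-5
})

for dim in ["gender", "party", "region"]:
    print(survey.groupby(dim)["score"].mean())
```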

Special Considerations in Monitoring and Evaluating Legislative Strengthening Programs
Long-term goals, short-term programs
--Resist the tendency to monitor outputs rather than progress toward desired outcomes, objectives, and goals
--Find ways to measure small changes in large goals, or outcomes that can be affected within the project time frame

Special Considerations in Monitoring and Evaluating Legislative Strengthening Programs
Programs focus on process, not outputs
--Avoid raw output measures (e.g., the number of laws passed)
--Emphasize qualitative over quantitative information
--Use detailed process descriptions in establishing baselines
--Use monitoring and evaluation to help strengthen this process and to teach results-based management, where possible ("Monitoring and evaluation should be managed as joint exercises with development partners.")

Special Considerations in Monitoring and Evaluating Legislative Strengthening Programs
Monitoring and evaluation is often highly political
--Involving partners can sometimes further politicize evaluation and monitoring; use caution and judgment
--Can be hard to get necessary information
--Politics may cause people to be less than fully honest
--Results can be used as a political weapon

Special Considerations in Monitoring and Evaluating Legislative Strengthening Programs
Legislatures have natural cycles
--Elections, post-election learning curves, legislative floor periods, recesses, budget processes, etc.
--Example: constituency relations
--Expect uneven development in performance monitoring, but try to attribute fluctuations in the data to these cycles
--Time evaluations carefully; look for "normal" periods

Special Considerations in Monitoring and Evaluating Legislative Strengthening Programs
Many intervening variables
--Economic conditions, geopolitical developments, ethnic conflict, death of a key politician, etc.
--No substitute for nuanced political analysis
--Measure outcomes, objectives, and goals, not just outputs; this can help identify these intervening variables

Special Considerations in Monitoring and Evaluating Legislative Strengthening Programs
Perceptions matter
--Importance of qualitative over quantitative indicators
--Use of focus groups, opinion polls, etc.
--Even anecdotal evidence is useful if it captures a political mood or issue

Special Considerations in Monitoring and Evaluating Legislative Strengthening Programs
Difficulty of comparative benchmarking
--There is only one national legislature; cross-country comparisons are of limited utility
--Comparisons across time are more important; use thorough baselines
--Implications for setting goals and targets: use reasonable, consensus expectations

What are some practical strategies for dealing with these unique aspects of monitoring and evaluating legislative strengthening programs?

Practical Tips & Considerations
General Issues
--Be pragmatic in designing an evaluation or monitoring plan; tie evaluation and monitoring to the purpose or objectives. Avoid evaluation for evaluation's sake. Consider:
  --resource availability for evaluation
  --novelty of the program
  --confidence in program design or implementation
  --needs of the funder
--Budget sufficient resources (costs for legislative strengthening evaluation may exceed those for other program types: soft assistance, a new field, etc.)

Practical Tips & Considerations
Issues in Doing a Baseline
--Limit scope to allow for detailed coverage of program areas
--Protect against biases of person(s) doing the baseline by:
  --using teams
  --using clear, detailed terms of reference
  --incorporating documentary evidence
  --seeking consistency in future assessments

Practical Tips & Considerations
Issues in Doing a Baseline (cont.)
--Pick timing carefully; describe any special circumstances
--Prepare carefully for the baseline assessment team
--Cover the range of stakeholders
--Get out of the capital
--Consider focus groups or creative methods for documenting perceptions and processes (e.g., a sample of 10 legislators tracked periodically over three years)
--Pay attention to protocol; build good will

Practical Tips & Considerations
Using outside evaluators
--Outside evaluators can provide not only objectivity but also insulation from the political consequences of an evaluation
--Combine multiple backgrounds (academic or legislative strengthening specialists, and MPs or staff from similar systems)
--Recognize the value of "time in the trenches"
--Designate a lead person with responsibility for producing the document
--Get a sufficient time commitment

Practical Tips & Considerations
Issues in Performance Monitoring
--Draw on the baseline and prior evaluations
--Design the performance monitoring plan up front; adjust it as the project evolves:
  --imposes discipline; keeps the program on track
  --provides clarity of expectations to partners
--Keep it current; modify as needed
--Make these changes explicit

Practical Tips & Considerations
Issues in Performance Monitoring (cont.)
--Tie monitoring to likely performance issues
--Draw on low-cost existing information sources; data may be more quantitative, with less analysis
--May focus on the outcome level, rather than the objective or goal level
--Consider quarterly or semi-annual monitoring
--Expect, but explain, fluctuations (a minimal sketch follows)
--When you can't explain repeated fluctuations, consider updating the baseline to try to identify issues
--Often done, in part, by those implementing the program
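As a sketch of the "expect, but explain, fluctuations" point: assuming quarterly values for a single hypothetical indicator and a known baseline, flag quarters that deviate beyond a chosen threshold so each gets an explanation (or, if repeated, triggers a baseline update). The numbers and threshold here are illustrative only.

```python
# A minimal sketch: flag quarterly deviations from a baseline that are
# large enough to need an explanation (election cycle? recess? conflict?).
baseline = 4.0                          # e.g., oversight hearings per quarter
quarterly = [4.0, 5.0, 2.0, 6.5, 3.0]   # hypothetical monitoring data
threshold = 0.5                         # flag deviations beyond 50%

for quarter, value in enumerate(quarterly, start=1):
    change = (value - baseline) / baseline
    if abs(change) > threshold:
        print(f"Q{quarter}: {value} ({change:+.0%}) - explain this fluctuation")
```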

Final Thoughts
--Be creative: legislative strengthening is an art, not a science
--Be willing to accept criticism; fight the structural bias toward "spinning" results
--Share lessons learned, both internally and externally