WELCOME
Evaluation Planning and Management
June 2012
John Wooten ~ Lynn Keeys
Session 1: Course Introduction
Objectives:
- Introduce the course, facilitators, and participants
- Set protocols
- Review course objectives and logistics
- Conduct a pre-test
What We Remember
On average, we remember:
- 20% of what we read
- 30% of what we hear
- 40% of what we see
- 60% of what we do
- 90% of what we read, hear, say, and do
"The more you can hear it, see it, say it, and do it, the easier it is to learn."
Colin Rose, Accelerated Learning Action Guide
Rules/Guidelines
- Cell phones OFF, please
- Respect the speaker
- Keep questions relevant to the topic
- Start on time
- Actively participate
Participants' Introductions
- Name
- Office and role
- Level of evaluation planning and management experience (low/medium/high)
- Two course expectations
- How will fulfilling these expectations affect your job/career?
Course Objectives
- USAID's iterative approach to EPM, including new project design and evaluation policy guidance
- Basic terms, concepts, and methodological challenges
- The importance of performance baselines in evaluation
- All phases of EPM
- Planning and managing different types of evaluations
Course Objectives (continued)
- Data collection methods and tools
- Evaluation statements of work
- Data analysis and use
- The importance and content of evaluation follow-up plans for reporting, disseminating, and using evaluation findings
- Common pitfalls in EPM
- Flexible evaluation checklists
Logistics
- Class time
- Breaks
- Parking lot
- Small group areas
- Small group assignments
- Courtesy rules
- Special needs?
- Course evaluation
- Proactive note-taking
Pre-Test
Closed "book and mouth," please
Questions:
- Multiple choice
- Fill in the blanks
- Cross references
Session 2: USAID Evaluation Planning and Management
Objectives:
- Understand why we evaluate
- Review USG policy on evaluation and USAID contexts
- Review highlights of the revised USAID project design and evaluation policies
- Introduce key terms and types of evaluations
- Provide an overview of USAID's program cycle and the context for evaluation
- Review key values to guide evaluation
Why Evaluate?
"People and their managers are working so hard to be sure things are done right that they hardly have time to decide if they are doing the right things."
Stephen R. Covey, author
Why Evaluate?
In the early 1990s, the U.S. Congress found that:
- Waste and inefficiency undermine confidence in government and reduce its ability to address vital public needs
- Federal managers are disadvantaged by insufficient articulation of program goals and inadequate information on performance
- Congress is seriously handicapped by insufficient attention to program performance and results
Why Evaluate? ...It's the law of the land!
Government Performance and Results Act of 1993 (GPRA):
- Holds the entire USG accountable for achieving results
- Focuses on results, service quality, and customer satisfaction
- Requires objective information on effectiveness and efficiency in achieving objectives
- Improves the internal management of the USG
- Requires a strategic plan per agency, with regular performance assessments and program evaluations
http://www.whitehouse.gov/omb/mgmt-gpra/gplaw2m
But Why Else Evaluate?
"Beware the watchman..."
Sir Josiah Stamp
USAID and Donor Evaluation Experiences
USAID:
- Rich performance management and evaluation history and culture
- An evaluation leader among donors
- Quality and leadership slipped over the past decade
- Recent efforts to reclaim leadership as a "learning institution"
Other donors (Paris Declaration on Aid Effectiveness and Accra Agenda for Action):
- Ownership
- Alignment
- Harmonization
- Results
- Mutual accountability
- Inclusive partnerships
- Delivering results
Reinvigorated Project Designs and Evaluations
New Project Design Guidance:
- Inform designs with evidence, supported by analytical rigor
- Promote gender equality and female empowerment
- Strategically apply innovative technologies
- Selectively target and focus on investments with the highest probability of success
- Design with evaluation in mind; rigorously measure and evaluate performance and impact...
Reinvigorated Project Designs and Evaluations
New Project Design Guidance (continued):
- Design with clear sustainability objectives
- Apply integrated, multi-disciplinary approaches
- Strategically leverage or mobilize "solution-holders" and partners
- Apply analytic rigor; use the best available evidence
- Broaden the range of implementing options...
Reinvigorated Project Designs and Evaluations
New Project Design Guidance (continued):
- Incorporate continuous learning for adaptive management (re-examining the analytic basis)
- Implement peer review processes
- Promote collaboration and mutual accountability
- Demonstrate USAID staff leadership in the project design effort
Reinvigorated Project Designs and Evaluations
New Evaluation Policy:
- More and higher-quality evaluations (two types)
- Evidence-based evaluation and decision-making
- Generating knowledge for the development community
- Increased transparency on return on investment
- Evaluation as an integral part of managing for results
- Designing with evaluation in mind
- Building local evaluation capacity...
Reinvigorated Project Designs and Evaluations
New Evaluation Policy (continued):
- At least one opportunity for an impact evaluation per DO
- Evaluating all large projects and all pilot projects
- Thematic or meta-evaluations
- Best affordable evaluation designs
- Collection/storage of quantitative data
Management actions:
- More training
- Evaluation audits
- DEC submissions
- Peer SOW reviews
- Annual evaluation plan
- Evaluation point-of-contact
Reinvigorated Project Designs and Evaluations
"Meaning for ME?" More + More + Much More:
- More aggressive, direct involvement of USAID staff
- More carefully integrated, systemic approach
- Much more rigorous evidence-based planning and decision-making throughout the entire program cycle
- "Unprecedented transparency" (A/AID)
"Meaning for ME?" More + More + Much More
USAID expects you to define and organize your work around the end results you seek to accomplish. This requires:
- Making intended results clear and explicit
- Ensuring agreement among partners, customers, and stakeholders that proposed results are worthwhile (relevant and realistic)
- Organizing your work and interactions to achieve results effectively
Some Key Terms and Definitions
- Evaluation
- Performance indicators
- Performance monitoring
- Performance management / Managing for Results (MFR)
- Evaluation design
- Performance evaluations
- Impact evaluations
- Attribution
- Counterfactual
Types of Evaluations
Performance Evaluation (Normative):
- Reviews performance against agreed standards
- Assesses management structure, performance, and resource use
- Reviews the project design/development hypothesis
- Reviews progress, constraints, and opportunities
- Assesses the likelihood of achieving targets
- Provides notional judgments on the project's perceived value
Evaluation design challenges:
- Clarity/flexibility of the project design
- Appropriateness of a few evaluation questions
Types of Evaluations
Impact Evaluation (Summative):
- Probes and answers cause-effect questions, testing the development hypothesis
- Requires a comparison group (the counterfactual) plus baseline and end-line indicator data; a simple worked example follows this list
- Extrapolates broader lessons and policy implications
Evaluation design challenges:
- Timing
- Internal/external validity (ruling out "noise")
- Availability, adequacy, and comparability of baseline and end-line data
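To make the counterfactual idea concrete, here is a minimal sketch of the difference-in-differences arithmetic that baseline and end-line data make possible. All numbers and variable names are hypothetical, not drawn from any USAID project, and a real impact evaluation would add sampling design and statistical tests; this only shows how the comparison group stands in for the counterfactual.

```python
# Minimal difference-in-differences sketch (hypothetical numbers, not
# real project data): the comparison group's change approximates the
# counterfactual, i.e., what would have happened without the project.

treatment_baseline = 100.0   # outcome indicator, project group, before
treatment_endline = 130.0    # outcome indicator, project group, after
comparison_baseline = 98.0   # outcome indicator, comparison group, before
comparison_endline = 110.0   # outcome indicator, comparison group, after

treatment_change = treatment_endline - treatment_baseline      # 30.0
comparison_change = comparison_endline - comparison_baseline   # 12.0

# Impact attributable to the project = the extra change in the
# treatment group beyond the trend shown by the comparison group.
estimated_impact = treatment_change - comparison_change        # 18.0

print(f"Estimated project impact on the indicator: {estimated_impact:+.1f}")
```

Note what the baselines buy you: with only the end-line gap (130 vs. 110) visible, any pre-existing difference between the groups would be mistaken for impact, which is why the design challenges above stress the availability and comparability of baseline and end-line data.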
Evaluation within the USAID Program Context
USAID Program Cycle overview:
http://www.usaid.gov/our_work/policy_planning_and_learning/documents/ProgramCycleOverview.pdf
Evaluation within the USAID Program Context
- Strategy implementation roadmap
- Project design and implementation roadmap
- Evaluation roadmap
Evaluation within the USAID Program Context
[Diagram slides: the USAID program cycle and the conceptual, analytical, and approval stages of project design; graphics not reproduced in this text version.]
Values for Planning and Managing Evaluations
- Designing for learning
- Best methods
- Local capacity building/reinforcing
- Unbiased accountability
- Participatory collaboration
- Evidence-based decision-making
- Transparency (revisited)
Values for Planning and Managing Evaluations
"Unprecedented transparency..." applies throughout:
- Deciding on an evaluation design
- Disseminating the evaluation report upon completion
- Registration requirement
- Statement of differences
- Standard reporting and dissemination
- DEC submissions
- Data warehousing
34
(jwootenjr@yahoo.com) (lynnkeeys50@yahoo.com) 34 Evaluation Planning and Management John Wooten ~ Lynn Keeys Thank you~