WELCOME
Evaluation Planning and Management, June 2012
John Wooten ~ Lynn Keeys

Session 1: Course Introduction
Objectives:
– Introduce the course, facilitators, participants
– Set protocols
– Overview course objectives and logistics
– Conduct a pre-test

What We Remember
On average, we remember:
– 20% of what we read
– 30% of what we hear
– 40% of what we see
– 60% of what we do
– 90% of what we read, hear, say and do
“The more you can hear it, see it, say it, and do it, the easier it is to learn.” Colin Rose, Accelerated Learning Action Guide

Rules/Guidelines
– Cell phones OFF, please
– Respect the speaker
– Keep questions relevant to topic
– Start on time
– Actively participate

Participants’ Introductions
– Name
– Office and role
– Level of evaluation planning and management experience (Lo-Med-Hi)
– Two course expectations
– How will fulfilling these expectations impact your job/career?

Course Objectives
– USAID’s iterative approach to EPM, including new project design and evaluation policy guidance
– Basic terms, concepts and methodological challenges
– The importance of performance baselines in evaluation
– All phases of EPM
– Planning and managing different types of evaluations

Course Objectives (continued)
– Data collection methods and tools
– Evaluation statements of work
– Data analysis and use
– The importance and content of evaluation follow-up plans for reporting, disseminating and using evaluation findings
– Common pitfalls in EPM
– Flexible evaluation checklists

Logistics
– Class time
– Breaks
– Parking lot
– Small group areas
– Small group assignments
– Courtesy rules
– Special needs?
– Course evaluation
– Proactive note-taking

Pre-Test
Closed “book & mouth”, please
Questions:
– Multiple choice
– Fill in the blanks
– Cross references

Session 2: USAID Evaluation Planning and Management
Objectives:
– Understand why we evaluate
– Review USG policy on evaluation and USAID contexts
– Review highlights of the revised USAID project design and evaluation policies
– Introduce some key terms and types of evaluations
– Overview USAID’s program cycle and context for evaluation
– Review some key values to guide evaluation

Why Evaluate?
“People and their managers are working so hard to be sure things are done right, that they hardly have time to decide if they are doing the right things.” Stephen R. Covey, Author

Why Evaluate?
In the early 1990s, the U.S. Congress found:
– Waste/inefficiency undermine confidence in government and reduce its ability to address vital public needs
– Federal managers disadvantaged due to insufficient articulation of program goals and inadequate info on performance
– Congress seriously handicapped by insufficient attention to program performance and results

Why Evaluate? …It’s the law of the land!
Government Performance and Results Act of 1993 (GPRA):
– Holds the entire USG accountable for achieving results
– Focuses on results, service quality and customer satisfaction
– Requires objective information on effectiveness and efficiency in achieving objectives
– Improves the internal management of the USG
– Requires agency Strategic Plans with regular performance assessments and program evaluations

But Why Else Evaluate?
“Beware the watchman…” Sir Josiah Stamp

USAID and Donor Evaluation Experiences
USAID
– Rich performance management and evaluation history and culture
– Evaluation leader among donors
– Past decade, quality and leadership slipped
– Recent efforts to reclaim leadership as a “learning institution”
Other Donors: Paris Declaration on Aid Effectiveness and Accra Agenda for Action
– Ownership
– Alignment
– Harmonization
– Results
– Mutual accountability
– Inclusive partnerships
– Delivering results

Reinvigorated Project Designs and Evaluations
New Project Design Guidance
– Designs informed by evidence, supported by analytical rigor
– Promote gender equality, female empowerment
– Strategically apply innovative technologies
– Selectively target and focus on investments with highest probability of success
– Design with evaluation in mind, rigorously measure and evaluate performance and impact…

Reinvigorated Project Designs and Evaluations
New Project Design Guidance
– Design with clear sustainability objectives
– Apply integrated/multi-disciplinary approaches
– Strategically leverage or mobilize “solution-holders” and partners
– Apply analytic rigor, utilize best available evidence
– Broaden the range of implementing options…

Reinvigorated Project Designs and Evaluations
New Project Design Guidance
– Incorporate continuous learning for adaptive management (re-examining analytic basis)
– Implement peer review processes
– Promote collaboration and mutual accountability
– Demonstrate USAID staff leadership in the project design effort

Reinvigorated Project Designs and Evaluations
New Evaluation Policy
– More and higher quality evaluations (2 types)
– Evidence-based evaluation and decision-making
– Generating knowledge for the development community
– Increased transparency on return on investments
– Evaluation as an integral part of managing for results
– Designing with evaluation in mind
– Building local evaluation capacity…

Reinvigorated Project Designs and Evaluations
New Evaluation Policy
– At least one opportunity for an impact evaluation per DO
– Evaluating all large and all pilot projects
– Thematic or meta evaluations
– Best affordable evaluation designs
– Collection/storage of quantitative data
Management actions:
– More training
– Evaluation Audits
– DEC submissions
– Peer SOW Reviews
– Annual Evaluation Plan
– Evaluation Point-of-Contact

Reinvigorated Project Designs and Evaluations: More + More + Much More
– More aggressive, direct involvement of USAID staff
– More carefully integrated, systemic approach
– Much more rigorous evidence-based planning and decision-making throughout the entire program cycle
– “Unprecedented transparency” (A/AID)
“Meaning for ME?”

“Meaning for ME?”
USAID expects you to define and organize your work around the end results you seek to accomplish. This requires:
– Making intended results clear and explicit
– Ensuring agreement among partners, customers, and stakeholders that proposed results are worthwhile (relevant and realistic)
– Organizing your work/interactions to achieve results effectively

Some Key Terms and Definitions
– Evaluation
– Performance Indicators
– Performance Monitoring
– Performance Management / Managing for Results (MFR)
– Evaluation Design
– Performance Evaluations
– Impact Evaluations
– Attribution
– Counterfactual

Types of Evaluations: Performance Evaluation (Normative)
– Reviews performance against agreed standards
– Assesses mgmt. structure, performance, resource use
– Reviews project design/development hypothesis
– Reviews progress, constraints and opportunities
– Assesses likelihood of achieving targets
– Provides notional judgments on project’s perceived value
Evaluation design challenges:
– Clarity/flexibility of project design
– Appropriateness of a few evaluation questions
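
A minimal, illustrative sketch of the kind of progress-toward-target check a performance evaluation draws on: comparing actual indicator values against a baseline and target. The indicator names, values and the helper percent_of_target_achieved are hypothetical, not from USAID guidance; the sketch simply assumes numeric baseline, current and target values are available.

```python
# Hypothetical sketch (illustrative data only): gauging progress toward
# a performance target from baseline, current and target values.

def percent_of_target_achieved(baseline, current, target):
    """Share of the planned change (baseline -> target) achieved so far."""
    planned_change = target - baseline
    if planned_change == 0:
        return 100.0
    return 100.0 * (current - baseline) / planned_change

# Illustrative indicator data, not actual project results.
indicators = [
    {"name": "Girls enrolled in primary school", "baseline": 4000, "current": 5200, "target": 6000},
    {"name": "Clinics meeting service standards", "baseline": 12, "current": 15, "target": 30},
]

for ind in indicators:
    pct = percent_of_target_achieved(ind["baseline"], ind["current"], ind["target"])
    print(f'{ind["name"]}: {pct:.0f}% of planned progress toward target')
```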

Types of Evaluations: Impact Evaluation (Summative)
– Probes/answers ‘cause-effect’ questions testing the development hypothesis
– Requires a comparison group (counterfactual), baselines and end-line indicator data
– Extrapolates broader lessons and policy implications
Evaluation design challenges:
– Timing
– Internal/external validity (ruling out “noise”)
– Availability, adequacy, comparability of baseline and end-line data
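
A minimal, illustrative sketch of how a comparison group supplies the counterfactual in an impact evaluation, using a simple difference-in-differences calculation. The group names and numbers are hypothetical, and the estimate rests on the usual parallel-trends assumption rather than any specific USAID method.

```python
# Hypothetical sketch (illustrative numbers only): the comparison group
# supplies the counterfactual trend in a difference-in-differences estimate.

treatment = {"baseline": 40.0, "endline": 55.0}    # group reached by the project
comparison = {"baseline": 42.0, "endline": 48.0}   # similar group not reached

change_treatment = treatment["endline"] - treatment["baseline"]      # 15.0
change_comparison = comparison["endline"] - comparison["baseline"]   # 6.0 = counterfactual trend

# Change attributable to the project, assuming both groups would have
# followed parallel trends in the project's absence.
impact_estimate = change_treatment - change_comparison  # 9.0

print(f"Estimated impact: {impact_estimate:.1f} percentage points")
```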

Evaluation within USAID Program Context
(See: USAID Program Cycle Overview.pdf)

Evaluation within USAID Program Context
– Strategy Implementation Roadmap
– Project Design and Implementation Roadmap
– Evaluation Roadmap

Evaluation within USAID Program Context

Evaluation within USAID Program Context
Conceptual → Analytical → Approval

Evaluation within USAID Program Context

Evaluation within USAID Program Context

Values for Planning and Managing Evaluations
– Designing for Learning
– Best Methods
– Local Capacity Building/Reinforcing
– Unbiased Accountability
– Participatory Collaboration
– Evidence-based Decision-making
– Transparency (revisited)

Values for Planning and Managing Evaluations
– Deciding on an evaluation design
– Disseminating the evaluation report upon completion
– Registration Requirement
– Statement of Differences
– Standard Reporting and Dissemination
– DEC Submissions
– Data Warehousing
“Unprecedented transparency…”

Evaluation Planning and Management
John Wooten ~ Lynn Keeys
Thank you~