Monitoring and Evaluation. Presentation by Kanu Negi, Office of Development Effectiveness (ODE), DFAT, and David Goodwins, Coffey International Development.

Presentation transcript:


Presentation structure

I. Key annual DFAT performance reporting – assessments at multiple levels:
   - Investment/project – Aid Quality Checks
   - Country/regional program – Aid Program Performance Reports
   - Overall Australian aid program – the Performance of Australian Aid report
II. Role of ODE in quality assuring aid performance assessments:
   - Aid program evaluation
   - Independent oversight of performance reporting on the Australian aid program
III. M&E expectations in DFAT's aid program:
   - Investment Design Guidelines
   - DFAT Monitoring and Evaluation Standards
IV. Guidance and tips on program evaluation and program logic, including the program logic used in ODE's trade facilitation evaluation
V. Guidance on dealing with attribution of results to investments

Major steps in Program Evaluation – Program Logic [slide diagram]

Trade Facilitation Program Logic (used in ODE evaluation)

Dealing with the attribution problem

To what extent did the investment lead to changes (positive or negative)? Attribution is used in tandem with contribution, which looks at the confluence of many factors influencing a change (what contribution did the investment make to that change?). Two approaches are often used:
1. The intervention is unpacked and quantitative approaches (e.g. statistical inference) are used to examine the causes of change for specific components.
2. Mixed-method case study approaches are used, such as contribution analysis.
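To make the first (quantitative) approach concrete, here is a minimal sketch of a difference-in-differences comparison, one common form of statistical inference for attribution. The site names, indicator and figures are hypothetical and are not drawn from the presentation.

```python
# Illustrative difference-in-differences sketch: compare the change in an
# outcome indicator for sites that received the investment against sites
# that did not. All data values are hypothetical.

from statistics import mean

# Outcome indicator (e.g. customs clearance time in days), before and after
treated_before = [12.0, 10.5, 11.2, 13.1]
treated_after  = [8.1, 7.9, 9.0, 8.5]
control_before = [11.8, 12.2, 10.9, 11.5]
control_after  = [11.0, 11.6, 10.4, 11.1]

change_treated = mean(treated_after) - mean(treated_before)
change_control = mean(control_after) - mean(control_before)

# The estimate attributes to the investment only the change over and above
# what happened in comparable sites without it.
did_estimate = change_treated - change_control

print(f"Change in treated sites: {change_treated:.2f}")
print(f"Change in control sites: {change_control:.2f}")
print(f"Estimated effect attributable to the investment: {did_estimate:.2f}")
```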

Why is attribution often difficult?

Cause and effect

What effects (outcomes) are caused by DFAT investments?
› Experimental and quasi-experimental designs, including randomised controlled trials – the scientific method – expensive, and work best with "closed systems". They compare cases that received no investment with cases that did.
› Case studies and contribution analysis – use case studies and research to test your program theory (theory of change). Does your program theory hold true in the real world?
› Can the results of your analysis be generalised to other situations?
› What works under what circumstances?
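As a sketch of what "testing your program theory" can look like in practice, the example below walks a hypothetical theory-of-change chain and reports which causal links the case-study evidence supports. The links and evidence ratings are invented for illustration only.

```python
# Hypothetical sketch: check each causal link in a program theory against
# evidence gathered from case studies and monitoring data.

theory_of_change = [
    ("training delivered", "officials apply new procedures"),
    ("officials apply new procedures", "clearance times fall"),
    ("clearance times fall", "trade volumes increase"),
]

# Illustrative evidence ratings for each link
evidence = {
    ("training delivered", "officials apply new procedures"): "supported",
    ("officials apply new procedures", "clearance times fall"): "supported",
    ("clearance times fall", "trade volumes increase"): "no evidence",
}

for cause, effect in theory_of_change:
    rating = evidence.get((cause, effect), "no evidence")
    print(f"{cause} -> {effect}: {rating}")

# A link rated "no evidence" (or contradicted) flags where the program theory
# may not hold in the real world, or where further data collection is needed.
```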

Contribution Analysis

1. Develop a detailed program logic model, clearly articulating assumptions and external influencing factors. Map it out as a systems diagram showing where the investments will contribute. What are the pre-conditions? (See the sketch after this list.)
2. Assess the existing evidence on results – helped by making sure you have a baseline. Break the intervention down into measurable components or cases. Use quantitative and qualitative data (mixed methods) and identify strong and weak evidence.
3. Assess alternative explanations for the results – other factors that could have produced them.
4. Assemble the possible reasons for the performance of the investment – describe how the contribution made could be influencing outcomes. Check assumptions.
5. Seek additional evidence to fill any gaps – e.g. interviews and other qualitative studies.
6. Revise and strengthen the performance (contribution) case study.
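A minimal sketch of step 1, showing one way to record a program logic model so that assumptions and external influencing factors are explicit. The results chain, assumptions and factors are hypothetical, loosely modelled on a trade facilitation investment rather than taken from the ODE evaluation.

```python
# Hypothetical program logic model with explicit assumptions and external factors.

from dataclasses import dataclass, field

@dataclass
class LogicLink:
    """One step in the results chain, with its assumptions and external factors."""
    from_level: str
    to_level: str
    assumptions: list = field(default_factory=list)
    external_factors: list = field(default_factory=list)

program_logic = [
    LogicLink(
        from_level="Outputs: customs staff trained, single-window system piloted",
        to_level="Intermediate outcome: faster, more predictable border clearance",
        assumptions=["trained staff stay in post", "agencies share data"],
        external_factors=["changes in import/export regulations"],
    ),
    LogicLink(
        from_level="Intermediate outcome: faster, more predictable border clearance",
        to_level="End outcome: lower trade costs for exporters",
        assumptions=["transport and port costs do not rise to offset gains"],
        external_factors=["exchange-rate movements", "regional demand"],
    ),
]

for link in program_logic:
    print(f"{link.from_level}\n  -> {link.to_level}")
    print(f"  assumptions to check: {link.assumptions}")
    print(f"  other factors that could explain results: {link.external_factors}\n")
```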

Case study approach to support contribution analysis – the success case method

1. Study the most successful and the least successful cases of an intervention.
2. Use these extremes to work out what works, what doesn't and why. Document the context, mechanism and outcomes for each case.
3. Use the lessons from the most and least successful cases to strengthen the investment and improve the overall results.
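To illustrate the selection step of the success case method, the sketch below ranks cases by an outcome indicator and pulls out the extremes for structured write-ups. The sites, figures and context notes are hypothetical.

```python
# Illustrative success case method: rank cases by an outcome indicator, then
# document context, mechanism and outcome for the most and least successful.

cases = [
    {"site": "Port A", "outcome_change": 0.35, "context": "strong agency buy-in"},
    {"site": "Port B", "outcome_change": 0.05, "context": "high staff turnover"},
    {"site": "Port C", "outcome_change": 0.22, "context": "existing IT systems"},
    {"site": "Port D", "outcome_change": -0.02, "context": "procedures not adopted"},
]

ranked = sorted(cases, key=lambda c: c["outcome_change"], reverse=True)
most_successful, least_successful = ranked[0], ranked[-1]

for label, case in [("Most successful", most_successful),
                    ("Least successful", least_successful)]:
    # In practice each extreme case would get a structured write-up of
    # context, mechanism and outcome, verified through interviews.
    print(f"{label}: {case['site']} (change = {case['outcome_change']:+.2f}), "
          f"context: {case['context']}")
```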