2 Monitoring, Evaluation and Impact Assessment

3 Objectives
Be able to:
- explain basic monitoring and evaluation theory in relation to accountability
- identify what you want to monitor, how and for whom
- identify indicators that can be used to monitor and evaluate your work
- know what tools can be used for monitoring and evaluation, and their limitations and advantages

4 Stakeholders
- Who has an interest in monitoring and evaluation in your organisation?
- Is that interest primarily accountability, quality or learning?
- What are the relationships between these stakeholders?
- What are their key questions for M&E?

5 What is the cause and effect relationship?
- The underlying principle of the logframe structure is cause and effect, i.e. IF... THEN.
- IF we carry out activities THEN we will produce outputs... THEN we will achieve objectives... THEN we will contribute to the overall goal.
- The better the cause and effect linkage, the better the project design.

6 Indicators for different levels of monitoring and evaluation

7 Planning
- Situation analysis/needs assessment
- Aims (goals, wider objectives)
- Objectives (outcomes, purpose)
- Activities (outputs and inputs)
- Plan of action, budget, resources
- Indicators to measure progress
- Baseline information

8 Monitoring (and management)
- Continuous throughout the project
- Strengths and weaknesses: what adjustments need to be made?
- External changes: does the work need to change in response?
- Progress towards achieving outcomes: do activities need to be changed?
- Impact: how is the project affecting different groups in the longer term?
- Critical analysis and feedback

9 Evaluation
- Carried out at significant stages in a project's development
- Looks back at what has been done and draws lessons for the future
- Assesses progress towards outcomes, objectives and goal
- Answers specific questions about effectiveness, efficiency and sustainability, as set out in the terms of reference (TOR)
- May be self-evaluation or external evaluation
- Can be peer evaluation

10 Impact assessment
- Short-term impact
- Longer-term impact
- Intended and unintended impact
- Positive and negative impact
- Attribution vs. contribution
- "Plausible association" of contributions to impact

11 A BOND Approach to Quality Standards in NGOs: Putting Beneficiaries First
- Throughout our research, we consistently heard that NGOs deliver quality when their work is based on a sensitive and dynamic understanding of beneficiaries' realities; responds to local priorities in a way that beneficiaries feel is appropriate; and is judged to be useful by beneficiaries.

12 Downward accountability
- Meaningful participation: is the process, as well as its outputs, empowering?
- Adequate attention to the quality of relationships
- Ongoing learning and reflection: keep debates alive
- Efficient use of resources and minimised costs
- Sustainability and impact (quality does not always produce impact)

13 Tools

14 Methods for collecting and analysing information
- Qualitative approaches: allow a more open-ended and in-depth investigation, but often over a smaller area.
- Quantitative approaches: useful for examining predetermined variables.
- Participatory approaches: allow differences in people's interests, needs and priorities to be recognised, and form the basis for negotiation between stakeholders. They also allow people to benefit from analysing and asserting their own interests.

15 How to choose between methods
- Who needs the information and why?
- What key questions are you trying to answer? What indicators are you monitoring?
- What resources are available to collect and analyse the information?
- Rigour vs. participation
- Critical thinking matters more than the particular tools

16 SMART objectives
- Specific
- Measurable (or monitorable)
- Achievable
- Relevant
- Time-bound

17 Logical Framework Analysis
The logical framework is a tool for:
- organising thinking
- relating activities to expected results
- setting performance objectives
- allocating responsibilities

18 The logical framework matrix

19 Indicators and means of verification
- Set indicators for all levels of the hierarchy
- Show how you will get the information: sources and methods. Are these realistic?

20 What about the influence of uncertain factors?
- Good project design requires uncertain factors to be taken into account
- Assumptions/risks are the uncertainty factors between each level
- Assumptions complete the IF/THEN logic and are set out in the last column of the matrix
- The project team is not responsible for assumptions/risks, but it must monitor changes and report on their implications

21 Complementary approaches
- Most Significant Change: no indicators; reflect on what happened and which changes were significant
- Outcome Mapping: focus on changes in the behaviour and relationships of "boundary partners"

22 Impact: What Change Did We Make? Most Significant Change
- Each person tells a story
- Draw out common themes
- Pick a story that is representative
- Share the stories

23 Put it into practice for your project
- Who are the main stakeholders?
- What is the main focus: learning, quality or accountability?
- Select one outcome and one activity from your project
- Identify some indicators
- What M&E tools could be used?
- How will the resulting information be used?
