 Meaning  Purpose  Steps in evaluation  Models of evaluation  Program evaluators and their role

Presentation transcript:

 A process of making a judgement on the worth of an implemented program.  The judgement is made by comparing what is observed (the evidence) with a standard or criterion.

The meaning of evaluation is further strengthened by the following characteristics: a. A continuous process, carried out from the beginning, midway, and at the final stage. b. A learning process for the participants involved.

c. A process of measuring performance, through which strengths and weaknesses are identified. d. Performance can be measured quantitatively and qualitatively. e. An ideal model of evaluation covers input, output, and impact.

1. To assess the achievement of objectives:  Data are collected on the performance of the program.

The data are analysed and compared with the stated objectives of the program. The result of the comparison is expressed as a level of achievement, e.g. 80% or 90% of the objective achieved. The result is used for follow-up activities.
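As a rough illustration (not part of the lecture), the comparison of observed performance against a stated target can be sketched in a few lines of Python. The function name and the enrolment figures below are invented for the example:

```python
# Hypothetical sketch: comparing observed performance with a stated target.
def achievement(observed, target):
    """Percentage of the objective achieved, capped at 100%."""
    return min(observed / target, 1.0) * 100

# Illustrative figures: objective was to enrol 200 participants; 160 enrolled.
print(f"{achievement(160, 200):.0f}% of the objective achieved")
```

In practice the "observed" and "target" values come from the criteria and indicators defined when the focus of the evaluation was set (Step 1).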

2. As proof of budget/resource utilisation: Most CD programs receive budgets from sponsors (NGOs, institutions). Program participants must be accountable for the budgets used.

The change brought about by the program should be demonstrated. Evaluation results are one way to account for the budget/resources used. The evaluation result is submitted to the sponsor.

3. Evaluation as a data bank: Evaluation needs data that are gathered continuously. Data gathered through, e.g., surveys and observations are kept in a “data bank” from which they can be retrieved when needed.

A good evaluation should be based on up-to-date data, not obsolete data. An example of a data bank is a Department of Statistics, which continuously collects data on the various sectors of development. Digital facilities (computers) make it easier to manage data for development.

4. Evaluation as a Strategy in Management  Planning and implementation are both management activities.  Management requires careful use of resources.  Management needs ongoing information about the program.

 Management answers questions such as: - Is the program formulated according to the problems and interests of the community? - Which activities should be prioritised? - What action should be taken if there is a natural calamity?

5. Evaluation as a Strategy for Program Improvement  Through evaluation, the weaknesses of the program become known.  A weakness is the gap between the present status and the desired status.

6. Evaluation as a basis for follow-up activities: Results of evaluation are used to plan future activities of the program. They are also used to reform policy in order to achieve better impact, and for replication purposes (running a similar program in a different community or locality).

7. Evaluation as a means of gaining recognition

Steps in Evaluation 1. Define the focus of evaluation 2. Collect data (evidence) 3. Analyse data and make judgements 4. Report the results of the evaluation

Step 1: Define the focus of evaluation. Answer the following questions: What is the objective of the evaluation? What criteria, and which indicators of each criterion, are to be used?

What are the data (evidence), and what are their sources? Who are the evaluators (internal or external)? How will the result be reported, and to whom?

Step 2: Collect the data (evidence). This is done after the criteria and their indicators are known. Data are collected using the same techniques as in situational analysis or research, e.g. surveys, observation, and document reviews.

 Sources of data: These depend on the objectives of the evaluation. If the objective is to assess the impact of a balanced-diet program among children, then the sources of data are the children and their mothers or parents.

Participants of the program; program facilitators/CD workers/social workers/implementers; relevant documents such as meeting minutes.

Step 3: Analyse the data and make judgements. The basis of analysis and judgement is a comparison of the present status of the program with what it ought to be. This is done one by one on the criteria, or on the indicators within each criterion.

E.g., Balanced Diet program: Criterion – participation. Indicators – i. attendance at meetings; ii. active participation; iii. giving feedback.
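A minimal sketch of how indicator scores can be combined into a criterion score. The indicator values, the 0-to-1 scale, and the unweighted average are all illustrative assumptions, not part of the lecture:

```python
# Hypothetical sketch: scoring the "participation" criterion from its indicators.
# All values are invented; each indicator is scored on a 0..1 scale.
indicators = {
    "attendance at meetings": 0.9,  # e.g. share of sessions attended
    "active participation":   0.6,  # e.g. observer rating
    "gives feedback":         0.5,  # e.g. share of participants giving feedback
}

# Simple unweighted average as the criterion score
participation_score = sum(indicators.values()) / len(indicators)
print(f"Participation criterion score: {participation_score:.2f}")
```

A real evaluation might weight the indicators differently, or judge each one qualitatively against a stated standard instead of averaging.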

Analysis and judgement can be done quantitatively and qualitatively. Examples of qualitative criteria: i) Appropriateness – is the program in line with the problems and needs? - Is it easy or difficult to follow? - Does it fit the mandate of the organisation?

ii) Effectiveness – how well are the objectives achieved? - Impact on income or other indicators - Impact on the community's psychological change, such as attitude, awareness, and knowledge

iii) Efficiency – was the program implemented within the planned duration? Was it delayed or ahead of schedule? What is the ratio of input to output?

iv) Significance – what does the program mean to the community or organisation? Is it commensurate with the resources used?

Step 4: Report the Evaluation Results  Everyone involved (the participants) has the right to know the evaluation results  Reports take various forms – academic (journal articles, papers) and non-academic (bulletins, news through the mass media)

Reports are channelled to departments or ministries for policy formulation, especially results that need immediate action. Some reports are prepared for sponsors, following their specified reporting requirements.