Community RESOURCE DEVELOPMENT (DCE3411)
Associate Prof. Dr. Roziah Mohd Rasdi
Dept. of Professional Development & Continuing Education
Faculty of Educational Studies, Universiti Putra Malaysia
roziah_m@upm.edu.my

EVALUATION IN CD PROGRAM

MEANING
Evaluation is a process of making a judgement on the worth of an implemented program. The judgement is made by comparing what is seen or observed (evidence) with a standard criterion.

The meaning of evaluation is further strengthened by the following characteristics:
- It is a continuous process, from the beginning, midway, and at the final stage.
- It is a learning process for the participants involved.

- It is a process of measuring performance, so strengths and weaknesses are identified.
- Performance can be measured quantitatively and qualitatively.
- An ideal model of evaluation involves input, output, and impact.

PURPOSE
To see the achievement of objectives
- Data are collected on the performance of the program.
- The data are analysed and compared with the stated objective of the program.

- The result of the comparison is stated, e.g. 90% or 80% of the objective is achieved.
- The result is used for follow-up activities.
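As a minimal sketch of how such a comparison could be computed (the target and observed figures below are hypothetical, not from any actual program):

    # Sketch: state the result of the comparison as a percentage of the
    # objective achieved. All figures are made-up examples.
    def percent_achieved(observed, target):
        """Share of the stated objective that the evidence shows was achieved."""
        return 100.0 * observed / target

    # Hypothetical objective: 200 children adopt a balanced diet; 160 did.
    print(f"{percent_achieved(160, 200):.0f}% of the objective is achieved")  # -> 80%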

As proof of budget/resource utilization
- Most CD programs receive budgets from sponsors (NGOs, institutions).
- Program participants must be accountable for the budgets used.

- Change brought about by the program should be proven; evaluation results are one way to prove the utilization of the budget/resources.
- The evaluation result is submitted to the sponsor.

Evaluation as a data bank
- Evaluation needs data that are gathered continuously.
- Data gathered through, e.g., surveys and observations are kept in a "data bank" from which they can be retrieved when needed.

- A good evaluation should be based on up-to-date data, not obsolete data.
- An example of a data bank is the Department of Statistics, which continuously collects data on the various sectors of development.
- Digital facilities (computers) facilitate the management of data for development.
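To make the "data bank" idea concrete, here is a minimal sketch, assuming records are kept in a JSON file; the file name, record fields, and values are all hypothetical:

    # Minimal "data bank" sketch: records collected over time (surveys,
    # observations) are appended to a JSON file and retrieved when needed.
    import json
    from pathlib import Path

    BANK = Path("cd_program_databank.json")  # assumed file name

    def add_record(record):
        data = json.loads(BANK.read_text()) if BANK.exists() else []
        data.append(record)
        BANK.write_text(json.dumps(data, indent=2))

    def load_records():
        return json.loads(BANK.read_text()) if BANK.exists() else []

    add_record({"year": 2017, "source": "survey", "attendance_rate": 0.80})
    print(load_records())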

Evaluation as a strategy in management
- Planning and implementation are both management activities.
- Management needs careful use of resources.
- Management needs ongoing information about the program.

Management answers questions such as:
- Is the program formulated according to the problems and interests of the community?
- What activities should be prioritised?
- What should be done if there is a natural calamity?

Evaluation as a strategy for program improvement
- Evaluation reveals the weaknesses of the program.
- A weakness is the gap between the present status and the desired status.

Evaluation as a basis for follow-up activities
- Results of evaluation are used for future activities of the program.
- They are also used to reform policy in order to bring better impact.
- They serve duplication purposes (a similar program in a different community and locality).
Evaluation as a means to get recognition

Steps in Evaluation
1. Define the focus of evaluation
2. Collect data (evidence)
3. Analyse the data and make a judgement
4. Report the result of the evaluation

Step 1: Define the focus of evaluation
Answer the following questions:
- What is the objective of the evaluation?
- What criteria, and which indicators within each criterion, are to be used?

- What are the data (evidence), and what are the sources of the data?
- Who are the evaluators (internal or external)?
- How is the result reported, and to whom?

Step 2: Collect the data (evidence)
- This is done after the criteria and their indicators are known.
- Data are collected using the same techniques as in situational analysis or research, e.g. surveys, observation, and document reviews.

Sources of data
1. Depend on the objectives of the evaluation. If the objective is to see the impact of a balanced-diet program among children, the sources of data are the children and their mothers or parents.
2. Participants of the program.
3. Program facilitators, CD workers, social workers, and other implementers.
4. Relevant reports, such as meeting minutes.

Step 3: Analyse the data and make a judgement
- The judgement is made by comparing the present status of the program with what it ought to be.
- This is done one by one over the criteria, or the indicators within each criterion, as in the example below.

e.g. Balanced Diet program
Criterion – Participation
Indicators:
i. Attendance at meetings
ii. Active participation
iii. Giving feedback
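A minimal sketch of such indicator-by-indicator judgement, assuming made-up standards and observed values for the participation criterion above:

    # Sketch: compare observed evidence against a standard, one indicator at
    # a time. All standards and observed values are hypothetical examples.
    standards = {"attendance rate": 0.75, "active participation": 0.50, "feedback given": 0.60}
    observed = {"attendance rate": 0.80, "active participation": 0.45, "feedback given": 0.70}

    for indicator, standard in standards.items():
        verdict = "meets standard" if observed[indicator] >= standard else "below standard"
        print(f"{indicator}: {observed[indicator]:.2f} vs {standard:.2f} -> {verdict}")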

Analysis and judgement can be done quantitatively and qualitatively. Examples of qualitative criteria:
i) Appropriateness – according to problems and needs
- Easy or difficult to follow
- According to the mandate of the organization

ii) Effectiveness – how well the objectives are achieved
- Impact on income or other indicators
- Impact on the community's psychological change, such as attitude, awareness, and knowledge

iii) Efficiency – is the program implemented within the planned duration? Delayed or faster? What is the ratio of input to output?
iv) Significance – to the community or the organization? Is it commensurate with the resources used?
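For the efficiency criterion (iii), a minimal sketch of the input-output ratio and the schedule comparison; the cost, value, and duration figures are illustrative assumptions:

    # Sketch: efficiency as an output/input ratio plus planned-vs-actual
    # duration. All figures are hypothetical.
    input_cost = 50_000    # total resources spent
    output_value = 65_000  # estimated value of the program's outputs
    planned_months, actual_months = 12, 14

    print(f"Output/input ratio: {output_value / input_cost:.2f}")  # > 1 suggests efficient use
    print(f"Schedule: {actual_months - planned_months:+d} months vs plan")  # positive = delayed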

Step 4: Report the evaluation results
- Everyone (every participant) has the right to know the evaluation results.
- Reports take various forms – academic (journal articles, papers) and non-academic (bulletins, news through the mass media).

- Reports are channeled to departments or ministries for policy formulation, especially results that need immediate action.
- Some reports are made for sponsors, following the sponsor's reporting specifications.

MODELS OF EVALUATION

Hierarchical Criteria Model (classical model by Bennett, 1976)

Input
- All physical and non-physical resources, including human resources (participants).
- Indicators involved in evaluating this criterion: total resources used, maintenance of the resources, participants' skills in using the resources, how the resources are mobilized, etc.
- Inputs are the prime movers of any program.

Activity
- Evaluate activities at all stages: initiation, implementation, and evaluation.
- Activities listed in the plan of work or calendar of activities are used.
- Judgement on the activities is given, e.g. in the form "satisfactory" or "excellent".

Participation
- Total number involved.
- Pattern of involvement.
- Continuity – continuous or seasonal.

Reaction
- Response and acceptance of the people, shown by their commitment and interest.

Cognitive and Affective Change
- Shown by changes in their interest, values, and attitudes.

Skills Change
- Impact as a result of cognitive and affective changes, especially involvement in the use of technological innovations in the CD program.
- e.g. better use of hydroponic farming; proper use of computers for information sharing in the villages.
- Skills change is more difficult to measure and takes a longer time.

Final Outcome
- Achievement of objectives, at the end of the program.

Context, Input, Process and Product (CIPP) Model by Stufflebeam (2000)
Context
- Sees the appropriateness of the program based on its situation, such as environmental characteristics and the community's problems.
- Seen at the macro level; the historical background of the area is relevant.
- The basis for the other types of evaluation (input, process, and product).

Input
- Sees the handling of inputs, including human resources, activities and their sequence, support services, and budget use.
- Micro level; makes use of the calendar of activities.
- Includes input-output analysis.

Process
- Also called on-going evaluation, formative evaluation, monitoring, or operational evaluation.
- Objectives: to identify weaknesses, to predict the results of implementation activities, and to find remedies for the weaknesses.
- Needs on-going staff to do the evaluation; data/information are collected both formally and informally.

Product
- Normally called final evaluation or summative evaluation.
- Measures the achievement of program objectives.
- The effectiveness of the context, input, and process evaluations affects the product evaluation.
- Product evaluation tells the level of achievement; process and input evaluation explain why that level was achieved.
- An overall evaluation should look at all four aspects of CIPP.

Internal Program Evaluators
- Planners, implementers, and all who are involved in the program.
- The whole community.

Advantages
- They know the ins and outs of the program (they have experienced it), including its weaknesses and strengths.
Disadvantages
- May be biased, highlighting only the good points of the program.

External Evaluators (Consultants)
- Someone who comes from outside the program.
- A specialist in the area who knows the subject matter very well.

Advantages
- Very objective.
- Capable of assessing critical issues.
Disadvantages
- May give extreme results.
- Does not experience the practical side of the program.
- Dependent on documents.
- High cost.