EVALUATION APPROACHES
Heather Aquilina
24 March 2015

2 Aim of Today
Different frameworks for viewing evaluation approaches (M&E, formative/summative, Owen's forms, belief systems) and data collection types.
Different approaches: what they are, when to use them, and why.
Questions.

3 Evaluation and Monitoring
Evaluation – a structured process of assessing how well a project or program has met its goals, and of drawing out lessons learnt.
Monitoring – setting targets and milestones to measure progress and achievements, and to check whether inputs are producing the planned outputs.

4 Formative and Summative Evaluation
Formative (process) – assesses the value of a project as it is being implemented, to check its implementation and to improve its design and performance. Uses both qualitative data (how participants felt about the process) and quantitative data.
Summative (impact/outcome) – focuses on whether a project's goals have been achieved. More concerned with quantitative data.

5 Evaluation Forms (Owen)

Form               | Use                         | Examples
Proactive (Design) | Synthesis                   | Needs assessment, best practice review
Clarificative      | Clarification               | Program logic
Interactive        | Improvement, implementation | Action research, developmental
Monitoring         | Justification, fine tuning  | PI, systems analysis
Impact             | Accountability              | Performance audit

Source: John M. Owen, Program Evaluation: Forms and Approaches.

6 Belief About The World

Positivism: Random trials, SMART, Program Logic
Realism: Realist
Interpretivism / Constructivism: Participatory evaluation, Action research, Needs assessment, Focus groups

7 Methodology
By methodology:
– Document analysis
– Program data
– Interviews
– Focus groups
– Surveys
– Descriptive and inferential statistics (see the sketch below)
– Observation
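Since the list above includes descriptive and inferential statistics, a minimal sketch may help make that concrete. The example below tests whether participant scores changed after a hypothetical program; the data are invented for illustration, and the sketch assumes SciPy is available.

    from scipy import stats

    # Hypothetical matched pre- and post-program scores for the
    # same ten participants (invented data for illustration).
    pre = [62, 70, 55, 68, 74, 60, 65, 58, 72, 66]
    post = [68, 75, 60, 70, 80, 64, 66, 63, 78, 70]

    # Paired t-test: is the mean change in scores statistically
    # distinguishable from zero?
    result = stats.ttest_rel(post, pre)
    print(f"t = {result.statistic:.2f}, p = {result.pvalue:.4f}")

A paired test is used here because the same participants are measured twice; comparing two independent groups would call for stats.ttest_ind instead.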

8 Data Collection - Qualitative
Descriptions of problems, behaviours or events, and of people's thoughts and opinions about their experiences, attitudes and beliefs.
Pros – more depth and insight into the "why" and "how" of attitudes and behaviours; clarifies quantitative data.
Cons – time consuming to capture and analyse; more subjective and harder to summarise and compare systematically; generally viewed as less reliable because of that subjectivity; tends to yield smaller sample sizes.
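A common first step in analysing qualitative data is coding responses into themes and tallying how often each theme appears. The Python sketch below is a minimal illustration of that tallying step; the responses and theme labels are hypothetical.

    from collections import Counter

    # Hypothetical coded interview responses: each response has been
    # assigned one or more theme labels by the analyst.
    coded_responses = [
        {"id": "R1", "themes": ["access", "staff support"]},
        {"id": "R2", "themes": ["staff support"]},
        {"id": "R3", "themes": ["access", "cost"]},
        {"id": "R4", "themes": ["cost", "staff support"]},
    ]

    # Tally how often each theme appears across all responses.
    theme_counts = Counter(
        theme for response in coded_responses for theme in response["themes"]
    )

    for theme, count in theme_counts.most_common():
        print(f"{theme}: {count} of {len(coded_responses)} responses")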

9 Data Collection - Quantitative
Counts or frequencies, rates or percentages, or other statistics.
Pros – easy to administer; can cover a large number of questions; can yield large samples; easier to summarise; more widely accepted as a form of evidence.
Cons – not as rich or detailed as qualitative methods; surveys/written questionnaires may be difficult for some participants; results may be misread without context; large amounts of data may require more sophisticated analysis approaches.
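The counts, percentages and summary statistics this slide refers to are straightforward to compute. The sketch below uses only the Python standard library; the satisfaction ratings are hypothetical.

    import statistics
    from collections import Counter

    # Hypothetical 1-5 satisfaction ratings from a program survey.
    ratings = [5, 4, 4, 3, 5, 2, 4, 5, 3, 4]

    # Frequencies and percentages for each rating value.
    counts = Counter(ratings)
    for value in sorted(counts):
        pct = 100 * counts[value] / len(ratings)
        print(f"rating {value}: {counts[value]} responses ({pct:.0f}%)")

    # Descriptive statistics summarising the sample as a whole.
    print(f"mean = {statistics.mean(ratings):.2f}")
    print(f"standard deviation = {statistics.stdev(ratings):.2f}")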

10 Data Collection – Mixed Methods
Combines quantitative and qualitative data, maximising the strengths and minimising the weaknesses of both.
Pros – can improve research results; the methods complement each other and address a broader range of questions; provides insights that do not appear when a single method is used.
Cons – evaluators need to be experienced in both types; the different methods need to be combined appropriately, which requires more time and money.

11 Approaches
– Literature review
– Program Logic
– Case Studies
– SMART
– Outcome Mapping (OM)
– Action Learning/Research
– Realist
– Mixed Methods
– Appreciative Enquiry
– Most Significant Change (MSC)
– Participatory Evaluation
– Performance Indicators
– Social Return on Investment (SROI)

12 Literature Review
Overview of the theory and the research literature. Seeks to describe, summarise, evaluate, clarify and/or integrate the content of primary reports.
Useful for:
– design
– identifying key issues/factors
– justifying the questions, framework and method
– understanding the study

13 Program Logic
Describes how a program is intended to work.
Pros – clarifies the theory of action and its underlying assumptions.
Cons – may not be appropriate for complex programs; may restrict improvement.

14 Case Studies
Focuses on a particular unit: a person, a site, a project. Often uses a combination of quantitative and qualitative data.
Particularly useful for:
– understanding how different elements fit together
– understanding how those elements (implementation, context and other factors) have produced the observed impacts

15 SMART
Specific, Measurable, Attainable, Relevant and Time-bound; for example, "increase program completion rates by 10% within 12 months".
An outcome (summative) approach, useful in program evaluations. Records the achieved outcomes or consequences of a program. Measures program effectiveness, cost effectiveness, appropriateness and efficiency.

16 Outcome Mapping (OM)
Measures behavioural change in stakeholders. Collects data on the immediate, basic changes that lead to longer-term, more transformative change.
Useful for programs that work through development and influence rather than a direct service model.

17 Action Learning/Research
A learning-by-doing process: analysis of actions feeds into a reflective process that leads to improvements.
Gathers and shares information about what works and what doesn't as the project/program operates, making improvements along the way rather than waiting until the end.

18 Realist
Asks: what works, for whom, in what respects, to what extent, in what contexts, and how?
Four key linked concepts: mechanism, context, outcome pattern, and context-mechanism-outcome pattern configuration.
Data collection includes interviews, focus groups and questionnaires.

19 Mixed Methods
Combines a number of approaches in one evaluation study.
Pros – different types of data complement and support each other.
Cons – risks being carried out ad hoc, with the methods not matched to the evaluation rationale.

20 Appreciative Enquiry
Focuses on strengths rather than weaknesses: identifies what is working well and seeks to improve performance and conditions.
Pros – encourages people to produce their best.
Cons – may lead to problems being ignored.

21 Most Significant Change (MSC)
Generates and analyses personal accounts of change, then decides which of these accounts is the most significant, and why.
Pros – useful for showing HOW change comes about and WHEN (in what situations and contexts); produces compelling stories.
Cons – expensive in resourcing and time; only addresses contribution to outcomes, not cost effectiveness or appropriateness.

22 Participatory Evaluation
Stakeholders share in deciding why the evaluation is being done, how it is done, who evaluates, what is evaluated, and for whom it is done.
Pros – more useful, more fair, more valid.
Cons – expensive in time and resources; difficult to manage; open to bias; can become an intervention rather than an evaluation.

23 What to choose when
– the evaluation question
– the program stage
– the resources available
– the skills of the evaluator
– your personal philosophical view

24 Thank you
Any questions?