INTRODUCTION TO UTILIZATION-FOCUSED EVALUATION
SLEVA, Colombo, June 6, 2011
Facilitators: Sonal Zaveri and Chelladurai Solomon, IDRC Consultants
Assisted by Nilusha (LirneAsia) and Malathi (TESA)

Agenda
1. Understanding UFE better
2. Validation of the preliminary analysis of the KEQs
3. Identification of the intended outcomes of the program
4. Definition of the required data
5. Selection of appropriate data collection methods

Exercise – previous experience (1/4)
Suppose that at the beginning of this project you had the required resources and total freedom to implement, or not implement, a formal evaluation plan. What factors would have discouraged you from implementing the evaluation plan? Why?

Exercise – previous experience (2/4)
What would have motivated you to implement the evaluation plan? Why?

Exercise – evaluation utility (3/4)
Do you think that this evaluation process can contribute to major decision-making or to program improvement? How?

Exercise – critical dates (4/4)
Can you think of any milestones that could be critical or useful to keep in mind for major decision-making throughout the project under evaluation?

What we have accomplished so far…
1. A first draft of the KEQs that seems useful to guide the remainder of the evaluation process.
2. The first 6 steps of the UFE checklist have been covered.
3. The process has been well documented up to this point.

KEQ Validation Analysis
One row per KEQ (#1 to #4); the columns of the validation matrix are:
- Key Evaluation Question
- Related primary intended use
- KEQ category
- Does the KEQ comply with the desired KEQ features?
- Related specific program objective
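
One practical way to work through this matrix with the primary users is to hold it in a small data structure and tick each KEQ off as it is validated. Below is a minimal sketch in Python; the field names simply mirror the column headers above and are illustrative, not part of the UFE checklist itself.

```python
# Illustrative KEQ validation matrix: one record per key evaluation
# question, with fields mirroring the column headers above.
keq_matrix = [
    {
        "keq": f"KEQ #{i}",
        "question": "",                   # the key evaluation question itself
        "primary_intended_use": "",       # the intended use the KEQ serves
        "category": "",                   # e.g. "outcomes", "process", "quality"
        "meets_keq_features": None,       # set to True/False during validation
        "related_program_objective": "",  # the specific objective it maps to
    }
    for i in range(1, 5)
]

def pending_validation(matrix):
    """Return the KEQs whose desired-features check is still pending."""
    return [row["keq"] for row in matrix if row["meets_keq_features"] is None]

print(pending_validation(keq_matrix))  # ['KEQ #1', 'KEQ #2', 'KEQ #3', 'KEQ #4']
```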

The trajectory of change…
INPUT / RESOURCES → ACTIVITIES → OUTPUTS → OUTCOMES → IMPACT / RESULTS
(Control and prediction diminish as the chain moves from inputs toward impact.)

Focusing on outcomes (1/17)
DESIRED/EXPECTED OUTCOMES
The desired or expected outcomes that would result from the program under evaluation.
- What are you trying to achieve with your program?
- What types of change do you want to see in the program participants in terms of behaviour, attitude, knowledge, skills, status, etc.?

Focusing on outcomes (2/17)
DESIRED/EXPECTED OUTCOMES
Specific objectives | Outcomes (what do you want to achieve?) | Type of change
Project objective #1 | Outcome #1 | X
Project objective #2 | Outcome #2 | Y
Project objective #3 | Outcome #3 | X, Y, Z

Focusing on outcomes (3/17)
DETAILS OF DATA COLLECTION
What data do you need in order to answer the KEQs?

Focusing on outcomes (4/17)
DETAILS OF DATA COLLECTION
One row per KEQ (#1 to #4); the columns of the worksheet are:
- Key Evaluation Question
- Required data
- Other considerations for the evaluation

Focusing on outcomes (5/17)
DETAILS OF DATA COLLECTION
What methods could be used to collect the required data?

Focusing on outcomes (6/17)
DETAILS OF DATA COLLECTION
1. There is no magic key to tell you the most appropriate method to answer your KEQ.
2. All methods have limitations, so try using a combination of methods.
3. Each type of question suits specific approaches/methods, so let them guide you. Other factors to consider: time, cost, resources, knowledge.
4. Primary users should be the ones to determine what constitutes credible evidence; they should feel comfortable with the selected methods and the collected data.
Adapted from Dart, 2007.

Focusing on outcomes (7/17)
DETAILS OF DATA COLLECTION
COMPATIBILITY BETWEEN METHODS AND QUESTION CATEGORIES
- Impact: Contribution Analysis / data trawl & expert panel / GEM
- Outcomes: Outcome Mapping (OM) / Most Significant Change (MSC) / GEM
- Approach/model: comparative studies of different approaches
- Process: evaluation study (interview process, focus groups)
- Quality: audit against standards, peer review
- Cost-effectiveness: economic modeling
Adapted from Dart, 2007.
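
The compatibilities above can also be written down as a simple lookup table so that, once a KEQ has been categorized, the candidate methods can be shortlisted at a glance. Below is a minimal sketch in Python, assuming the categories and methods exactly as listed above; it is a convenience for the facilitation team, not part of Dart's material.

```python
# Candidate data collection methods per KEQ category,
# following the compatibility list above (adapted from Dart, 2007).
METHODS_BY_CATEGORY = {
    "impact": ["Contribution Analysis", "Data trawl & expert panel", "GEM"],
    "outcomes": ["Outcome Mapping", "Most Significant Change", "GEM"],
    "approach/model": ["Comparative studies of different approaches"],
    "process": ["Interview process", "Focus groups"],
    "quality": ["Audit against standards", "Peer review"],
    "cost-effectiveness": ["Economic modeling"],
}

def candidate_methods(category):
    """Return the candidate methods for a KEQ category (case-insensitive)."""
    return METHODS_BY_CATEGORY.get(category.strip().lower(), [])

print(candidate_methods("Outcomes"))
# ['Outcome Mapping', 'Most Significant Change', 'GEM']
```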

Focusing on outcomes (8/17)
DETAILS OF DATA COLLECTION – METHODS SUMMARY (1/3)
- Contribution Analysis: seeks evidence of links between a given activity and an outcome, in order to show change trends that have resulted from an intervention. Does not attempt to show linear causality.
- Data trawl: search and analysis of data from dispersed literature in order to identify relationships between activities and outcomes.
- GEM (Gender Evaluation Methodology): links gender and ICT through relevant indicators.

Focusing on outcomes (9/17)
DETAILS OF DATA COLLECTION – METHODS SUMMARY (2/3)
- Outcome Mapping: focuses on mid-term outcomes, suggesting that in the best-case scenario these outcomes will lead to long-term impact in a non-linear way.
- Most Significant Change: seeks to identify the most significant changes based on participants' stories.
- Expert panels: a group of experts is invited to comment on and analyze outcomes and how they relate to possible impacts.

Focusing on outcomes (10/17)
DETAILS OF DATA COLLECTION – METHODS SUMMARY (3/3)
- Comparative studies of different approaches: self-explanatory.
- Interview process: interviews on how participants experienced the process of the project under evaluation.
- Focus groups: self-explanatory.
- Audit against standards: this might refer to a comparative analysis against specific standards.
- Peer reviews: self-explanatory.
- Economic modeling: not sure what this method refers to.

Focusing on outcomes (11/17)
DETAILS OF DATA COLLECTION
Given the primary intended USES of the evaluation, do you think that the results that will be obtained with these methods will be:
- Credible (accurate)?
- Reliable (consistent)?
- Valid (true, believable and correct)?

Focusing on outcomes (12/17)
DETAILS OF DATA COLLECTION
Do you think that these methods are:
- Cost-effective?
- Practical?
- Ethical?

Focusing on outcomes (13/17)
DETAILS OF DATA COLLECTION
Do you think that you will be able to use the results obtained by the selected methods according to the purposes and intended uses that you defined earlier in the process?

Focusing on outcomes (15/17)
DETAILS OF DATA COLLECTION
Who will do the data collection? How will you, as primary users, be involved in the data collection?

Focusing on outcomes (16/17)
DETAILS OF DATA COLLECTION
Will the data collection be based on a sample? How do you think the sampling should be done? Who will do it?
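
If the primary users decide to sample rather than collect data from every participant, a simple random draw is the most transparent starting point. Below is a minimal sketch using only Python's standard library; the participant register and sample size are hypothetical placeholders, and the fixed seed keeps the draw reproducible so the primary users can verify how the sample was selected.

```python
import random

def draw_sample(participants, n, seed=2011):
    """Draw a reproducible simple random sample of n participants."""
    rng = random.Random(seed)  # fixed seed: the draw can be re-run and verified
    return rng.sample(participants, n)

# Hypothetical register; replace with the program's actual participant list.
participants = [f"participant_{i:03d}" for i in range(1, 201)]
sample = draw_sample(participants, n=30)
print(len(sample), sample[:3])
```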

Focusing on outcomes (17/17)
DETAILS OF DATA COLLECTION
Who will manage and analyze the collected data? How will you, as primary users, be involved in data management and analysis?

References
Patton, M.Q. (2008). Utilization-Focused Evaluation (4th ed.). Sage.
Dart, J. (2007). "Key evaluation questions." Presentation at the Evaluation in Practice Workshop, Kuala Lumpur, December 2007.