Facilitating UFE step-by-step: a process guide for evaluators

Facilitating UFE step-by-step: a process guide for evaluators. Module 1: Steps 1-3 of the UFE checklist. Joaquín Navas & Ricardo Ramírez, December 2009. License: https://creativecommons.org/licenses/by-sa/4.0/

Note to evaluator… The overall purpose of this presentation is to guide evaluators through the completion of steps 1 to 3 of the UFE checklist. The main goal is to suggest a process that can help evaluators present the UFE principles and select primary users. (PLEASE READ THE NOTES SECTION OF THE DIFFERENT SLIDES). Please adapt this presentation to the context of the project that you are evaluating and to your facilitation style. From the UFE checklist, this activity covers: Task 3 of Step 1;

Agenda: UFE – what is it, and what for? Stakeholder identification. Role definition. Recess. Steps 1-3: are we ready for UFE?

Program evaluation is… Systematic collection of information about the activities, characteristics, and results of programs to make judgments about the project / network, improve or further develop project / network effectiveness, inform decisions about future programming, and / or increase understanding.

The ideal evaluation conditions According to the American Evaluation Association, these are some of the ideal conditions for program evaluation, which are seldom met (Patton, 2008:198): The program's goals are clear, specific, and measurable. The evaluator has ready access to all necessary data and enthusiastic cooperation from all necessary people. The evaluator's role is clear and accepted. There are adequate resources and sufficient time to conduct a comprehensive and rigorous evaluation. The evaluation can be implemented as originally designed and proposed.

Trends in the literature on evaluation CONTEXT: context as a factor explaining use (this includes organizational and project culture, time and resource constraints, physical and social conditions). DIFFERENT TYPES or LEVELS OF USE: from the individual to the organizational level. ROLE OF THE EVALUATOR: diversified to include facilitation, planning and training.

A collaborative approach means… Maintenance of an ongoing focus on LEARNING. Clarification of ROLES and EXPECTATIONS. Creation of spaces for DIALOGUE. JOINT FOCUS on all issues being investigated. Attention to the VALIDATION of findings. Joint INTERPRETATION of results.

Engaging users contributes to Personal LEARNING among them. More CONFIDENCE and direct APPLICATION of evaluation findings to program practices. A reduced POWER DIFFERENTIAL between evaluators and program practitioners. More NEGOTIATED DECISION MAKING and learning.

Utilization focused evaluation is… A decision-making framework for enhancing the utility and actual use of evaluations. (Patton, 2008a: slide 9)

Utilization focused evaluation is… A PROCESS for helping primary intended users select the most appropriate content, model, methods, theory, and uses for their particular situation. However, UFE does not advocate any particular evaluation content, model, method or theory. Situational responsiveness guides the interactive process between evaluator and primary intended users. Evaluation done for and with specific, intended primary USERS for specific, intended USES. “USES” refers to the way real people in the real world APPLY evaluation findings and experience the evaluation PROCESS. (Patton, 2008: 37 - Ch.2)

UFE: KEY POINTS What is unique in this project in terms of: context, the role of the evaluators, and collaborative approaches? Facilitator notes: engage with participant experiences; outline the principles that underlie UFE; the photos are meant to be changed by participants; the last slide focuses on take-home messages.

Premises of UFE Evaluations should be JUDGED by their utility and ACTUAL USE. No evaluation should go forward unless and until there are primary intended users who will use the information that can be produced. Primary intended users are involved in the process. Evaluation is part of initial program design. The primary intended users want information to help answer a question or questions. The evaluator's role is to help intended users clarify their purpose and objectives. Implications for use are part of every decision throughout the evaluation; use is the driving force of the evaluation process. (Patton, 2008a)

UFE in 12 steps 1. Project/network readiness assessment. 2. Evaluator readiness and capability assessment. 3. Identification of primary intended users. 4. Situational analysis. 5. Identification of primary intended uses. 6. Focusing the evaluation. 7. Evaluation design. 8. Simulation of use. 9. Data collection. 10. Data analysis. 11. Facilitation of use. 12. Meta-evaluation.
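
To make the 12-step sequence easier to track across workshop sessions, here is a minimal sketch of the checklist as a simple data structure, using Python purely as an illustration. Only the step names come from the slide above; the tracking function and completion flags are hypothetical additions, not part of Patton's checklist.

```python
# Illustrative sketch only: the 12 UFE steps as a progress checklist.
# The completion-tracking logic is a hypothetical aid.
UFE_STEPS = [
    "Project/network readiness assessment",
    "Evaluator readiness and capability assessment",
    "Identification of primary intended users",
    "Situational analysis",
    "Identification of primary intended uses",
    "Focusing the evaluation",
    "Evaluation design",
    "Simulation of use",
    "Data collection",
    "Data analysis",
    "Facilitation of use",
    "Meta-evaluation",
]

progress = {step: False for step in UFE_STEPS}

def complete(step_number):
    """Mark a step (1-based, matching the slide) as done."""
    progress[UFE_STEPS[step_number - 1]] = True

# Module 1 of this guide covers steps 1-3:
for n in (1, 2, 3):
    complete(n)

print([s for s in UFE_STEPS if progress[s]])
```

A flip chart or plain table serves the same purpose in a live workshop; the point is simply that each step is a discrete, checkable unit.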

Conventional Evaluation vs. Developmental Evaluation
Conventional: Render definitive judgment of success or failure. Developmental: Provide feedback, generate learnings, support changes in direction.
Conventional: Position the evaluator outside to assure independence and objectivity. Developmental: Position evaluation as an internal process that integrates team functions into action.
Conventional: Design the evaluation based on linear cause-and-effect logic models. Developmental: Design evaluation to capture system dynamics, interdependencies, and emergent interconnections.
Conventional: Aim to produce generalizable findings across time and space. Developmental: Aim to produce context-specific understandings.
Conventional: The evaluator determines the design based on the evaluator's perspective about what is important. Developmental: The evaluator collaborates with those engaged in the change effort to design the evaluation process.

References
Gamble, J. (2008) Developmental evaluation. Montreal: McConnell Foundation.
Henry, G.T. & Mark, M.M. (2003) Beyond use: Understanding evaluation's influence on attitudes and actions. American Journal of Evaluation, 24(3): 293-314.
McConnell Foundation. (2006) Sustaining social innovation: Developmental evaluation. www.mcconnellfoundation.ca/default.aspx?page=139 (accessed 11 Feb 2009).
Patton, M.Q. (2008) Utilization focused evaluation, 4th edition. Sage.
Patton, M.Q. (2008a) Utilization focused evaluation. Presentation to the AEA.
Shulha, L. & Cousins, J. (1997) Evaluation use: Theory, research, and practice since 1986. Evaluation Practice, 18(3): 195-208.

Comments / Questions Allow some time to discuss the information presented to this point.

Think back on your best evaluation experience… To what extent was it compatible with UFE? Review it in terms of: The USERS: were they identified? The specific USES: were they clear? User ENGAGEMENT: how and why?

Who are the stakeholders that are, or need to be, involved in this project? PAUSE on this slide for 20-30 minutes and allow participants to identify the key stakeholders of the project (more actors can be added at a later stage). It is also useful to group the identified actors into stakeholder groups. If required, you can also do a more detailed stakeholder analysis that will look at levels of power and influence, but you will need more time for this. From the UFE checklist, this activity covers: Task 3 of Step 1; All tasks of Step 3.

ROLES (1 of 3) EVALUATOR: Person or organization responsible for facilitating/leading the design, implementation, and utilization of the evaluation. Associated roles: organizational development agent, educator, coach/mentor, strategic planner, etc.

ROLES (2 of 3) PRIMARY USER: People who will use and apply the findings of the evaluation. Patton (2008) suggests the following profile: 1. Interested. 2. Knowledgeable. 3. Open minded. 4. Represents an important interest group. 5. Has credibility. 6. Teachable. 7. Available for ongoing interaction throughout the evaluation process.
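
Since the seven attributes work as a screening rubric for candidate users, a small sketch may make the idea concrete. This is an illustrative assumption, not Patton's method: the criteria come from the slide above, but the yes/no scoring and the function are hypothetical.

```python
# Hypothetical screening sketch: count how many of Patton's seven
# primary-user attributes a candidate meets.
CRITERIA = [
    "interested",
    "knowledgeable",
    "open minded",
    "represents an important interest group",
    "has credibility",
    "teachable",
    "available for ongoing interaction",
]

def screen(name, ratings):
    """Return (name, number of criteria met); ratings maps criterion -> bool."""
    return name, sum(bool(ratings.get(c)) for c in CRITERIA)

# Example with a made-up candidate who meets every criterion:
print(screen("project coordinator", {c: True for c in CRITERIA}))
```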

ROLES (3 of 3) AUDIENCE INTERESTED IN THE REPORT: Actors interested in the unfolding and findings of the evaluation.

From the stakeholders that were identified, who plays what role? PAUSE on this slide for 15-20 minutes and allow participants to assign key UFE roles (evaluator, primary user and audience interested in the report) to the different stakeholders that were identified. From the UFE checklist, this activity covers: Task 3 of Step 1; All tasks of Step 3.
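
For facilitators who want a quick record of this exercise, here is an illustrative sketch of the role assignment as a simple mapping; all stakeholder names are hypothetical placeholders, and the three roles come from the preceding slides.

```python
# Hypothetical record of the role-assignment exercise. Unassigned roles
# surface the "Who is missing?" question on the next slide.
ROLES = ("evaluator", "primary user", "audience interested in the report")

assignments = {
    "external consultant": "evaluator",
    "project coordinator": "primary user",
    "donor agency": "audience interested in the report",
}

missing = [role for role in ROLES if role not in assignments.values()]
print("Unfilled roles:", missing)
```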

Who is missing?

BREAK

We don’t want to end up in this situation, so…

How well prepared do we feel to adopt UFE as the methodology for evaluating this project? This slide is just to prompt the audience with the question, which only needs to be answered after the facilitator talks about UFE traps or temptations (next three slides). From the UFE checklist, this activity covers: Task 1 of Step 1.

UFE traps or temptations (1 of 2) Evaluators making themselves the primary decision makers. Identifying vague, passive audiences as users. Targeting organizations as users. Focusing on decisions instead of decision makers. Assuming the evaluation's funder is the primary stakeholder.

UFE traps or temptations (2 of 2) Waiting until the findings are in to identify intended users and intended uses. Taking a stance of standing above the messiness of people and politics. Being co-opted by powerful stakeholders. Identifying primary intended users but not involving them meaningfully. (Patton, 2008, adapted from p. 90 - Ch.3)

Under what conditions… Are the traps real? What can you do from the START to minimize the risk of falling into them? How have you addressed the risk of falling into these traps? What other traps or temptations have you faced?

How well prepared do we feel we are to adopt UFE as the methodology for evaluating this project? From the UFE checklist, this activity covers: Task 1 of Step 1.

What resources do we require in order to implement a UFE plan? Is the program willing/ready to allocate the required resources? From the UFE checklist, this activity covers: Task 2 of Step 1.

Are all involved parties supportive of adopting a UFE approach? What can be done to increase such support? From the UFE checklist, this activity covers: Task 4 of Step 1.

What could the main challenges of this evaluation initiative be? Do we feel well-enough prepared to face these challenges? From the UFE checklist, this activity covers: ALL tasks of Step 2.

Do the evaluators feel prepared to have their effectiveness judged by the use that primary intended users actually make of the evaluation findings? From the UFE checklist, this activity covers: Task 3 of Step 2.

Conclusions and next steps NOTE: Tasks 1 and 2 of UFE Step 2 need to be done as a reflection exercise when writing the report of this session.