Facilitating UFE step-by-step: a process guide for evaluators
Joaquín Navas & Ricardo Ramírez
December 2009
Module 1: Steps 1-3 of UFE checklist

Presentation transcript:


Agenda
1. UFE – What is it? What for?
2. Stakeholder identification
3. Role definition
4. Recess
5. Steps 1-3: Are we ready for UFE?

Program evaluation is… the systematic collection of information about the activities, characteristics, and results of programs in order to make judgments about the project/network, improve or further develop its effectiveness, inform decisions about future programming, and/or increase understanding.

The ideal evaluation conditions
According to the American Evaluation Association, these are some of the ideal conditions for program evaluation, which are seldom met (Patton, 2008: 198):
1. The program's goals are clear, specific, and measurable.
2. The evaluator has ready access to all necessary data and enthusiastic cooperation from all necessary people.
3. The evaluator's role is clear and accepted.
4. There are adequate resources and sufficient time to conduct a comprehensive and rigorous evaluation.
5. The original evaluation proposal can be implemented as designed.

Trends in the literature on evaluation
CONTEXT: context as a factor explaining use (this includes organizational and project culture, time and resource constraints, and physical and social conditions).
DIFFERENT TYPES or LEVELS OF USE: from the individual to the organizational level.
ROLE OF THE EVALUATOR: diversified to include facilitation, planning, and training.

A collaborative approach means…
Maintenance of an ongoing focus on LEARNING.
Clarification of ROLES and EXPECTATIONS.
Creation of spaces for DIALOGUE.
JOINT FOCUS on all issues being investigated.
Attention to the VALIDATION of findings.
Joint INTERPRETATION of results.

Engaging users contributes to…
Personal LEARNING among users.
More CONFIDENCE in, and direct APPLICATION of, evaluation findings to program practices.
A reduced POWER DIFFERENTIAL between evaluators and program practitioners.
More NEGOTIATED DECISION MAKING and learning.

Utilization focused evaluation is… A decision-making framework for enhancing the utility and actual use of evaluations. (Patton, 2008a: slide 9)

Utilization focused evaluation is…
A PROCESS for helping primary intended users select the most appropriate content, model, methods, theory, and uses for their particular situation. UFE does not advocate any particular evaluation content, model, method, or theory; situational responsiveness guides the interactive process between evaluator and primary intended users.
Evaluation done for and with specific, intended primary USERS for specific, intended USES. "USES" refers to the way real people in the real world APPLY evaluation findings and experience the evaluation PROCESS.
(Patton, 2008: 37, Ch. 2)

UFE: KEY POINTS
What is unique in this project in terms of:
1. Context
2. The role of the evaluators
3. Collaborative approaches

Premises of UFE
1. Evaluations should be JUDGED by their utility and ACTUAL USE.
2. No evaluation should go forward unless and until there are primary intended users who will use the information it can produce.
3. Primary intended users are involved in the process.
4. Evaluation is part of initial program design; the primary intended users want information to help answer a question or questions.
5. The evaluator's role is to help intended users clarify their purpose and objectives.
6. Implications for use are part of every decision throughout the evaluation; use is the driving force of the evaluation process.
(Patton, 2008a)

UFE in 12 steps
1. Project/network readiness assessment.
2. Evaluator readiness and capability assessment.
3. Identification of primary intended users.
4. Situational analysis.
5. Identification of primary intended uses.
6. Focusing the evaluation.
7. Evaluation design.
8. Simulation of use.
9. Data collection.
10. Data analysis.
11. Facilitation of use.
12. Meta-evaluation.

Conventional Evaluation vs. Developmental Evaluation
Conventional: Render a definitive judgment of success or failure. Developmental: Provide feedback, generate learnings, support changes in direction.
Conventional: Position the evaluator outside to assure independence and objectivity. Developmental: Position evaluation as an internal process that integrates team functions into action.
Conventional: Design the evaluation based on linear cause-and-effect logic models. Developmental: Design the evaluation to capture system dynamics, interdependencies, and emergent interconnections.
Conventional: Aim to produce generalizable findings across time and space. Developmental: Aim to produce context-specific understandings.
Conventional: The evaluator determines the design based on the evaluator's perspective about what is important. Developmental: The evaluator collaborates with those engaged in the change effort to design the evaluation process.

References
Gamble, J.A.A. (2008) A developmental evaluation primer. Montreal: The J.W. McConnell Family Foundation.
Henry, G.T. & Mark, M.M. (2003) Beyond use: Understanding evaluation's influence on attitudes and actions. American Journal of Evaluation, 24(3).
McConnell Foundation. (2006) Sustaining social innovation: Developmental evaluation. Retrieved 11 Feb 2009 from www.mcconnellfoundation.ca/default.aspx?page=139
Patton, M.Q. (2008) Utilization-focused evaluation, 4th edition. Thousand Oaks, CA: Sage.
Patton, M.Q. (2008a) Utilization focused evaluation. Presentation to the American Evaluation Association (AEA).
Shulha, L. & Cousins, J. (1997) Evaluation use: Theory, research, and practice since 1986. Evaluation Practice, 18(3).

Comments / Questions

Think back to your best evaluation experience… To what extent was it compatible with UFE? Review it in terms of:
The USERS: were they identified?
The specific USES: were they clear?
User ENGAGEMENT: how and why?

Who are the stakeholders that are, or need to be, involved in this project?

ROLES (1 of 3)
EVALUATOR: Person or organization responsible for facilitating and leading the design, implementation, and utilization of the evaluation. Tasks may include acting as organizational development agent, educator, coach/mentor, strategic planner, etc.

ROLES (2 of 3)
PRIMARY USER: People who will use and apply the findings of the evaluation. Patton (2008) suggests the following profile:
1. Interested.
2. Knowledgeable.
3. Open-minded.
4. Represents an important interest group.
5. Has credibility.
6. Teachable.
7. Available for ongoing interaction throughout the evaluation process.

ROLES (3 of 3)
AUDIENCE INTERESTED IN THE REPORT: Actors interested in the unfolding and findings of the evaluation.

Of the stakeholders that were identified, who plays what role?

Who is missing?

BREAK

We don’t want to end up in this situation, so…

How well prepared do we feel for adopting UFE as the methodology to evaluate this project?

UFE traps or temptations (1 of 2)
Evaluators making themselves the primary decision makers.
Identifying vague, passive audiences as users.
Targeting organizations as users.
Focusing on decisions instead of decision makers.
Assuming the evaluation's funder is the primary stakeholder.

UFE traps or temptations (2 of 2)
Waiting until the findings are in to identify intended users and intended uses.
Taking a stance of standing above the messiness of people and politics.
Being co-opted by powerful stakeholders.
Identifying primary intended users but not involving them meaningfully.
(Adapted from Patton, 2008, Ch. 3)

Under what conditions are the traps real? What can you do from the START to minimize the risk of falling into them? How have you addressed the risk of falling into these traps? What other traps or temptations have you faced?

How well prepared do we feel we are to adopt UFE as the methodology to evaluate this project?

What resources do we require in order to implement a UFE plan? Is the program willing/ready to allocate the required resources?

Are all involved parties supportive of adopting a UFE approach? What can be done to increase such support?

What could the main challenges of this evaluation initiative be? Do we feel well enough prepared to face these challenges?

Do the evaluators feel prepared to have their effectiveness judged by the use that primary intended users make of the evaluation findings?

Conclusions and next steps