Evaluation Collaboration: Opportunities and Challenges
Michael Quinn Patton, Utilization-Focused Evaluation
19 May 2010
Evaluating the Haiti Response: Encouraging Improved System-wide Collaboration
A joint OECD/DAC Evalnet-UNEG-ALNAP Roundtable Meeting

VISION and PERCEPTION
THOMAS’ THEOREM: What is perceived as real is real in its consequences.

Degrees of working together
1. Networking: sharing information and ideas.
2. Cooperating: helping distinct members accomplish their separate individual goals.
3. Coordinating: working separately on shared goals.
4. Collaborating: working together toward a common goal but maintaining separate resources and responsibilities.
5. Partnering: shared goals, shared decisions, shared resources within a single entity.


Please number a sheet of paper from 1 to 3.

Question 1. What is the current state of working together in humanitarian evaluation (the current baseline)?
0. Not at all working together
1. Networking: sharing information and ideas
2. Cooperating: helping distinct members accomplish their separate individual goals
3. Coordinating: working separately on shared goals
4. Collaborating: working together toward a common goal but maintaining separate resources and responsibilities
5. Partnering: shared goals, shared decisions, shared resources within a single entity

Question 2. From your perspective, what is the desired level of working together for the Haitian relief evaluation that should be the goal of this meeting?
0. Not at all working together
1. Networking: sharing information and ideas
2. Cooperating: helping distinct members accomplish their separate individual goals
3. Coordinating: working separately on shared goals
4. Collaborating: working together toward a common goal but maintaining separate resources and responsibilities
5. Partnering: shared goals, shared decisions, shared resources within a single entity

Question 3. At what level of working together are you personally committed on behalf of your organization?
0. Not at all working together
1. Networking: sharing information and ideas
2. Cooperating: helping distinct members accomplish their separate individual goals
3. Coordinating: working separately on shared goals
4. Collaborating: working together toward a common goal but maintaining separate resources and responsibilities
5. Partnering: shared goals, shared decisions, shared resources within a single entity
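The three questions above amount to a short baseline survey on the 0-5 working-together continuum. As a purely illustrative sketch (not part of the original presentation), the Python snippet below shows one way such ratings could be tallied to summarize the room's current baseline, desired level, and commitment; the response values and question labels are hypothetical.

# Illustrative only: tally hypothetical ratings on the 0-5 working-together continuum.
from collections import Counter
from statistics import median

LEVELS = ["Not at all", "Networking", "Cooperating",
          "Coordinating", "Collaborating", "Partnering"]

# Hypothetical responses, one list per survey question.
responses = {
    "Q1 current baseline": [1, 2, 2, 3, 1, 2],
    "Q2 desired level": [4, 4, 3, 5, 4, 4],
    "Q3 personal commitment": [3, 4, 3, 4, 2, 4],
}

for question, ratings in responses.items():
    counts = Counter(ratings)
    distribution = ", ".join(f"{LEVELS[level]}: {counts[level]}" for level in sorted(counts))
    print(f"{question}: median {median(ratings)} ({distribution})")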

Evaluation Framework
The WORKING TOGETHER continuum as an evaluation framework for evaluating the Haitian relief evaluation.
Potentially different degrees of working together for different evaluation purposes.

Diverse Types of Evaluation
Different uses for different primary intended users.

Accountability
Were funds used appropriately? Each organization is accountable to its own stakeholders for basic accountability. What can be done together?
Criteria for accountability, for example:
* What is appropriate use of relief funds?
* What is timely relief?
* What is coordinated relief?

Other Evaluation Purposes
* Improvement of relief efforts in the short term
* Learning and sharing lessons
* Development of Haiti in the long term
* Overall judgment about Haitian relief

Speaking Truth to Power

Shake Hands with the Devil by Roméo Dallaire

New Directions for Evaluation
Enhancing Disaster and Emergency Preparedness, Response, and Recovery Through Evaluation
Edited by Liesel Ashley Ritchie and Wayne MacDonald

New Directions for Evaluation
1. Enhancing Disaster and Emergency Preparedness, Response, and Recovery Through Evaluation (Liesel Ashley Ritchie and Wayne MacDonald)
2. Real-Time Evaluation in Humanitarian Emergencies (Emery Brusset, John Cosgrave, and Wayne MacDonald)
3. The Interagency Health and Nutrition Evaluation Initiative in Humanitarian Crises: Moving From Single-Agency to Joint, Sectorwide Evaluations (Olga Bornemisza, André Griekspoor, Nadine Ezard, and Egbert Sondorp)

4. Save the Children’s Approach to Emergency Evaluation and Learning: Evolution in Policy and Practice (Megan Steinke-Chase and Danielle Tranzillo)
5. Logic Modeling as a Tool to Prepare to Evaluate Disaster and Emergency Preparedness, Response, and Recovery in Schools (Kathy Zantal-Wiener and Thomas J. Horwood)

6. Evolution of a Monitoring and Evaluation System in Disaster Recovery: Learning from the Katrina Aid Today National Case Management Consortium (Amanda Janis, Kelly M. Stiefel, and Celine C. Carbullido)
7. Disasters, Crises, and Unique Populations: Suggestions for Survey Research (Patric R. Spence and Kenneth A. Lachlan)
8. Evaluation of Disaster and Emergency Management: Do No Harm, But Do Better (Liesel Ashley Ritchie and Wayne MacDonald)

FOCUS and PRIORITIES
Utilization-Focused Evaluation lessons
1. Less is more: the dangers and delusions of comprehensiveness. Quality first.
2. Ask important questions: better to get weaker data on important questions than harder data on unimportant questions.
3. Stay focused on use: actionable findings for intended uses by primary intended users.

Understanding and Creating CONTEXT
“Social scientists need to recognize that individual behavior is strongly affected by the context in which interactions take place rather than being simply a result of individual differences.”
Elinor Ostrom, 2009 Nobel Prize in Economics for her analysis of economic governance, especially the commons.

The Central Role of Trust
“Crucial role of trust among participants as the most efficient mechanism to enhance transactional outcomes…. Empirical studies confirm the important role of trust in overcoming social dilemmas.”
Reference: American Economic Review 100 (June 2010).

The Central Role of Trust
“Thus, it is not only that individuals adopt norms but also that the structure of the situation generates sufficient information about the likely behavior of others to be trustworthy reciprocators who will bear their share of the costs of overcoming a dilemma.”

Elinor Ostrom’s Conclusion
“Extensive empirical research leads me to argue that … a core goal of public policy should be to facilitate the development of institutions that bring out the best in humans. We need to ask how diverse polycentric institutions help or hinder the innovativeness, learning, adapting, trustworthiness, levels of cooperation of participants, and the achievement of more effective, equitable, and sustainable outcomes at multiple scales.”

Evaluation Theory of Change
Process Use: beyond findings’ use
How an evaluation is conducted has an impact beyond the findings that come from the evaluation. For example, evaluation questions carry messages about what matters, what’s important.


Lessons on effective evaluation collaborations
Research on:
* Shared Measurement Platforms
* Comparative Performance Systems, and
* Adaptive Learning Systems
The research was based on six months of interviews and research by FSG Social Impact Advisors. They examined 20 efforts to develop shared approaches to performance, outcome, or impact measurement across multiple organizations.
Reference: Kramer, Mark, Marcie Parkhurst, & Lalitha Vaidyanathan (2009). Breakthroughs in Shared Measurement and Social Impact. FSG Social Impact Advisors.

Eight factors important to effective evaluation collaborations
1. Strong leadership and substantial funding throughout a multi-year development period.
2. Broad engagement in the design process by many organizations in the field, with clear expectations about confidentiality or transparency.
3. Voluntary participation open to all relevant organizations.
4. Effective use of web-based technology.

Eight factors important to effective evaluation collaborations
5. Independence from funders in devising indicators and managing the system.
6. Ongoing staffing to provide training and facilitation and to review the accuracy of all data.
7. Testing and continually improving the system through user feedback.
8. In more advanced systems, a facilitated process for participants to gather periodically to share results, learn from each other, and coordinate their efforts.

Adaptive Learning Systems
* The most important lesson learned: the power of breakthroughs to promote a systemic and adaptive approach to solving social problems.
* Adaptive Learning Systems offer a new vision that goes beyond capacity building for individual organizations.

Adaptive Learning Systems
* Breakthroughs offer ways to increase the efficiency, knowledge, and effectiveness of the entire system of interrelated organizations that affect complex interactions and working together.
* Adaptive Learning Systems provide a collaborative process for all participating organizations to learn, support each other’s efforts, and improve over time.

Adaptive Learning Systems
“We believe that shared measurement systems can help move the sector beyond fragmented and disconnected efforts… by creating a new degree of coordination and learning that can magnify the impact of funders and grantees alike.”

Adaptive Learning Systems
Attentive to complexity concepts and understandings:
* COMPLEX DYNAMIC INTERACTIONS
* EMERGENCE
* ADAPTABILITY
* CO-EVOLUTION
* DEALING WITH UNCERTAINTY
* SYSTEMS CHANGE
* INNOVATION
For more on complexity and aid, see Ramalingam et al. (2008), Exploring the science of complexity, ODI Working Paper.

Developmental Evaluation: Applying Complexity Concepts to Enhance Innovation and Use

Elinor Ostrom on Complexity
“To explain the world of interactions and outcomes occurring at multiple levels, we also have to be willing to deal with complexity instead of rejecting it. Some mathematical models are very useful for explaining outcomes in particular settings. We should continue to use simple models where they capture enough of the core underlying structure and incentives that they usefully predict outcomes. When the world we are trying to explain and improve, however, is not well described by a simple model, we must continue to improve our frameworks and theories so as to be able to understand complexity and not simply reject it.”

Major Capacity Development Purpose Question
Is the collaborative system to be designed only for Haitian relief, or is it to become the foundation for future evaluations of disaster relief?
What’s the VISION for the system of evaluation collaboration being built?

Results of the Baseline Survey
THOMAS’ THEOREM: What is perceived as real is real in its consequences.