Evaluating Collaboration
National Extension Family Life Specialists Conference, April 28, 2005
Ellen Taylor-Powell, Ph.D., Evaluation Specialist, University of Wisconsin-Extension

Types of evaluation questions
- The outcomes are broad and complex. How do we get started?
- Is evaluating process good enough, or do we have to evaluate outcomes?
- Who should be involved in evaluating a collaborative program?
- I'm not in charge. How do I evaluate it?
- How do I take credit for something that we've done together?

Issues and challenges
- Power and control
- Process of the evaluation
- Data
- Standards and quality of the evaluation
- Cross-cultural issues
- Measurement issues
- Attribution
- Taking credit

Collaborative evaluation (not evaluation of collaboration)
- Since the mid-1970s, a new paradigm of participatory evaluation
- "Applied social research that involves trained evaluation personnel…and practice-based decision makers working in partnership" (Cousins and Earl, 1992)
- Multiple approaches, from broadening decision making (practical) to emancipation and social change (transformative)
- Emphasis on using data collection and feedback to strengthen and monitor collaboration, and thus increase overall effectiveness and efficiency
- Value in the process of evaluation, process use (Patton, 1997), as much as in the product

Who controls? Who participates? How much?
Three dimensions of participation, adapted from Cousins and Whitmore, 1998:
- Control of the evaluation process: from researcher control to practitioner/participant control
- Stakeholder selection: from primary users only to all legitimate groups
- Depth of participation: from consultation to deep participation

First…
- Who wants to know what?
- For what purpose?
- How will the information be used?

Building a logic model of collaboration
[Diagram: SITUATION → INPUTS → OUTPUTS → OUTCOMES, resting on assumptions and subject to external factors, with EVALUATION spanning the collaborative relationship, the collaborative product, and collaborative effectiveness.]

Collaboration: Theory of change. WHAT DO YOU WANT TO KNOW?
[Diagram: Inputs (partners, funding, research base, key stakeholders) support collaborative relationship building among individual members and the group, producing changes in knowledge, attitudes, skills (KAS), self-efficacy, intent, behaviors, and decision making; an effectively functioning partnership; and member satisfaction. The collaboration implements its action plan, monitors and evaluates, communicates, engages in advocacy/policy, and builds capacity (TA). Reaching clientele, users, policy makers, and publics, this leads to changes in knowledge, attitudes, skills, motivation, intent, and self-efficacy; changes in behaviors; policy changes; value added; system changes; community changes; and changes in conditions.]

Evaluating the Collaborative Relationship
1. Process evaluation
- How is it functioning? How effective is the group work? Are we likely to achieve our desired results?
- How satisfied are members?
- Questions about capacities, operations, climate, context
- Factors influencing success
- Projected tasks/activities relative to stages of development
- Milestones and critical events (the journey)

MILESTONES
Significant points along the way. Examples:
- Key stakeholders on board
- Vision statement established
- Grant secured
- Action plan formulated (plan of work)
- Project implemented/service provided
- Project evaluated

CRITICAL EVENTS
Unexpected events, positive and negative; progress markers; evidence of accomplishments; disruptions or obstacles. Examples:
- Change in membership
- Policy change
- New donor added

2. Outcomes (process outcomes)
- What difference has being a part of this group made for the individual? Knowledge, skills, motivations, behaviors, etc.; human capital development
- What difference is there for the group? Group functioning, identity, resource pooling, etc.
Note: Outcomes can be positive, negative, or neutral.

Methods
- Informal feedback
- Member (partner) surveys
- Member (partner) interviews
- Group discussions
- Key informant interviews
- Observation
- Identification and use of indicators
- Network analysis; sociogram (see the sketch after this list)
- Use of existing materials, integrated into ongoing operations: minutes of meetings; logs (telephone, event, registration forms); management charts
WHEN? Periodic reviews; points of particular concern
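To make the network analysis/sociogram method above concrete, here is a minimal sketch in Python using the networkx library, assuming a simple partner-nomination survey. All partner names and nominations below are hypothetical, invented for illustration:

# Minimal sociogram sketch from hypothetical partner-nomination survey data.
import networkx as nx

# Each partner lists the partners they report working with most often
# (invented example data, not from the presentation).
nominations = {
    "Extension": ["Health Dept", "School District"],
    "Health Dept": ["Extension", "Food Bank"],
    "School District": ["Extension"],
    "Food Bank": ["Health Dept"],
}

# Build a directed graph: an edge A -> B means A nominated B.
G = nx.DiGraph()
for partner, named in nominations.items():
    for other in named:
        G.add_edge(partner, other)

# Degree centrality suggests which partners sit at the hub of the collaboration.
for partner, score in sorted(nx.degree_centrality(G).items(), key=lambda x: -x[1]):
    print(f"{partner}: {score:.2f}")

Drawing the graph (for example with nx.draw) yields the sociogram itself; the centrality scores give a quick numeric summary of who is most connected.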

Tools and techniques
- Community group member survey
- Collaborative relationship scales
- Internal collaborative functioning scales
- Plan quality index
- Meeting effectiveness inventories
- Stage of readiness
- Online Wilder Collaboration Factors Inventory (Amherst H. Wilder Foundation); a generic scoring sketch follows this list
- Online partnership self-assessment tool (Center for the Advancement of Collaborative Strategies in Health)
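Instruments like the two online tools above typically ask members to rate Likert-type statements that are then averaged by factor. The sketch below shows that generic scoring logic only; the factor names, responses, and cutoffs are invented, and this is not the official scoring of the Wilder inventory or the CACSH tool:

# Generic Likert-item scoring sketch (invented factors, data, and thresholds;
# not the official scoring of any instrument named above).
from statistics import mean

# Responses on a 1-5 scale, grouped by factor.
responses = {
    "Mutual respect and trust": [4, 5, 3, 4],
    "Shared vision": [3, 3, 2, 4],
}

for factor, scores in responses.items():
    avg = mean(scores)
    flag = "strength" if avg >= 4.0 else ("concern" if avg <= 3.0 else "borderline")
    print(f"{factor}: {avg:.1f} ({flag})")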

Evaluating programs/products created/implemented by the collaboration
1. Process or implementation evaluation
(Focus: program delivery vs. coordination or support role)
- How is the program being implemented? Fidelity to plan? Extent of delivery? Participation?
- What is/has happened that wasn't planned?

2. Outcome evaluation
- What is different? For whom? How? To what extent?
- For: individuals, groups/families, agencies, systems, communities
- Changes in …

Change in:
- Individuals: attitudes, perceptions, knowledge, competence, skills, abilities, behaviors, actions, lifestyles
- Groups/families: interactions, behaviors, actions, values, culture
- Agency/organization: number and type of services/programs delivered, access, practices, resource generation, resource use, policies
- Systems: relationships, interaction patterns, linkages, networks, practices, policies, resource use, institutionalization of changes
- Communities: values, attitudes, relations, support systems, civic action, social norms, policies, laws, practices, conditions

Tools and techniques
- Monitoring implementation: logs, management charts, interviews, observations
- Achievement of outcomes: clientele surveys, clientele interviews, observations
→ Mixed methods

Evaluating self: Taking credit
- Mutual (reciprocal) accountability
- How do I take credit for my part? How does Extension gain visibility and recognition?
- What is your contribution? What role did you play? What value did you bring?
- Document the role you play, your activities and contributions, the inputs you bring, the resources you make available, your niche, your value…

Your contribution
- Log of activities and roles played
- Record of inputs and resources contributed
- Management chart; analysis of minutes
- Independent assessment: surveys, interviews

Your (partner) performance
- Most important indicator: other partners' satisfaction with your performance (Brinkerhoff, 2002)
- Mutual assessment among partners of each partner's performance; the resulting discussion of discrepancies yields powerful information sharing and trust building (a small illustration follows)
- (We aren't very good at this type of thing.)
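As a rough illustration of the mutual-assessment idea, the sketch below compares each partner's self-rating with the average rating the other partners give them; large gaps flag the discrepancies worth discussing. All partner names and ratings are hypothetical:

# Mutual-assessment discrepancy sketch (hypothetical partners and ratings).
from statistics import mean

# ratings[rater][ratee] = performance rating on a 1-5 scale.
ratings = {
    "Extension": {"Extension": 5, "Health Dept": 3, "Food Bank": 4},
    "Health Dept": {"Extension": 3, "Health Dept": 4, "Food Bank": 4},
    "Food Bank": {"Extension": 2, "Health Dept": 4, "Food Bank": 5},
}

for partner in ratings:
    self_view = ratings[partner][partner]
    # Average of what everyone else says about this partner.
    peer_view = mean(r[partner] for rater, r in ratings.items() if rater != partner)
    print(f"{partner}: self {self_view}, peers {peer_view:.1f}, gap {self_view - peer_view:+.1f}")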

Web address