David Merves, Evergreen Evaluation and Consulting: Using Social Network Analysis to Understand and Improve Collaboration Among Centers, Projects and Initiatives


David Merves, Evergreen Evaluation and Consulting
Using Social Network Analysis to Understand and Improve Collaboration Among Centers, Projects and Initiatives

Evaluation Context
- Michigan's Integrated Improvement Initiatives and the Center for Educational Networking
- Evaluation approach and goals
- Decision to use Social Network Analysis (SNA)


EVALUATION METHODOLOGY: A UTILIZATION-FOCUSED APPROACH
- Identify and engage primary evaluation users
- Review and analyze findings
- Interpret findings
- Make value judgments
- Develop recommendations

EVALUATION METHODOLOGY: A UTILIZATION-FOCUSED APPROACH (continued)
- Communicate findings and recommendations to key audiences
- Facilitate learning and use of evaluation findings and recommendations
- Commit resources to the process
- Monitor your process and your progress

Gates Foundation: Actionable Measurement
- We hold ourselves accountable for what we do and how we do it by measuring the inputs, activities, and outputs of our own work and that of our investments.
- We contribute to accomplishing shared goals by measuring outcomes and impact, sharing our results, and collaborating with partners to understand what works and why in the populations we serve.

The Inflection Curve
[chart: a trajectory reaches an inflection point, then either rises to new heights or declines]

Social Network Analysis
- Researchers have found that the network perspective gives formal definition to social structure and patterns of relationships
- Illustrative diagrams (sociograms) show betweenness, connectivity, and centrality
- Useful for formative evaluation: facilitates the use of data in decision making
- A multi-method approach that includes SNA provides powerful outcome measures for summative evaluation
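Two of the simplest measures behind those sociograms can be computed directly from an adjacency list. This is a minimal sketch in plain Python; the agency names and ties are hypothetical illustrations, not the evaluation's actual data.

```python
# Minimal sketch of two common SNA metrics on an undirected collaboration
# network, using a plain adjacency dict (no external libraries).

def degree_centrality(graph):
    """Fraction of the other actors each actor is directly tied to."""
    n = len(graph)
    return {node: len(neighbors) / (n - 1) for node, neighbors in graph.items()}

def density(graph):
    """Observed undirected ties divided by all possible ties."""
    n = len(graph)
    edges = sum(len(neighbors) for neighbors in graph.values()) / 2
    return 2 * edges / (n * (n - 1))

# Hypothetical inter-agency network: one hub agency tied to three others.
network = {
    "CEN": {"MI3", "OSE-EIS", "Mentors"},
    "MI3": {"CEN"},
    "OSE-EIS": {"CEN"},
    "Mentors": {"CEN"},
}

print(degree_centrality(network))  # the hub scores 1.0; the others 1/3
print(density(network))            # 0.5
```

Betweenness centrality, also named on the slide, is more involved to compute by hand; libraries such as NetworkX provide it alongside these measures.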

Hogue et al. Adapted Measure of Collaboration

Collaborative Relationships: A Balancing Act
[diagram labels: EEC, MI3, OSE-EIS, CEN, Initiatives, SNA Mentors, JVS]

The Power of Collaboration (Source: Aha! Jokes)


Inherent Tensions
- Divergent goals and values
- Priority setting (and scheduling)
- Contextual pressures

Advantages of SNA
- Participatory process gives a voice to all stakeholders
- SNA mentors guide the process and help frame results
- Network maps promote insight and discussion
- Visual representation of findings can validate participant experiences

Social Network Analysis
- First used in 1934 by Jacob Moreno in New York City schools; the New York Times called it "psychological geography"
- Applications include money flow between organizations, how people obtain employment, transmission of infectious diseases, and decision making
- Using SNA to measure strengths in community collaboration offers several advantages


Parent Involvement Example

Parent Involvement Example (continued)

Inter-Agency Linkages: Pre-Grant

Inter-Agency Linkages: Year 1

Inter-Agency Linkages: Year 2

Inter-Agency Linkages: Year 3
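One way to summarize a sociogram sequence like the pre-grant through Year 3 maps for summative reporting is a single density figure per wave. A minimal sketch, assuming a hypothetical eight-agency network; the edge counts below are illustrative placeholders, not the Michigan evaluation's data.

```python
# Hypothetical year-over-year summary of sociograms as network density.
# The node count (8 agencies) and edge counts are illustrative assumptions.

def density(n_nodes: int, n_edges: int) -> float:
    """Share of all possible undirected ties that are actually present."""
    return 2 * n_edges / (n_nodes * (n_nodes - 1))

waves = {"Pre-Grant": 4, "Year 1": 7, "Year 2": 10, "Year 3": 14}
for wave, edges in waves.items():
    print(f"{wave}: density = {density(8, edges):.2f}")
```

A rising density across waves is one concrete outcome measure of growing inter-agency collaboration.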

Challenges with SNA
- Can be time-consuming and resource-intensive
- SNA is a complicated method: balance the time devoted to technical detail against available resources
- Choosing a data collection tool and method

Lessons Learned
- Determine a feasible level of participatory work
- Investing time to reach consensus on measurement is a double-edged sword
- Use mentors to add value to your process and products

David Merves, Evergreen Evaluation and Consulting
THANK YOU