Beginning the Evaluation Journey with FSG
KCIC Boot Camp | March 24, 2010
www.fsg-impact.org | Boston | Geneva | San Francisco | Seattle

Presentation transcript:

Slide 1: Beginning the Evaluation Journey with FSG
KCIC Boot Camp
March 24, 2010
Prepared for:
Boston | Geneva | San Francisco | Seattle

Slide 2: Today's Discussion
Things You Need to Do to Evaluate Your Project's Progress and Outcomes:
- Articulate your Theory of Change and develop a project Logic Model
- Develop an evaluation plan
- Execute your evaluation plan
- Participate in Knight's evaluation process

Slide 3: Evaluation Overview
The KCIC Evaluation Has Three Overarching Objectives:
1. Evaluate Progress: Track progress toward Knight's two goals:
   - Foundations are engaged in helping meet community information needs
   - Communities are more informed and engaged
2. Inform and Improve Program Strategy: Inform the Knight Foundation's program strategy for the Community Information Challenge and identify opportunities for refinements over time
3. Promote Learning: Create data that promotes shared learning within the KCIC grantee community and the field of place-based foundations, to increase understanding of how community information needs can be addressed effectively

Slide 4: What is Evaluation?
Definitions of evaluation:
- The systematic collection of information based on questions of critical importance to the organization or community:
  - What do you want to know?
  - Why do you want to know it?
  - Where will you find what you need to know?
  - What will it look like?
  - How will you use what you learn?
  - Who are the intended users of the information?
- Used to make judgments, inform decisions, and take action
- An ongoing process; it should not be an "event"
- Best when implemented as a collaborative learning activity that builds an organization's capacity

Slide 5: The Evaluation Process: Planned, Purposeful, Systematic
Design:
1. Articulate Theory of Change & Develop Program Logic Model
2. Focus the Evaluation
3. Determine Design & Data Collection Methods (numbers, words, pictures)
Implementation:
4. Collect Data
5. Analyze Data and Interpret Findings
6. Develop Recommendations and Action Plans
7. Refine Strategy Based on Evaluation Findings
Communicating and reporting happen throughout the cycle.
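Because the process is a repeating cycle rather than a one-time event, it can help to think of it as an ordered loop. The Python below is a purely illustrative sketch, not an FSG tool; the step and phase names come from the slide above, while the EvaluationStep structure and run_cycle helper are hypothetical:

```python
from dataclasses import dataclass
from enum import Enum

class Phase(Enum):
    DESIGN = "Design"
    IMPLEMENTATION = "Implementation"

@dataclass
class EvaluationStep:
    name: str
    phase: Phase

# Step names as listed on the slide, in the reconstructed order.
EVALUATION_CYCLE = [
    EvaluationStep("Articulate Theory of Change & Develop Program Logic Model", Phase.DESIGN),
    EvaluationStep("Focus the Evaluation", Phase.DESIGN),
    EvaluationStep("Determine Design & Data Collection Methods", Phase.DESIGN),
    EvaluationStep("Collect Data", Phase.IMPLEMENTATION),
    EvaluationStep("Analyze Data and Interpret Findings", Phase.IMPLEMENTATION),
    EvaluationStep("Develop Recommendations and Action Plans", Phase.IMPLEMENTATION),
    EvaluationStep("Refine Strategy Based on Evaluation Findings", Phase.IMPLEMENTATION),
]

def run_cycle(cycle: list) -> None:
    """Print the steps in order; in practice the cycle repeats,
    with communicating and reporting happening throughout."""
    for i, step in enumerate(cycle, start=1):
        print(f"{i}. [{step.phase.value}] {step.name}")

run_cycle(EVALUATION_CYCLE)
```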

Slide 6: Uses of Evaluation
How can we use evaluation? Well-designed evaluations will serve multiple purposes:
- Instrumental use: findings lead to changes that can be observed.
- Conceptual use: findings may lead to stakeholders having different kinds of conversations, greater insight into future decisions, and/or greater commitment to the program or initiative.
- Political or symbolic use: evaluation findings may legitimately be used to lobby for the program or initiative, to secure new or additional funding, or to communicate that the evaluation has taken place.

Slide 7: Evaluation Stakeholders
Different stakeholders will have different uses for evaluation.
Internal stakeholders:
- Board members/Trustees
- Senior leaders/Executives
- Program staff
External stakeholders:
- Legislators/Policymakers
- Project partners
- The general public
- The media
- Researchers
- Other funders
- Journalism programs
Identifying stakeholders has implications for:
- which and how many questions are asked
- what indicators will be of interest
- how findings will be communicated and reported
- how results will be used

Slide 8: Example: Envision Bay Area Theory of Change
This Theory of Change illustrates, at a basic level, why SVCF is doing what it's doing.
Assumptions (what the issue is and why we are taking action):
- The community needs a regional plan to address climate change and sustainability
- There is low awareness and understanding among residents of the intricacies and implications of regional development decisions
- Residents are not engaged in the planning process today
Activities (what we are doing to address the issue; how we plan to solve the problem or create the change):
- Creation of SVCF's Venture Fund (with investors)
- Development of a web-based information platform for citizens to learn about potential scenarios
- Planned convenings on land use and sustainable cities
- Awareness campaign (KQED)
- Interactive kiosks
Outcomes (the change we hope to see if we are successful):
- Citizens are more informed and aware of key issues
- Citizens are more engaged in planning processes
- Regional plan is developed with community input
Source: FSG interview; used with grantee's permission
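A theory of change always links the same three elements, so it can be captured in a simple data structure. The sketch below is illustrative only, not an FSG template; the TheoryOfChange class is hypothetical, and its contents are paraphrased from the Envision Bay Area example above:

```python
from dataclasses import dataclass

@dataclass
class TheoryOfChange:
    """Assumptions -> Activities -> Outcomes, as on the slide above."""
    assumptions: list[str]  # what the issue is and why we are taking action
    activities: list[str]   # what we are doing to address the issue
    outcomes: list[str]     # the change we hope to see if we are successful

envision_bay_area = TheoryOfChange(
    assumptions=[
        "The community needs a regional plan for climate change and sustainability",
        "Residents have low awareness of regional development decisions",
        "Residents are not engaged in the planning process today",
    ],
    activities=[
        "Create SVCF's Venture Fund (with investors)",
        "Build a web-based information platform on potential scenarios",
        "Convene sessions on land use and sustainable cities",
        "Run an awareness campaign (KQED) and interactive kiosks",
    ],
    outcomes=[
        "Citizens are more informed and aware of key issues",
        "Citizens are more engaged in planning processes",
        "A regional plan is developed with community input",
    ],
)
```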

Slide 9: Outcomes
As you think about your own KCIC project outcomes, consider the possibilities.
Outcomes are…the SO WHAT? The short- and/or long-term changes, results, and impacts from implementing a project, program, or initiative. Changes may be:
- positive or negative, singular or multiple
- in knowledge, skills, and/or attitudes
- in organizational policies, practices, and capacity
- at the community level, in terms of behavior, values, and attitudes
What will be different in your community if your project is successful?

Slide 10: Types of Outcomes: Short-term Outcomes
- Achieved during the program timeframe
- Within program control
- "Expect to see"
For example:
- New knowledge
- Changed opinions/values
- Increased skills
- Changed motivation/intent
- Changed attitudes
- Changed aspirations
- New relationships/networks
Adapted from: Innovation Network, Inc. (2005)

Slide 11: Types of Outcomes: Intermediate Outcomes
- Achieved at the end of, or beyond, the program timeframe
- Follow shorter-term outcomes
- "Want to see"
For example:
- Modified behavior
- Changed policies
- Changed practices
- Changed social action
- Changed decisions
- Modified structures
Adapted from: Innovation Network, Inc. (2005)

Slide 12: Types of Outcomes: Longer-term Outcomes
- Achieved after the program timeframe
- Outside direct program control
- "Hope to see"
For example:
- Changed human condition
- Changed civic condition
- Changed economic condition
- Changed environmental condition
Source: Innovation Network, Inc. (2005)
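All three outcome types vary along the same dimensions: when they are achieved relative to the program, and how much the program controls them. The sketch below is illustrative only and is not from the FSG or Innovation Network materials; the OutcomeType structure is hypothetical, with its contents taken from slides 10 through 12:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class OutcomeType:
    name: str
    timing: str     # when it is achieved, relative to the program timeframe
    control: str    # the program's degree of control over it
    shorthand: str  # the slides' "expect/want/hope to see" shorthand

OUTCOME_TYPES = [
    OutcomeType("Short-term", "during the program timeframe",
                "within program control", "expect to see"),
    OutcomeType("Intermediate", "at the end of, or beyond, the program timeframe",
                "follows shorter-term outcomes", "want to see"),
    OutcomeType("Longer-term", "after the program timeframe",
                "outside direct program control", "hope to see"),
]

for t in OUTCOME_TYPES:
    print(f'{t.name}: achieved {t.timing}; {t.control}; "{t.shorthand}"')
```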

Slide 13: In Practice: Start Building Your Evaluation Plan by Developing Your Own Theory of Change and Identifying Your Project Outcomes
1. From our interviews with each of you, we have attempted to visually depict your project's theory of change. Review our version of your ToC and change, add to, or revise it as needed.
2. If your project were hugely successful in three years, what would your target audience be doing and thinking? (long-term outcomes)
3. After one year, what would your target audience be doing and thinking? (short-term outcomes)
4. What are some indicators that your project is achieving what you set out to do? (outputs)
See online assignment form:

Slide 14: Next Steps in Designing Your Project's Evaluation Plan
- Developing a Project Logic Model and Key Evaluation Questions (Webinar #1)
- Determining Indicators of Online and Offline Behavior (Webinar #1)
- Choosing an Evaluation Design and Data Collection Methods (Webinar #2)
- Analyzing Quantitative and Qualitative Data (Webinar #2)
- Strategies for Communicating and Reporting Evaluation Findings to Multiple Audiences (Webinar #3)
- Developing an Evaluation Budget (Webinar #3)
- Tips for Hiring an Evaluator (Webinar #3)
Throughout 2010, FSG will provide up to 2 hours of additional evaluation-related technical assistance to each grantee.