How to Focus and Measure Outcomes
Katherine Webb-Martinez
Central Coast and South Region All Staff Conference
April 23, 2008

Workshop Desired Outcomes
- An understanding of how basic logic model techniques help define outcomes and focus program evaluation
- An understanding of options for data collection methods to measure outcomes

Workshop Agenda
Introductions
Part 1: Logic Model
- Presentation – Logic model helps with outcomes
- Activity – "Using a logic model to focus outcomes evaluation"
Part 2: Measuring Outcomes
- Presentation – Review of data collection options
- Activity – Methods Café
Close

Part 1: Logic Model as Road Map
Where are you going? How will you get there? What will show that you've arrived?
"If you don't know where you are going, how are you gonna know when you get there?" (Yogi Berra)

Everyday example (Source: University of Wisconsin-Extension, Program Development and Evaluation)
Hungry → Get food → Eat food → Feel better

A logic model is a logical chain of connections showing what the program is to accomplish:
- INPUTS (what we invest): program investments
- OUTPUTS (what we do, who we reach): activities, participation
- OUTCOMES (what results): learning (short term), action (medium term), impacts (long term)
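
As an illustration beyond the original slides, the same chain can be sketched as a simple data structure. The Python below is a minimal, hypothetical sketch; every field name and entry is an assumption for illustration, not taken from the workshop materials.

# A minimal, hypothetical sketch of a logic model as a plain data structure.
# Every field name and entry is illustrative, not from the workshop materials.
logic_model = {
    "inputs": ["staff time", "money", "partners", "research base"],   # what we invest
    "outputs": {
        "activities": ["workshops", "support groups"],                # what we do
        "participation": ["target audience reached"],                 # who we reach
    },
    "outcomes": {                                                     # what results
        "short_term": "learning: changes in knowledge, attitudes, skills",
        "medium_term": "action: changes in behavior and practice",
        "long_term": "impacts: social, economic, environmental conditions",
    },
}
# Read the chain as a series of if-then statements.
print("IF we invest:", ", ".join(logic_model["inputs"]))
print("AND we deliver:", ", ".join(logic_model["outputs"]["activities"]))
print("THEN short term:", logic_model["outcomes"]["short_term"])
print("THEN medium term:", logic_model["outcomes"]["medium_term"])
print("THEN long term:", logic_model["outcomes"]["long_term"])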

Outputs vs. outcomes
Example: the number of patients discharged from a state mental hospital is an output; the percentage of discharged patients who are capable of living independently is an outcome.
"Not how many worms the bird feeds its young, but how well the fledgling flies." (United Way of America, 1999)

What do you want to know? (Source: Bennett and Rockwell, 1995, Targeting Outcomes of Programs)
- Participation: number and characteristics of people reached; frequency and intensity of contact
- Reactions: degree of satisfaction with program; level of interest; feelings toward activities, educational methods
- Learning: changes in knowledge, attitudes, skills, aspirations
- Actions: changes in behaviors and practices
- Social-economic-environmental impacts

- Needs assessment: What are the characteristics, needs, priorities of the target population? What are potential barriers/facilitators? What is most appropriate to do?
- Process evaluation: How is the program implemented? Are activities delivered as intended? Are participants being reached as intended? What are participant reactions?
- Outcome evaluation: To what extent are desired changes occurring? Who is benefiting / not benefiting? How? What are unintended outcomes?
- Impact evaluation: What are the net effects? What are the final consequences? To what extent can changes be attributed to the program?

ACTIVITY: PARENT EDUCATION PROGRAM
What do you (and others) want to know about the program?
- INPUTS: staff, money, partners, research
- OUTPUTS (activities): assess parent ed programs; design and deliver evidence-based program of 8 sessions; facilitate support groups
- OUTPUTS (participation): parents of 3-10 year olds attend
- OUTCOMES (short term): parents increase knowledge of child development; better understand their own parenting style; gain skills in new ways to parent; identify appropriate actions to take
- OUTCOMES (medium term): parents use effective parenting practices; gain confidence in their abilities
- OUTCOMES (long term): improved child-parent relations; reduced stress; strong families

Logic Model Reflection
- How does a logic model help focus on what outcomes to measure and when to measure them?
- How might you use a logic model in your own work?

Part 2: Measuring Outcomes
How will you answer your questions?
1. Decide on what evidence answers the questions
2. Determine sources of information
3. Choose data collection methods

Identify evidence
How will you know it when you see it? What are the specific indicators that will be measured?
- Often expressed as # or %
- Outputs and outcomes indicators
- Quantitative or qualitative

Logic model with indicators for outputs and outcomes (farming example)
- Program implemented (output): number of workshops held; quality of workshops
- Targeted farmers (output): number and percent of farmers attending
- Farmers learn (outcome): number and percent who increase knowledge
- Farmers practice new techniques (outcome): number and percent who practice new techniques
- Farm profitability increases (outcome): number and percent reporting increased profits; amount of increase
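
To make the "number and percent" indicators concrete, here is a small Python sketch that is not part of the original workshop; the participant names and test scores are invented. It shows how "number and percent who increase knowledge" could be computed from hypothetical pre/post test results.

# Hypothetical pre/post knowledge test scores; all names and values are made up.
pre_scores  = {"farmer_01": 55, "farmer_02": 70, "farmer_03": 62, "farmer_04": 80}
post_scores = {"farmer_01": 72, "farmer_02": 68, "farmer_03": 75, "farmer_04": 90}
attended = len(post_scores)                                   # output indicator: number attending
gained = [f for f in post_scores if post_scores[f] > pre_scores[f]]
number_increased = len(gained)                                # outcome indicator: number who increased knowledge
percent_increased = 100 * number_increased / attended         # outcome indicator: percent who increased knowledge
print(f"{attended} farmers attended (output indicator)")
print(f"{number_increased} of {attended} ({percent_increased:.0f}%) increased knowledge (outcome indicator)")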

Remember: "Not everything that counts can be counted."
Example: "9 kids" (quantity) vs. "happy kids" (quality)
- Quantitative: numbers, breadth, generalizability
- Qualitative: words, depth, specifics

Possible evaluation questions and indicators for the parent education program:

INPUTS (staff, money, partners, research)
- Evaluation questions: What amount of $ and time were invested?
- Indicators: # staff; $ used; # partners

OUTPUTS – activities (develop parent ed curriculum; deliver series of 8 interactive sessions; facilitate support groups)
- Evaluation questions: How many sessions were held? How effectively? #, quality of support groups?
- Indicators: # sessions held; quality criteria

OUTPUTS – participation (parents of 3-10 year olds)
- Evaluation questions: Who/how many attended/did not attend? Did they attend all sessions? Support groups? Were they satisfied – why/why not?
- Indicators: #, % attended per session; certificate of completion

OUTCOMES – short term (parents increase knowledge of child development; better understand their own parenting style; gain skills in new ways to parent; identify appropriate actions to take)
- Evaluation questions: To what extent did knowledge and skills increase? For whom? Why? What else happened?
- Indicators: #, % demonstrating increased knowledge/skill; additional outcomes

OUTCOMES – medium term (parents use effective parenting practices; gain confidence in their abilities)
- Evaluation questions: To what extent did behaviors change? For whom? Why? What else happened?
- Indicators: #, % demonstrating changes; types of changes

OUTCOMES – long term (improved child-parent relations; reduced stress; strong families)
- Evaluation questions: To what extent is stress reduced? To what extent are relations improved?
- Indicators: #, % demonstrating improvements; types of improvements

Determine Sources of Information
- Program participants
- Existing data
- Program records, attendance logs, etc.
- Pictures, charts, maps, pictorial records
- Others/non-participants
- Key informants
- Funders
- Collaborators
- Etc.

Decide on Data Collection Methods
- Survey
- Interview
- Test
- Observation
- Group techniques
- Portfolio review
- Diaries, journals
- Case study
- Photography, video
- Document review
- Expert or peer review

Data collection plan template
Columns: Questions | Indicators | Data collection: sources, methods, sample, timing
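
Purely as an illustration of the template, one row might be captured as structured data as in the Python sketch below. The example values echo the parent education program from the earlier slides, but the field names and entries are assumptions, not part of the original template.

# One hypothetical row of the data collection plan, mirroring the template columns.
# All field names and entries are illustrative assumptions.
plan_row = {
    "question":   "To what extent did parents' knowledge and skills increase?",
    "indicators": "# and % of parents demonstrating increased knowledge/skill",
    "sources":    "Program participants",
    "methods":    "Pre/post survey; facilitator observation",
    "sample":     "All parents attending the 8-session series",
    "timing":     "First and last session",
}
# Print the row under the template's column headings.
for column, entry in plan_row.items():
    print(f"{column.capitalize():<12}{entry}")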

When choosing your methods consider:
- Purpose
- Participants
- Resources available

Some things to remember…
- There is no one right method of collecting data.
- Each has a purpose, advantages, and challenges.
- The goal is to obtain trustworthy, authentic, and credible evidence.
- Often a mix of methods is preferred.

Are the data reliable and valid?
- Validity: Are you measuring what you think you're measuring?
- Reliability: If something were measured again using the same instrument, would it produce the same (or nearly the same) results?
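
As a hedged sketch of one common way to quantify reliability (test-retest), the Python below computes the correlation between two administrations of the same instrument. The scores are invented for illustration; a coefficient near 1 would suggest the instrument produces consistent results.

import numpy as np

# Hypothetical scores from the same instrument given to the same respondents twice.
first_administration  = np.array([12, 18, 15, 22, 9, 17])
second_administration = np.array([13, 17, 16, 21, 10, 18])
# Test-retest reliability estimated as the Pearson correlation between the two rounds.
reliability = np.corrcoef(first_administration, second_administration)[0, 1]
print(f"Test-retest reliability (Pearson r): {reliability:.2f}")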

Logic model and reporting

Methods Café
- What approaches to measuring outcomes work well for CE programs, and why?
- Is there anything you've tried that you would not recommend?

Methods Reflection
- Given what we have discussed, what might you change or do differently with the methods that work for you?
- Are there any other new methods that might work for you and your clientele?