Program Evaluation
Evelyn Gonzalez
MAKING A DIFFERENCE: Evaluating Programmatic Efforts
AR Cancer Coalition Summit XIV, March 12, 2013
OBJECTIVES
Overview of evaluation
Defining SMART objectives for your goals
Know how to use different methods of evaluation
Be more willing to evaluate your efforts
WHAT IS PROGRAM EVALUATION?
…the systematic collection of information about the activities, characteristics, and outcomes of programs to make judgments about the program, improve program effectiveness, and/or inform decisions about future programming. (Patton, Utilization-Focused Evaluation, 1997)
WHY EVALUATE?
Did the program/intervention work? Was it worth it?
What worked; what didn't?
Who did we reach?
Did we get our money's worth?
WHEN SHOULD WE BEGIN EVALUATION?
EVALUATION PLAN
An evaluation plan is the "blueprint":
What will be evaluated
What information will be collected
When will it be collected
What will be done with the results
CDC FRAMEWORK FOR EVALUATION
[Diagram: the evaluation cycle across three phases]
Planning phase: establish goals and objectives, establish a baseline, identify an evidence-based program (EBP), and involve the community/audience and stakeholders.
Implementation phase: implement the program, gather data as you go, and monitor.
Evaluation phase: collect and analyze data as you implement and at the end of the program; share results with the community and stakeholders.
GOALS: DEFINITION
The "grand reason" for engaging in your public health effort. Goals typically span three or more years and state the desired end result of the program.
OBJECTIVES: DEFINITION
More specific than goals. They state how the goals will be achieved in a certain timeframe. Well-written objectives are SMART:
Specific
Measurable
Achievable
Realistic and Relevant
Time-framed
S.M.A.R.T.
Specific
Who are you reaching (priority audience)?
What intervention will you use?
Where (setting)?
S.M.A.R.T.
Measurable
Dose: how many times will you deliver the intervention?
What is the expected outcome?
An increase of X% following the intervention
A decrease in smoking of X%
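As a quick illustration of how an "increase of X%" target can be checked once data are in, here is a minimal Python sketch; the counts and the target value are hypothetical placeholders, not program data.

```python
# Minimal sketch: checking an "increase of X%" objective against data.
# All numbers here are hypothetical placeholders, not program data.
baseline_count = 400   # e.g., people screened in the year before the intervention
followup_count = 432   # e.g., people screened in the year after

percent_change = (followup_count - baseline_count) / baseline_count * 100
target_percent = 5.0   # the X% named in the SMART objective

print(f"Observed change: {percent_change:+.1f}% (target: +{target_percent}%)")
print("Objective met" if percent_change >= target_percent else "Objective not met")
```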
S.M.A.R.T.
Achievable
Is your intervention feasible?
Realistic and Relevant
Does the objective match the goal?
Is it an evidence-based program (EBP)?
S.M.A.R.T.
Time-framed
By when do you anticipate the change?
End of the session
3, 6, or 9 months
5 years
SMART OBJECTIVE EXERCISE
You are working on an intervention that will increase awareness about breast cancer risk.
Objective 1: Participants will be aware of the major risk factors for developing breast cancer.
How can this be re-written to be SMART?
SMART OBJECTIVE EXERCISE
Original: Participants will be aware of the major risk factors for developing breast cancer.
SMART objective: On a post-test following the intervention, participants will be able to identify three major risk factors for developing breast cancer.
RE-WRITTEN
Original: This program will increase screening for colorectal cancer in Arkansas.
SMART: Colorectal cancer screening among age-appropriate males in Arkansas will increase by 5% over the prior year.
GOAL: PROMOTE AND INCREASE THE APPROPRIATE UTILIZATION OF HIGH-QUALITY BREAST CANCER SCREENING
Objective 1: Public education for breast cancer screening. Increase knowledge and improve attitudes of all women with regard to the importance of breast cancer screening.
Strategy 1: Promote campaigns to educate the public about the importance of mammography.
Action 1: Increase awareness among all women 40 and older of the importance of regular breast cancer screening.
THE EVALUATION PROCEDURE
Planning: develop the questions, consult with the program stakeholders or resources, make a timeline.
Data collection: pilot testing. How will the questions be asked? Who will ask them?
Data analysis: who will analyze the data, and how?
Reporting: who will report, and how? Who will receive the data, and when? How will it affect the program?
Application: how could your results be applied in other places?
PLANNING FOR EVALUATION
Look at the evaluation methods used in the original EBP. When discussing evaluation, think about these questions:
What is important to know?
What do you need to know versus what is nice to know?
What will be measured, and how?
How will this information be used?
SOME DEFINITIONS…
Indicators or measures are the observable and measurable data used to track a program's progress in achieving its goals.
Monitoring (program or outcome monitoring, for example) refers to ongoing measurement activity.
PROCESS EVALUATION
Process evaluation can find problems early on in the program. It includes an assessment of the staff, a budget review, and how well the program is doing overall. For this kind of evaluation, it may be useful to keep a log sheet to record each of your activities. (From Windsor et al., 1994)
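The log sheet mentioned above could be as simple as a CSV file that staff append to after each activity. Below is a minimal Python sketch; the file name and column names are illustrative assumptions, not a prescribed format.

```python
# Minimal sketch of an activity log sheet kept as a CSV file.
# The file name and column names are assumptions for illustration.
import csv
import os
from datetime import date

LOG_FILE = "activity_log.csv"  # hypothetical file name
FIELDS = ["date", "activity", "staff", "attendance", "notes"]

def log_activity(activity, staff, attendance, notes=""):
    """Append one program activity to the log sheet."""
    write_header = not os.path.exists(LOG_FILE)
    with open(LOG_FILE, "a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if write_header:
            writer.writeheader()
        writer.writerow({
            "date": date.today().isoformat(),
            "activity": activity,
            "staff": staff,
            "attendance": attendance,
            "notes": notes,
        })

log_activity("Breast cancer education session", "E. Gonzalez", 23,
             "Pre/post quiz administered")
```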
IMPACT EVALUATION
Impact evaluation can tell if the program has a short-term effect on the behavior, knowledge, and attitudes of your population. It also measures the extent to which you have met your objectives. (From Green and Kreuter, 1991)
OUTCOME EVALUATION
Outcome evaluation looks to see if the long-term program goals were met. These goals could be changes in rates of illness or death, as well as in the health status of your population. (From McKenzie & Smeltzer, 1997)
APPLICATION TO YOUR PROGRAM
Identify program goals. For each goal:
Identify process objectives
Identify outcome objectives
For each objective:
Identify indicators
Identify data sources
Plan data collection
Plan data analysis
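One way to keep this goal-objective-indicator structure organized is to record it as structured data. The Python sketch below is illustrative only; every field name and value is a hypothetical example, not a prescribed schema.

```python
# Illustrative sketch of an evaluation plan captured as structured data.
# All field names and values are hypothetical examples.
evaluation_plan = {
    "goal": "Promote and increase appropriate utilization of "
            "high-quality breast cancer screening",
    "objectives": [
        {
            "type": "outcome",
            "statement": "Participants identify 3 major breast cancer "
                         "risk factors on the post-test",
            "indicator": "% of post-tests listing at least 3 risk factors",
            "data_source": "pre/post questionnaires",
            "data_collection": "administered at each education session",
            "data_analysis": "compare pre vs. post scores per session",
        },
    ],
}

for obj in evaluation_plan["objectives"]:
    print(f"{obj['type']}: {obj['statement']} -> {obj['indicator']}")
```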
DATA COLLECTION METHODS
Surveys
Interviews
Focus groups
Observation
Document review
PRE- AND POST-EVALUATION
You may develop a way to compare the baseline data from the needs assessment with the final outcomes of your program, for example a pre/post survey in an education session. This will let you see whether you have achieved your objectives.
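As a concrete illustration of the pre/post comparison, here is a minimal Python sketch that runs a paired t-test on matched scores from the same participants. The scores are made-up placeholders, and a paired t-test is just one common choice of analysis, not the only one.

```python
# Minimal sketch: comparing pre/post quiz scores from one education session.
# The scores below are made-up placeholders, not real participant data.
from scipy import stats

pre_scores  = [4, 5, 3, 6, 4, 5, 2, 4]   # each participant's score before
post_scores = [7, 8, 5, 8, 6, 7, 5, 6]   # the same participant's score after

mean_change = (sum(post_scores) - sum(pre_scores)) / len(pre_scores)
t_stat, p_value = stats.ttest_rel(post_scores, pre_scores)

print(f"Mean change: {mean_change:+.2f} points")
print(f"Paired t-test: t = {t_stat:.2f}, p = {p_value:.4f}")
```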
INFORMATION COLLECTION
Primary sources:
Quantitative: surveys/questionnaires
Qualitative: focus groups, public meetings, direct observation
Qualitative: in-depth interviews with community leaders, interviews with other program planners
STRATEGIES
Will depend on which EBP/intervention is selected. Answer these questions:
What specific behaviors do I want my audience to acquire or enhance?
What information or skills do they need to learn to act in a new way?
What resources do I need to carry out the program?
What methods would best help me meet my objectives?
USING MIXED DATA SOURCES/METHODS
Involves using more than one data source and/or data collection method.
PROGRAM OBJECTIVES AND EVALUATION
Your objectives should be measurable so that they can be evaluated, and the evaluation should be in line with your objectives. Try not to make up new things to evaluate.
PILOT TESTING
You may want to do a pilot test in order to evaluate the effect of your program. A pilot test is a practice run with a small group similar to your target audience.
REPLICATING THE EVALUATION
Evidence-based programs have already done some type of evaluation. Look to see how the program was evaluated before. Try to use the same methods. You do not have to evaluate everything!
MONITORING PROGRESS
NOW THAT YOU'VE COLLECTED THE DATA, WHAT DO YOU DO WITH IT?
Analyzing the data: who, when, and how
Interpretation of results and sharing findings
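For quantitative data, a first analysis pass is often simple descriptive statistics. The sketch below assumes survey responses were exported to a CSV file; the file name and the "site" and "post_score" column names are hypothetical.

```python
# Minimal sketch of a first descriptive pass over collected survey data.
# The file name and column names are hypothetical assumptions.
import pandas as pd

df = pd.read_csv("survey_responses.csv")        # e.g., exported survey data

print(df["post_score"].describe())              # overall score distribution
print(df.groupby("site")["post_score"].mean())  # compare sites/sessions
```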
SO WHAT?
Must be able to answer this!
Do not just look for the good outcomes
Learn from what didn't work
Share both the positive and negative outcomes
DEVELOPING RECOMMENDATIONS
Your evaluation's recommendations should be:
Linked with the original goals/SMART objectives
Based on answers to your evaluation questions
Informed by stakeholder input
Tailored to the end users of the evaluation results, to increase ownership and motivation to act
SHARING RECOMMENDATIONS
Community: executive summary, final report, newsletter article(s), website article, town hall meeting(s), radio interviews, local newspapers
Institution & yourself: executive summary, final report, journal articles, professional conferences, poster sessions, meetings with colleagues
TIPS & CONSIDERATIONS
Consult with partners who have evaluation experience
Budget 10-15% for evaluation: staffing, building a database, analysis
Consider pilot testing your program
Pilot test your evaluation method & tool(s)
"Trust yourself. You know more than you think you do." (Benjamin Spock)