Published by Georgiana Wilson. Modified over 8 years ago.
Outcomes Evaluation
A good evaluation is:
– useful to its audience
– practical to implement
– conducted ethically
– technically accurate
Evaluation
Outcomes/impact evaluation:
– measurement of changes in attitudes, knowledge, health status, behavior, and nutrition status
6 Steps of Evaluation
– Establish an evaluation plan from the beginning of the program
– Obtain buy-in from administrators
– Allow enough staff time to make evaluation a priority
6 Steps of Evaluation
– Obtain permission from participants and encourage their participation
– Be flexible and creative
6 Steps of Evaluation
– Use a strong research design and measures that generate the data you need to support your program’s goal
Validity
Internal validity
– the extent to which an observed effect can be attributed to a planned intervention
Validity
External validity
– the extent to which an observed impact can be generalized to other settings and populations with similar characteristics
Threats to Validity
– History
– Measurement
– Selection
History
– External events (H)
– Internal programmatic or internal participant events (I)
– Treatment effects (X)
Measurement
Methods used to collect data; instruments need to be:
– reliable
– valid
Selection
Define eligibility to participate in the program:
– criteria
– Is there a difference between those who stay in the program and those who drop out?
Selection
Of those eligible, who accepts and who refuses?
– attended/eligible: 3222/5000 = 64%
Selection
Drop-out rate:
– dropped out/initially said yes: 1600/3222 ≈ 50%
Selection
Lost to follow-up:
– can’t find/completed program: 300/1622 ≈ 18%
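The selection arithmetic on the three slides above can be sketched in a few lines of Python. The counts (5000 eligible, 3222 attended, 1600 dropped out, 300 lost to follow-up) come from the slides; the variable names are mine:

```python
eligible = 5000
attended = 3222                      # of those eligible, accepted and attended
dropped_out = 1600                   # initially said yes, then left the program
completed = attended - dropped_out   # 1622 finished the program
lost_to_follow_up = 300              # completers who could not be found later

participation_rate = attended / eligible             # 3222/5000 ≈ 64%
drop_out_rate = dropped_out / attended               # 1600/3222 ≈ 50%
follow_up_loss_rate = lost_to_follow_up / completed  # 300/1622 ≈ 18%

print(f"participation:     {participation_rate:.0%}")
print(f"drop-out:          {drop_out_rate:.0%}")
print(f"lost to follow-up: {follow_up_loss_rate:.0%}")
```

Each denominator is the group that "survived" the previous stage, which is why tracking these ratios separately helps reveal where selection bias can enter.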
Selection
– Identify contextual or structural variables to decrease selection bias
Regression Effects
– Participants who score high on the pretest have little room for improvement
– The program will appear to have a poor impact
– Is the pretest score a threat to validity?
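A small simulation can make the regression effect concrete. This sketch is hypothetical (the scores are invented, not program data): subjects selected for high pretest scores retest lower on average even when no intervention happens at all, simply because part of their high score was noise.

```python
import random

random.seed(42)

# Each subject has a stable "true ability" plus random noise on each test.
true_ability = [random.gauss(50, 10) for _ in range(10_000)]
pretest = [a + random.gauss(0, 10) for a in true_ability]
posttest = [a + random.gauss(0, 10) for a in true_ability]  # no treatment applied

# Select the 500 subjects who scored highest on the pretest.
top = sorted(range(len(pretest)), key=lambda i: pretest[i], reverse=True)[:500]

pre_mean = sum(pretest[i] for i in top) / len(top)
post_mean = sum(posttest[i] for i in top) / len(top)

# The high scorers' retest mean falls back toward the population mean (50),
# which an evaluation could misread as the program having a poor impact.
print(f"top scorers: pretest mean {pre_mean:.1f}, posttest mean {post_mean:.1f}")
```

The drop appears with no treatment in the picture, which is exactly why a high-pretest group can make a real program look ineffective.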
Synergistic Effects
– The threats can all work together to lower internal validity
Evaluation Designs
– Choose a design to increase internal validity
Evaluation Designs
A design is selected based on:
– objectives of the program
– purpose of the evaluation
– availability of evaluation resources
– type of health & behavior problem, setting, & audience
Evaluation Designs
Notation:
– R – random assignment
– E – intervention group
– C – control group (a true control if formed by random assignment; otherwise a comparison group)
– X – treatment
Evaluation Designs
Notation:
– N – number of subjects
– O – observation to collect data
– T – time
One Group
One group & one time – posttest only
E:  O X O
– Non-experimental
– No random assignment
One Group
– No control/comparison group
One Group
What are some of the main weaknesses of this design for increasing internal validity?
One Group
When could this design be used appropriately?
Nonequivalent Comparison
E:  O X O
C:  O     O
Comparison group:
– any group not formed by random assignment
Nonequivalent Comparison
Which threats to internal validity are lessened?
Baseline Data
Time Series
E:  O O O X O O O
– Shows the pattern of the outcome variable
– Know the stability of the outcome measure
– Collect the outcome variable unobtrusively
Time Series
– Multiple data points increase the power of the design
– Observations at equal intervals
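As an illustration only (the observation values are invented), the O O O X O O O pattern can be summarized by comparing the baseline observations with the post-intervention observations:

```python
# E:  O O O X O O O — three observations before the intervention (X),
# three after, at equal intervals. Values are hypothetical.
baseline = [22.0, 21.5, 22.5]   # O O O before X
post = [26.0, 27.5, 26.5]       # O O O after X

def mean(xs):
    return sum(xs) / len(xs)

baseline_mean = mean(baseline)
post_mean = mean(post)
shift = post_mean - baseline_mean

# A stable baseline (small spread across the pre-X observations) makes
# a shift after X much easier to attribute to the intervention.
print(f"baseline mean {baseline_mean:.1f}, post mean {post_mean:.1f}, shift {shift:+.1f}")
```

With only a single pretest observation the same shift could be noise; the repeated baseline points are what establish the stability of the outcome measure.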
Time Series
– Must still try to control for history, selection, & measurement threats
Time Series
– Much stronger with a control or comparison group
– Better control over the history threat to validity
Time Series with Comparison Group
E:  O O O X O O O
C:  O O O    O O O
True Experimental
R  E:  O X O
R  C:  O     O
– Establish at baseline that the two groups are not significantly different
– Best control of threats to validity
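Random assignment (the R in the notation above) can be sketched in a few lines; the subject IDs here are hypothetical:

```python
import random

random.seed(7)

# Hypothetical pool of 20 eligible subjects.
participants = [f"subject_{i:02d}" for i in range(1, 21)]

random.shuffle(participants)            # R: random assignment
midpoint = len(participants) // 2
intervention = participants[:midpoint]  # E: receives treatment X
control = participants[midpoint:]       # C: true control group

# Every subject lands in exactly one group, and chance alone decides which,
# so the groups should not differ significantly at baseline.
print(len(intervention), len(control))
```

Because group membership is decided by chance, baseline differences between E and C are expected to be small, which is what gives this design its strong control over threats to validity.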
True Experimental
– Advantages?
– Disadvantages?
Post-then-pre
– Appropriate for assessing behavior change
– Useful when participants have limited knowledge at the beginning of the program
Post-then-pre
Example of a typical pretest question:
– Do you include one food rich in vitamin C in your diet daily?
Post-then-pre
– Implement the program (see handout, Table 2)
– After the program, give only a posttest
Post-then-pre
– Q1 asks about the behavior now, as a result of the program
– Q2 asks what the behavior had been before the program (i.e., the pretest question)
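Scoring a post-then-pre item can be sketched as follows. The responses are invented for illustration, coded 1 = yes, 0 = no for the vitamin C question; each respondent answers both Q1 (behavior now) and Q2 (recalled behavior before) on the same posttest form:

```python
# Hypothetical responses to a post-then-pre item.
responses = [
    {"q1_now": 1, "q2_before": 0},
    {"q1_now": 1, "q2_before": 1},
    {"q1_now": 1, "q2_before": 0},
    {"q1_now": 0, "q2_before": 0},
    {"q1_now": 1, "q2_before": 0},
]

n = len(responses)
before = sum(r["q2_before"] for r in responses) / n  # recalled baseline rate (Q2)
after = sum(r["q1_now"] for r in responses) / n      # current rate (Q1)
improved = sum(1 for r in responses
               if r["q1_now"] == 1 and r["q2_before"] == 0)

print(f"before: {before:.0%}, after: {after:.0%}, improved: {improved}/{n}")
```

Since both answers come from one form given after the program, the design sidesteps the problem of participants not knowing enough at baseline to answer the pretest question accurately.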
Post-then-pre
– U of NE handout, data analysis problem 6
– U of NE ETHT report
Success Stories
– Testimonials
– Qualitative information
– Audience testing
Case Study
– The story of an individual
– Can be biased
– Cannot be generalized
More Services Available
– A change in the environment
Evaluation Plans
– Read the Moving to the Future example
– PERT & Gantt charts
Evaluation Plans
Must be in place:
– objectives
– specifications of the intervention & program methods
Evaluation Plans
Must be in place:
– measurement & data collection procedures
– description of methods
Evaluation Plans
Main questions:
– when?
– from whom?
– how?
– by whom?
Evaluation Plans
– Worksheets 1 & 2 (Worthen & Sanders, 1987)
– Fill out the forms as a team
END – Evaluation
Questions?