
1 Comprehensive Evaluation: Concept & Design Analysis, Process Evaluation, Outcome Assessment

2 Part 3: Objectives
Participants will be able to:
- Distinguish outcome evaluation from the other evaluation levels: 1) concept and design evaluation, 2) process evaluation, 3) outcome evaluation, 4) economic evaluation
- Match evaluation questions to appropriate standards of comparison

3 Sources of Failure: the full causal chain from good idea to population impact
[Flow diagram] Good idea → program theory → implementation (set in motion) → measurement/power (sufficient power, credible outcome measures) → study outcome (sufficient effect) → generalizability (likely to succeed in similar populations & settings) → application (applied widely in similar populations & settings) → population outcome → population impact!

4 Potential Sources of Failure in Program Evaluations: measurement/power failure
[Flow diagram] Good idea → program theory → implementation (set in motion with good reach, dose, and fidelity) → measurement/power (evaluated with poor measures, low power, very weak design) → study outcome: inaccurate results, i.e., a false negative or false positive
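
Low statistical power is one of the measurement failures above, and it is cheap to check in advance. Below is a minimal sketch of such a check, assuming a two-arm comparison of means and an illustrative effect size (Cohen's d = 0.2), using statsmodels' power calculator; the numbers are assumptions, not values from the slides.

```python
# Sketch: power and sample-size check before an outcome evaluation.
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()

# Hypothetical small program effect (d = 0.2) evaluated with 50 people per arm:
power = analysis.power(effect_size=0.2, nobs1=50, ratio=1.0, alpha=0.05)
print(f"Power with n=50 per group: {power:.2f}")  # well under 0.80 -> high false-negative risk

# Sample size per group needed to reach 80% power for the same effect:
n_needed = analysis.solve_power(effect_size=0.2, power=0.80, alpha=0.05)
print(f"n per group for 80% power: {n_needed:.0f}")
```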

5 Sources of Failure: generalizability failure
[Flow diagram] Good idea → program theory → implementation (set in motion) → measurement/power (sensitive measures, sufficient power) → study outcome (sufficient effect) → generalizability (but unlikely to succeed in similar populations & settings) → application (applied to populations in need) → population outcome: no population impact

6 Outcome Evaluation (levels: Concept & Design → Process → Outcome)
- Is the program achieving the intended outcomes?
- Is the program the real cause of the observed outcomes?
- Does the program cause harm? Does it have positive “side effects”?
- To which populations and settings can the outcomes of this evaluation be generalized?

7 Identifying Program Outcomes
- Use the logic model to consider long-term, intermediate, and short-term outcomes
- You may not need to measure long-term outcomes:
  - if you are in a formative phase and do not yet have evidence that you can achieve short-term outcomes, or
  - if the epidemiologic evidence provides a strong link between intermediate and long-term outcomes and the intermediate outcome measure has good validity
- Rule of thumb: whatever outcome you decide to measure, also measure the more short-term outcomes to capture the causal chain

8 Falls and the Elderly: logic model linking activities (intervention hypothesis) to shorter-term, short-term, intermediate, and long-term outcomes (causal hypotheses)

Activity: Meds session w/ PharmD
  Shorter-term: coordination of meds & other drugs
  Short-term: fewer meds w/ side effects or interactions with other meds
  Intermediate: less dizziness, hypotension

Activity: Volunteers to modify home
  Shorter-term: fix fall hazards, install stabilization devices
  Short-term: fewer hazards in the home, more stabilizing devices

Activity: Strength & balance training
  Shorter-term: more knowledge & stronger beliefs about benefits of exercise; more knowledge about which exercises; more social support
  Short-term: more walking, balance & strength exercises
  Intermediate: better balance, strength

Long-term (all pathways): fewer falls; less disability and loss of mobility

9 Don’t forget to look for plausible negative outcomes
- Condom promotion programs for female commercial sex workers at risk for HIV/AIDS typically presume that condom use will also serve as an effective contraceptive
- But although condom use increases with many behavioral and social programs, condoms tend not to be used with primary partners, leaving women at risk for unplanned pregnancy

10 Basic Study Designs
- Experimental
  - Clinical trials (controlled, randomized)
  - Quasi-experimental (e.g., community trials)
- Descriptive
  - Case series
  - PMR (proportional mortality ratio)
  - Ecological
- Observational
  - Cohort
  - Case-comparison
  - Cross-sectional

11 Study Designs – Descriptive
- In-depth case studies with rich data can help rule out rival explanations and suggest causal relations
- Work best when the outcomes of interest are unique, i.e., knowledge or behavior with few other plausible causes
- Useful for hypothesis development

12 Study Designs – Correlational Studies
- Provide a crude way of exploring associations, but the causal direction is unclear
- Hypothesis generating
- Do not link exposure and outcome at the level of the individual
- No control, or very little control, over distorting factors
- Sample selection is usually problematic
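
As an illustration of how crude such an association is, here is a minimal sketch with fabricated area-level data: the correlation coefficient quantifies the association, but says nothing about causal direction or distorting factors.

```python
# Sketch: a correlational analysis yields association only, not causation.
from scipy.stats import pearsonr

# Hypothetical area-level data: program exposure rate vs. outcome rate.
exposure = [0.10, 0.25, 0.30, 0.45, 0.50, 0.70]
outcome  = [0.40, 0.35, 0.33, 0.28, 0.25, 0.18]

r, p = pearsonr(exposure, outcome)
# Exposure could follow the outcome, or both could follow a third factor.
print(f"r = {r:.2f}, p = {p:.3f}")
```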

13 Study Designs – Cohort Studies
- Provide stronger evidence of associations between exposure (program) and disease (outcome)
- Provide evidence of a temporal relationship (when prospective)
- Minimize bias in ascertainment of exposure (when prospective)
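
A minimal sketch of the basic cohort comparison, with illustrative counts rather than real data: compare the risk of the outcome in exposed (program) vs. unexposed participants via the relative risk.

```python
# Sketch: relative risk from a 2x2 cohort table (illustrative counts).
exposed_cases, exposed_total = 30, 200      # program participants
unexposed_cases, unexposed_total = 60, 200  # comparison group

risk_exposed = exposed_cases / exposed_total
risk_unexposed = unexposed_cases / unexposed_total
relative_risk = risk_exposed / risk_unexposed

print(f"Risk (exposed)   = {risk_exposed:.2f}")
print(f"Risk (unexposed) = {risk_unexposed:.2f}")
print(f"Relative risk    = {relative_risk:.2f}")  # RR < 1 suggests a protective effect
```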

14 Study Designs – Experimental
- Randomized trials
  - Individual RCT
  - Group or cluster RCT (requires analysis that accounts for clustering; see the sketch below)
  - What constitutes an appropriate comparison or control group?
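
The note about analyzing cluster RCTs correctly matters because individuals within a cluster are correlated; pooling them as if they were independent overstates precision. Below is a minimal sketch with simulated data, comparing a naive individual-level test against one simple valid alternative, testing cluster-level means. All parameters are assumptions for illustration.

```python
# Sketch: why cluster randomization must be analyzed at (or adjusted to)
# the level of the randomized unit.
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(0)

def simulate_cluster(effect):
    # 20 individuals per cluster; a cluster-level random shift induces
    # within-cluster correlation.
    return rng.normal(effect + rng.normal(0, 1.0), 1.0, size=20)

intervention = [simulate_cluster(0.5) for _ in range(8)]  # 8 clusters per arm
control      = [simulate_cluster(0.0) for _ in range(8)]

# Wrong: pool individuals, ignoring clustering (p-value is too optimistic).
naive = ttest_ind(np.concatenate(intervention), np.concatenate(control))

# Better: one summary value per randomized unit (the cluster).
cluster_level = ttest_ind([c.mean() for c in intervention],
                          [c.mean() for c in control])

print(f"Naive individual-level p: {naive.pvalue:.4f}")
print(f"Cluster-level p:          {cluster_level.pvalue:.4f}")
```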

15 Schematic Diagram of a Clinical Trial
[Flow diagram] Sample (nonparticipants excluded) → randomization to groups → intervention group(s) vs. control group → (some lost to follow-up) → measure outcome
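
A minimal sketch of the randomization step in the schematic, with fabricated participant IDs: consenting sample members are shuffled and split into intervention and control groups; outcomes are then measured in both groups, tracking anyone lost to follow-up.

```python
# Sketch: simple 1:1 randomization of a sample to two trial arms.
import random

random.seed(42)
sample = [f"participant_{i:03d}" for i in range(1, 21)]  # after nonparticipants are excluded

shuffled = sample[:]
random.shuffle(shuffled)
half = len(shuffled) // 2
intervention_group = shuffled[:half]
control_group = shuffled[half:]

print(len(intervention_group), "intervention;", len(control_group), "control")
```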

16 Study Designs – Experimental
- Non-randomized (quasi-experimental) trials
  - Can also use the individual or the group as the unit of assignment
  - Concurrent vs. non-concurrent (historical) comparison group
  - Check initial comparability of the study groups
  - Statistical adjustment to improve comparability (see the sketch below)
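
One common form of statistical adjustment is to regress the outcome on the program indicator plus baseline covariates on which the groups differ. Below is a minimal sketch with simulated data; the variable names and effect sizes are assumptions, not part of the slides.

```python
# Sketch: regression adjustment for a non-randomized comparison.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 400
baseline = rng.normal(0, 1, n)                                 # e.g., a baseline risk score
program = (baseline + rng.normal(0, 1, n) > 0).astype(float)   # selection depends on baseline
outcome = 0.5 * program + 1.0 * baseline + rng.normal(0, 1, n)

# An unadjusted comparison mixes the program effect with baseline differences;
# including the covariate improves comparability of the groups.
X = sm.add_constant(np.column_stack([program, baseline]))
model = sm.OLS(outcome, X).fit()
print(model.params)  # coefficient on the program column estimates the adjusted effect
```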

17 Confounding
[Diagram] Program (independent variable): HIV counseling & testing for IDUs → Outcome (dependent variable): HIV+ clients report using condoms with sex partners. Confounder: awareness that they should protect partners, which influences both program exposure and the outcome.
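
A minimal sketch, with simulated data, of the confounding pattern in this diagram: "awareness" raises both the chance of receiving counseling & testing and the chance of the outcome, so the crude association looks like a program effect even when none exists; stratifying on the confounder removes it. The probabilities are illustrative assumptions.

```python
# Sketch: a confounder manufactures a crude program-outcome association.
import numpy as np

rng = np.random.default_rng(7)
n = 10_000
aware = rng.random(n) < 0.5                              # confounder
counseled = rng.random(n) < np.where(aware, 0.8, 0.2)    # awareness -> seeks the program
condom_use = rng.random(n) < np.where(aware, 0.7, 0.3)   # awareness -> outcome (no true program effect)

crude = condom_use[counseled].mean() - condom_use[~counseled].mean()
print(f"Crude difference: {crude:.3f}")  # looks like a program effect

# Stratifying on the confounder removes the spurious association.
for level in (True, False):
    s = aware == level
    diff = condom_use[s & counseled].mean() - condom_use[s & ~counseled].mean()
    print(f"Awareness={level}: difference = {diff:.3f}")  # ~0 within each stratum
```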

