Program Evaluation
The use of scientific methods to judge and improve the planning, monitoring, effectiveness, and efficiency of health, nutrition, and other human service programs.
Why Evaluate a Program?
See Table 10-1, page 309, Boyle and Morris.
Types of Program Evaluation
- Process evaluation
- Impact or outcome evaluation
- Fiscal or efficiency evaluation
Process Evaluation
- Evaluates process objectives
- Provides information on why a program may or may not have reached its outcome objectives
- If a program is delivered from a variety of sites, provides information on why some sites may have been more successful than others
Six Steps for Program Evaluation
1. Determine objectives of program. Evaluate for:
   a. Appropriateness of objectives
   b. Effectiveness in meeting objectives
   c. Efficiency of program
   d. Side effects of program
Steps in Program Evaluation
2. Determine characteristics to be measured. Measurements should be:
   - Valid
   - Reliable
   - Precise
Steps in Program Evaluation
3. Measure characteristics
Steps in Program Evaluation
4. Make comparisons. May use:
   - Control groups
   - Similar groups
   - Standards
   - Pre- vs. post-measurements
Steps in Program Evaluation
5. Draw conclusions
6. Make recommendations
Common Biases Introduced During Evaluations
- Selection
- Testing
- History
- Maturation
- Halo effect
Evaluation Design
1. Experimental design
2. Quasi-experimental design
3. Non-experimental design
Steps for experimental design
1. Experimental and control groups are randomly assigned
2. Each group is measured
3. Intervention or program is provided to the experimental group
4. Groups are measured again; if the experimental group improved more than the control group, the program was successful (see the sketch below)
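A minimal sketch of the step-4 comparison, assuming hypothetical pre/post group means (all numbers are illustrative, not from the source; real values come from step 3, "measure characteristics"):

```python
# Hypothetical pre/post means for a pre-test/post-test control group design.
experimental = {"pre": 62.0, "post": 71.0}   # e.g., mean nutrition-knowledge score
control = {"pre": 61.0, "post": 63.0}

exp_change = experimental["post"] - experimental["pre"]   # 9.0
ctl_change = control["post"] - control["pre"]             # 2.0

# Step 4: descriptively, the program "worked" if the experimental group
# improved more than the control group; a statistical test is still needed
# before drawing conclusions (step 5).
program_effect = exp_change - ctl_change                  # 7.0
print(f"Experimental change: {exp_change}, control change: {ctl_change}")
print(f"Estimated program effect: {program_effect}")
```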
Examples of Designs of True Experiments
Pre-test post-test control group design:
   R  O  X  O
   R  O     O
(R = random assignment, O = observation or measurement, X = intervention)
Examples of Designs of True Experiments
After-only control group design:
   R  X  O
   R     O
Examples of Experimental Design
Solomon four-group design:
   R  O  X  O
   R  O     O
   R     X  O
   R        O
Quasi-experimental design
Steps are similar to the experimental design, but rigid control is not met:
- Random selection may not be done
- Subjects may be volunteers
- Nonequivalent control groups may be used
Non-experimental design
- Random selection is not used
- No control group, or a nonequivalent control group, is used
Examples of Non-experimental design
After-only or one-shot case study:
   X  O
Nonequivalent control group study:
   X  O
      O
Pre-test/post-test design:
   O  X  O
Fiscal or Efficiency Evaluations
- Cost-benefit analysis
- Cost-effectiveness analysis
Cost-benefit analysis
A decision-making framework used in allocating resources among competing uses. Both costs and benefits are expressed in dollars.
Costs
- Direct costs: cash expenditures
- Indirect costs: all other costs, such as spillover effects, costs to the client, and costs to the organization not covered by the program
- Opportunity costs
- Intangible costs: grief, suffering, pain
Benefits
All costs that would be avoided if the program were in effect:
- Direct benefits: value of the resources the program saves
- Negative benefits
- Indirect benefits: other costs averted
- Intangible benefits: happiness, bonding from breastfeeding
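A minimal sketch of how the cost and benefit categories roll up into a net benefit and a benefit-cost ratio; all dollar figures are hypothetical, not from the source:

```python
# Hypothetical cost and benefit totals, all expressed in dollars as the
# cost-benefit framework requires (figures are illustrative only).
costs = {"direct": 50_000, "indirect": 20_000, "intangible": 5_000}
benefits = {"direct": 90_000, "indirect": 30_000, "intangible": 10_000}

total_costs = sum(costs.values())          # 75,000
total_benefits = sum(benefits.values())    # 130,000

net_benefit = total_benefits - total_costs          # 55,000
benefit_cost_ratio = total_benefits / total_costs   # ~1.73

print(f"Net benefit: ${net_benefit:,}")
print(f"Benefit-cost ratio: {benefit_cost_ratio:.2f}")
```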
Discount rate
Based on deferred benefits: because many program benefits occur in the future, they are discounted back to present value so they can be compared with costs incurred today.
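A minimal sketch of discounting a stream of deferred benefits to present value; the 5% rate and the benefit amounts are hypothetical illustrations, not from the source:

```python
# Present value of a stream of deferred benefits, discounted at rate r.
def present_value(amounts, r):
    """amounts[t] is the dollar amount received t years from now (t = 0, 1, 2, ...)."""
    return sum(amount / (1 + r) ** t for t, amount in enumerate(amounts))

future_benefits = [0, 1000, 1000, 1000]   # benefits deferred to years 1-3
pv = present_value(future_benefits, r=0.05)
print(f"Present value of deferred benefits: ${pv:,.2f}")  # about $2,723
```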
Cost-Benefit Analysis of Attending School
- Direct costs
- Indirect costs
- Intangible costs
- Direct benefits
- Indirect benefits
- Intangible benefits
- Discount rates
Cost-effectiveness analysis
Determines the most efficient way of meeting a predetermined set of objectives:
- Costs are measured in dollars
- Effectiveness is measured by outcomes, e.g., lives saved, increase in birth weight
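A minimal sketch comparing two hypothetical programs aimed at the same objective; the program names, costs, and outcome counts are illustrative, not from the source:

```python
# Cost-effectiveness comparison: cost in dollars per unit of outcome achieved.
programs = {
    "Home-visit counseling": {"cost": 120_000, "outcomes": 40},  # e.g., low-birth-weight births prevented
    "Group education":       {"cost":  60_000, "outcomes": 15},
}

for name, p in programs.items():
    ratio = p["cost"] / p["outcomes"]   # dollars per outcome
    print(f"{name}: ${ratio:,.0f} per outcome achieved")

# The program with the lower cost per outcome is the more efficient way
# to meet the predetermined objective.
```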
Communicating Evaluation Results
See pages 322-326.