EVALUATION RESEARCH
To know whether social programs, training programs, medical treatments, or other interventions work, we have to evaluate their outcomes systematically and fairly (e.g., the effect of DARE). Evaluation can correct deficiencies. Evaluation research is not a single research method; it usually combines several methods. Its purpose is to investigate training, therapies, treatments, etc., to assess their effectiveness. It is designed to assess the effects of social programs.
History: evaluation research became important with the wave of social programs that followed the Great Depression of the 1930s, and again in the 1960s during the Great Society era. During the 1960s, RAND expanded from an Air Force planning facility into a major government research firm, which it remains today. As social programs declined in the 1980s, so did government research projects.
Components of Evaluation
Inputs: program resources, raw materials, clients, and staff in the program.
The program process: how the program is implemented, i.e., how the service or treatment is delivered.
Outputs: the products of the delivery process, the indicators that the program is in operation.
Outcomes: the impact of the program.
Stakeholders: individuals and groups who have some basis of concern with the program (clients, staff, managers, funders, the public).
Evaluation is a systematic approach to providing feedback.
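These components can be pictured as a pipeline running from inputs through the program process to outputs and outcomes, with feedback flowing to stakeholders. A minimal sketch of that structure in Python, using a hypothetical job-training program (all details below are invented for illustration):

    from dataclasses import dataclass

    @dataclass
    class ProgramEvaluation:
        inputs: list          # program resources, raw materials, clients, staff
        process: str          # how the service or treatment is delivered
        outputs: list         # products of the delivery process
        outcomes: list        # impacts of the program
        stakeholders: list    # clients, staff, managers, funders, the public

    # Hypothetical job-training program, used only for illustration
    evaluation = ProgramEvaluation(
        inputs=["funding", "trainers", "enrolled clients"],
        process="12-week classroom and on-the-job training",
        outputs=["sessions delivered", "clients completing training"],
        outcomes=["graduates' employment rate"],
        stakeholders=["clients", "staff", "managers", "funders", "the public"],
    )

Laying the components out this way makes explicit what an evaluator must observe at each stage.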
Questions for Evaluation Research
Is the program needed? (needs assessment)
Can the program be evaluated? (evaluability assessment)
How does the program operate? (process evaluation)
What is the program's impact? (outcome/impact evaluation)
How efficient is the program? (efficiency analysis)
How costly is the program? (efficiency analysis)
Basics of Evaluation Research
Needs assessment: Is there a need? What is the level of need? Develop plans for implementing a program to meet the need, then evaluate how well the program satisfies different perspectives on the need.
Evaluability assessment: Can the program be evaluated in the time allotted and with the resources available?
Formative evaluation: used to refine and shape the program as it progresses.
Process evaluation: investigates the process of service delivery. It helps shape and refine a program when built into the initial plan, and it can use a wide range of indicators such as records, surveys, and qualitative descriptions.
Impact analysis: asks whether the program worked and had its intended result; it measures the extent to which a treatment or other service has an effect (also known as summative evaluation). A worked sketch follows.
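To illustrate the logic of impact analysis, here is a minimal sketch with invented outcome scores (the numbers are hypothetical, not from any study): with random assignment, the difference between group means estimates the program's effect.

    # Hypothetical outcome scores for randomly assigned groups
    treatment = [72, 68, 75, 80, 71, 77]
    control = [65, 70, 62, 68, 66, 64]

    mean_treatment = sum(treatment) / len(treatment)
    mean_control = sum(control) / len(control)

    # With random assignment, the difference in group means
    # estimates the program's impact (here, 8.0 points)
    estimated_impact = mean_treatment - mean_control
    print(f"Estimated program impact: {estimated_impact:.1f} points")

A real impact analysis would also test whether this difference is larger than could be expected by chance.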
The method of data collection most often used for impact evaluation is the experiment; however, quasi-experimental designs, surveys, or even qualitative methods can be used.
Efficiency analysis: compares program costs with program effects. Are the taxpayers or sponsors getting their money's worth? What resources does the program require?
Types of Efficiency Analysis
1. Cost-benefit analysis: compares program costs to the economic value of program benefits. It must identify the specific costs and benefits to be studied and then decide, in dollar terms, whether the program is worth it. The focus is on economic concerns. (A worked sketch of both types follows item 2.)
2. Cost-effectiveness analysis: compares program costs to actual program outcomes. The focus is on the outcome, which may be more important to the researcher and clients than to the taxpayer and sponsor. Unfortunately, the economic aspect is sometimes treated as more important than the effectiveness of the benefits to participants.
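To make the contrast concrete, a minimal sketch with invented figures (the costs, benefit values, and outcome counts are hypothetical): cost-benefit analysis puts both sides in dollars, while cost-effectiveness analysis divides cost by an outcome measured in its natural units.

    # Cost-benefit analysis: both costs and benefits in dollars
    program_costs = 500_000       # hypothetical annual program cost
    program_benefits = 650_000    # hypothetical dollar value of benefits

    net_benefit = program_benefits - program_costs           # $150,000
    benefit_cost_ratio = program_benefits / program_costs    # 1.3

    # Cost-effectiveness analysis: cost per unit of a non-monetized outcome
    job_placements = 200          # hypothetical count of the outcome of interest
    cost_per_placement = program_costs / job_placements      # $2,500

    print(f"Net benefit: ${net_benefit:,}")
    print(f"Benefit-cost ratio: {benefit_cost_ratio:.2f}")
    print(f"Cost per job placement: ${cost_per_placement:,.0f}")

Two programs can then be compared on cost per placement even when the dollar value of a placement is hard to estimate.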
Orientation: Researcher or Stakeholder?
The integrative approach expects researchers to respond to the concerns of the other people involved (the stakeholders) while the design is being formed, but expects stakeholders not to be involved once the evaluation process itself begins. But social program evaluation is very political: program employees may try to save their own jobs rather than allow a true evaluation.
Who Cares? Whose goals matter most, the sponsor's or the researcher's?
The sponsor may not wish to follow the scientific guideline of making results public and may encourage the researcher to be responsive to stakeholders first. The researcher cannot passively accept the values and views of stakeholders as the most important; researchers need to maintain autonomy while being objective and fair in the process.
So the ideal is the integrated approach, in which the issues and concerns of both stakeholders and research evaluators are covered. Bottom line: not all agencies really want to know whether a program works, especially if they need the answer to be yes and it is actually no.
Ethics in Evaluation
Is it ethical to assign subjects randomly to a treatment or benefit purely for evaluation purposes? Can the confidentiality of an evaluation be preserved when the information is owned by sponsors and/or policymakers? Politics can shape evaluation: results may be shared only with policymakers, but shouldn't all stakeholders receive the results?
Are risks to participants being minimized? Is informed consent being given? Are the subjects particularly vulnerable (mentally ill, children, elderly, students, inmates)? These are ethical concerns that the federal government mandates evaluation researchers take into account.
Health Research Extension Act of 1985
If a research organization receives federal funds, it must have a review board (an Institutional Review Board) to assess all research for adherence to ethical practice guidelines. The criteria are:
1. Risks must be minimized.
2. Risks must be reasonable in relation to benefits.
3. The selection of individuals must be equitable.
4. Informed consent must be given.
5. Data should be monitored.
6. Privacy and confidentiality should be assured.
Because researchers may be required to provide evidence in legal proceedings, subject confidentiality can be a problem.
Conclusions
Because evaluation designs are complex, important outcomes or aspects of the program process may be missed. Including stakeholders in research decisions may undermine adherence to scientific standards. Researchers may be pressured to report only positive conclusions. Findings may be distorted or oversimplified because of who receives the report.