EVALUATION RESEARCH To know if social programs, training programs, medical treatments, or other interventions work, we have to evaluate their outcomes systematically.


Evaluation must be systematic and fair (e.g., evaluating the effect of DARE). Evaluation can correct program deficiencies. Evaluation research is not a single research method; it usually combines several methods. Its purpose is to investigate training, therapies, treatments, etc., to assess their effectiveness. It is designed to assess the effects of social programs.

History: evaluation research became important with the wave of social programs that followed the Depression of the 1930s, and again in the 1960s during the Great Society era. During the 1960s RAND expanded from an Air Force planning facility into a major government research firm, which it remains today. As social programs declined in the 1980s, so did government evaluation research.

Components of Evaluation
- Inputs: program resources, raw materials, clients, and staff in the program.
- Program process: how the program is implemented; how the service or treatment is delivered.

- Outputs: the products of the delivery process; indicators that the program is in operation.
- Outcomes: the impact of the program.
- Stakeholders: individuals and groups who have some basis of concern with the program (clients, staff, managers, funders, the public).
Evaluation is a systematic approach to providing feedback.

Questions for Evaluation Research
- Is the program needed? (needs assessment)
- Can the program be evaluated? (evaluability assessment)
- How does the program operate? (process)
- What is the program's impact? (outcome)
- How efficient is the program? (efficiency analysis)
- How costly is the program? (cost analysis)

Basics of Evaluation Research
- Needs assessment: Is there a need? What is the level of need? Develop plans for implementing a program to meet the need, then evaluate how well the program satisfies different perspectives on the need.
- Evaluability assessment: Can the program be evaluated in the time allotted and with the resources available?
- Formative evaluation: used to refine and shape the program as it progresses.

- Process evaluation: investigates the process of service delivery. Helps shape and refine a program when built into the initial plan. Can use a wide range of indicators, such as records, surveys, and qualitative descriptions.
- Impact analysis (also known as summative evaluation): asks whether the program worked and had the intended result; it measures the extent to which a treatment or other service has an effect.

The method of data collection most often used for impact evaluation is the experiment; however, quasi-experimental designs, surveys, or even qualitative methods can also be used.
Efficiency analysis compares program costs with program effects. Are taxpayers or sponsors getting their money's worth? What resources does the program require?
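The experimental logic of impact analysis, random assignment followed by a comparison of group outcomes, can be sketched in a few lines. This is an illustrative sketch, not part of the lecture; the participant IDs and outcome scores are hypothetical, and the impact estimate is the simple difference in group means.

```python
import random

def assign(participants, seed=42):
    """Randomly split participants into treatment and control groups,
    the defining feature of a true experimental design."""
    rng = random.Random(seed)
    pool = list(participants)
    rng.shuffle(pool)
    half = len(pool) // 2
    return pool[:half], pool[half:]

def estimated_impact(treated_outcomes, control_outcomes):
    """Impact estimate: mean outcome of the treatment group
    minus mean outcome of the control group."""
    mean = lambda xs: sum(xs) / len(xs)
    return mean(treated_outcomes) - mean(control_outcomes)

# Hypothetical post-program outcome scores for two groups of five
treated = [78, 82, 75, 80, 77]
control = [70, 68, 74, 69, 71]
print(estimated_impact(treated, control))  # prints 8.0
```

In a real evaluation the difference in means would be accompanied by a significance test, but the core comparison is exactly this.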

Types of Efficiency Analysis
1. Cost-benefit analysis: compares program costs with the economic value of program benefits. It must identify the specific costs and benefits to be studied and decide, in dollar terms, whether the program is worth it. The focus is on economic concerns.

2. Cost-effectiveness analysis: compares program costs with actual program outcomes. The focus is on the outcome, which may matter more to the researcher and clients than to the taxpayer or sponsor. Unfortunately, the economic aspect is sometimes weighted more heavily than the effectiveness of the benefits to participants.
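The two ratios behind these analyses are simple arithmetic; the sketch below makes the contrast concrete. The job-training figures are hypothetical: cost-benefit puts dollars on both sides, while cost-effectiveness expresses cost per unit of outcome.

```python
def benefit_cost_ratio(total_benefits, total_costs):
    """Cost-benefit analysis: benefits and costs both in dollars.
    A ratio above 1.0 suggests the benefits exceed the costs."""
    return total_benefits / total_costs

def cost_effectiveness_ratio(total_costs, units_of_outcome):
    """Cost-effectiveness analysis: dollars spent per unit of outcome
    (e.g., per participant who completed the program)."""
    return total_costs / units_of_outcome

# Hypothetical job-training program figures
costs = 500_000.0        # total program spending
benefits = 650_000.0     # monetized benefits (e.g., added earnings)
completions = 200        # participants who completed training

print(benefit_cost_ratio(benefits, costs))           # prints 1.3
print(cost_effectiveness_ratio(costs, completions))  # prints 2500.0
```

Note that the second ratio never requires putting a dollar value on the outcome itself, which is exactly why it can favor participants' interests over purely economic ones.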

Orientation: Researcher or Stakeholder? The integrative approach expects researchers to take into account the concerns of the other people involved (stakeholders) while the design is being formed, but expects stakeholders not to be involved once the evaluation process itself begins. But social program evaluation is highly political: program employees may try to save their own jobs rather than support a true evaluation.

Who Cares? Whose goals matter most, the sponsor's or the researcher's? A sponsor may not wish to follow the scientific guideline of making results public and may encourage the researcher to be responsive to stakeholders first. The researcher cannot passively accept the values and views of stakeholders as most important; researchers need to maintain autonomy while remaining objective and fair in the process.

So the ideal is the integrated approach, in which the issues and concerns of both stakeholders and research evaluators are covered. Bottom line: not all agencies really want to know whether a program works, especially if they need the answer to be yes and it is actually no.

Ethics in Evaluation: Is it ethical to assign subjects randomly to a treatment or benefit for evaluation purposes? Can the confidentiality of an evaluation be preserved when the information is owned by sponsors and/or policy makers? Politics can shape evaluation: results may be shared only with policy makers, but shouldn't all stakeholders receive the results?

Are risks to participants being minimized? Is informed consent being obtained? Are the subjects particularly vulnerable (mentally ill, children, elderly, students, inmates)? These are ethical concerns that the federal government mandates evaluation researchers take into account.

Health Research Extension Act of 1985: if a research organization receives federal funds, it must have a review board to assess all research for adherence to ethical practice guidelines. The criteria are: 1. Risks must be minimized. 2. Risks must be reasonable in relation to benefits.

3. Selection of individuals must be equitable. 4. Informed consent must be obtained. 5. Data should be monitored. 6. Privacy and confidentiality should be assured. Because researchers may be required to provide evidence in legal proceedings, subject confidentiality can be a problem.

Conclusions: Because evaluation designs are complex, important outcomes or aspects of program process may be missed. Including stakeholders in research decisions may undermine adherence to scientific standards. Researchers may be pressured to report only positive conclusions. Findings may be distorted or oversimplified depending on who receives the report.