Presentation transcript:

1 Chapter 11 Evaluation research

2 Evaluation research is not a method of data collection, like survey research or experiments, nor is it a unique component of research designs, like sampling or measurement. Instead, evaluation research is social research that is conducted for a distinctive purpose: to investigate social programs (e.g., substance abuse treatment programs, welfare programs, criminal justice programs, or employment and training programs).

3 For each project, an evaluation researcher must select a research design and method of data collection that are useful for answering the particular research questions posed and appropriate for the particular program investigated. The development of evaluation research as a major enterprise followed on the heels of the expansion of the federal government during the Great Depression and World War II.

4 Large Depression-era government outlays for social programs stimulated interest in monitoring program output, and the military effort in World War II led to some of the necessary review and contracting procedures for sponsoring evaluation research. In the 1960s, criminal justice researchers began to use experiments to test the value of different policies (Orr 1999:24).

5 In the early 1980s, after this period of rapid growth, many evaluation research firms closed in tandem with the decline of many Great Society programs. However, the demand for evaluation research continues, due, in part, to government requirements. The growth of evaluation research is also reflected in the social science community. The American Evaluation Association was founded in 1986 as a professional organization for evaluation researchers (merging two previous associations) and the publisher of an evaluation research journal.

6 The process of evaluation research can be viewed as a simple systems model. First, clients, customers, students, or some other units (cases) enter the program as inputs. (You’ll notice that this model treats programs like machines, with people functioning as raw materials to be processed.) Resources and staff required by a program are also program inputs.

7 [Insert Exhibit 11.1]

8 Next, some service or treatment is provided to the cases. This may be attendance in a class, assistance with a health problem, residence in new housing, or receipt of special cash benefits. The program process may be simple or complicated, short or long, but it is designed to have some impact on the cases.

9 The direct product of the program’s service delivery process is its output. Program outputs may include clients served, case managers trained, food parcels delivered, or arrests made. The program outputs may be desirable in themselves, but they primarily serve to indicate that the program is operating.

10 Program outcomes indicate the impact of the program on the cases that have been processed. Outcomes can range from improved test scores or higher rates of job retention to fewer criminal offenses and lower rates of poverty. Any social program is likely to have multiple outcomes, some intended and some unintended, some positive and others that are viewed as negative.

11 Variation in both outputs and outcomes, in turn, influences the inputs to the program through a feedback process. If not enough clients are being served, recruitment of new clients may increase. If too many negative side effects result from a trial medication, the trials may be limited or terminated. If a program does not appear to lead to improved outcomes, clients may go elsewhere.
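
To make the feedback loop concrete, here is a minimal sketch in Python. It is not from the text: the names (Case, ProgramModel) and the toy success rule are invented for illustration. Cases enter as inputs, a process serves them (counted as outputs), outcomes are summarized as an improvement rate, and feedback turns poor outcomes into a signal to change the inputs.

```python
from dataclasses import dataclass, field

@dataclass
class Case:
    """One client, student, or other unit entering the program (an input)."""
    name: str
    improved: bool = False  # the outcome recorded for this case

@dataclass
class ProgramModel:
    """Toy systems model: inputs -> process -> outputs -> outcomes -> feedback."""
    inputs: list = field(default_factory=list)  # cases (staff/resources omitted)
    outputs: int = 0                            # count of cases served

    def enroll(self, case: Case) -> None:
        self.inputs.append(case)

    def process(self, success_rate: float) -> None:
        # The "service or treatment" step; a stand-in rule, not a real treatment.
        for i, case in enumerate(self.inputs):
            case.improved = (i / max(len(self.inputs), 1)) < success_rate
            self.outputs += 1  # output: one more case served

    def outcome_rate(self) -> float:
        # Outcome: share of processed cases that improved.
        return sum(c.improved for c in self.inputs) / max(self.outputs, 1)

    def feedback(self) -> str:
        # Feedback: poor outcomes prompt a change to inputs (e.g., recruitment).
        return "revise recruitment" if self.outcome_rate() < 0.5 else "continue"

program = ProgramModel()
for n in ["A", "B", "C", "D"]:
    program.enroll(Case(n))
program.process(success_rate=0.75)
print(program.outputs, round(program.outcome_rate(), 2), program.feedback())
```

A real evaluation would replace the stand-in rule with measured outcomes for actual cases; the point here is only the shape of the model.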

12 The evaluation process as a whole, and feedback in particular, can be understood only in relation to the interests and perspectives of program stakeholders. Stakeholders are those individuals and groups who have some basis of concern with the program. They might be clients, staff, managers, funders, or the public. Who the program stakeholders are and what role they play in the program evaluation will have tremendous consequences for the research.

13 Alternatives in evaluation designs. Evaluation research tries to learn if, and how, real-world programs produce results. But that simple statement covers a number of important alternatives in research design, including the following:  Black box or program theory: Do we care how the program gets results?  Researcher or stakeholder orientation: Whose goals matter most?  Quantitative or qualitative methods: Which methods provide the best answers?  Simple or complex outcomes: How complicated should the findings be?

14 Black box or program theory. Most evaluation research tries to determine whether a program has the intended effect. If the effect occurred, the program “worked”; if the effect didn’t occur, then, some would say, the program should be abandoned or redesigned. In this simple approach, the process by which a program produces outcomes is often treated as a “black box,” in which the “inside” of the program is unknown.

15 The focus of such research is whether cases have changed as a result of their exposure to the program, between the time they entered as inputs and when they exited as outputs (Chen, 1990). The assumption is that program evaluation requires only the test of a simple input/output model, like that in Exhibit 11.1. There may be no attempt to “open the black box” of the program process.

16 If an investigation of program process had been conducted, though, a program theory could have been developed. A program theory describes what has been learned about how the program has its effect. When a researcher has sufficient knowledge before the investigation begins, outlining a program theory can help to guide the investigation of program process in the most productive directions. This is termed a theory-driven evaluation.

17 Program theory can be either descriptive or prescriptive (Chen, 1990). Descriptive theory specifies impacts that are generated and how this occurs. It suggests a causal mechanism, including intervening factors, and the necessary context for the effects. Descriptive theories are generally empirically based.

18 Prescriptive theory specifies what ought to be done by the program, and is not actually tested. Prescriptive theory specifies how to design or implement the treatment, what outcomes should be expected, and how performance should be judged. Comparison of the program’s descriptive and prescriptive theories can help to identify implementation difficulties and incorrect understandings that can be fixed (Patton, 2002:162–164).
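
One way to operationalize that comparison is to record each theory as a set of labeled claims and diff them. The sketch below is purely illustrative and assumes a hypothetical employment program; the element names and values are invented.

```python
# Illustrative only: a prescriptive theory (what the program ought to do) and a
# descriptive theory (what the evaluation observed) recorded as simple mappings.
prescriptive = {
    "dosage": "12 weekly sessions",
    "mechanism": "skills training raises job readiness",
    "expected_outcome": "higher job retention",
}
descriptive = {
    "dosage": "6 sessions on average",  # what was actually delivered
    "mechanism": "skills training raises job readiness",
    "expected_outcome": "higher job retention",
}

# Flag elements where practice diverges from the plan: candidate
# implementation difficulties or incorrect understandings.
for element, planned in prescriptive.items():
    observed = descriptive.get(element)
    if planned != observed:
        print(f"Possible implementation gap in {element!r}: "
              f"planned {planned!r}, observed {observed!r}")
```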

19 Researcher or stakeholder orientation. Stakeholder approaches encourage researchers to be responsive to program stakeholders. Issues for study are to be based on the views of people involved with the program, and reports are to be made to program participants (Stake, 1975). The stakeholders and others who may be drawn into the evaluation are welcomed as equal partners in every aspect of design, implementation, interpretation, and resulting action of an evaluation; that is, they are accorded a full measure of political parity and control... determining what questions are to be asked and what information is to be collected on the basis of stakeholder inputs.

20 Social science approaches, in contrast, emphasize researcher expertise and autonomy in order to develop the most trustworthy, unbiased program evaluation. A program theory is derived from information on how the program operates and from current social science theory, not from the views of stakeholders.

21 Integrative approaches attempt to cover issues of concern to both stakeholders and evaluators. The emphasis given to stakeholder or scientific concerns varies with the specific circumstances. Integrative approaches seek to balance responsiveness to stakeholders with objectivity and scientific validity.

22 Quantitative and qualitative approaches to evaluation each have their strengths and appropriate uses. Quantitative research, with its clear percentages and numerical scores, allows quick comparisons over time and categories, and thus is typically used in attempts to identify the effects of a social program. Qualitative methods can add depth, detail, and nuance; they can clarify the meaning of survey responses, and reveal more complex emotions and judgments people may have.

23 Simple or complex outcomes. Few programs have only one outcome. Sometimes a single policy outcome is sought, but is found not to be sufficient, either methodologically or substantively. In spite of the difficulties, most evaluation researchers attempt to measure multiple outcomes. Collection of multiple outcomes gives a better picture of program impact.

24 Focus of evaluation studies. Evaluation projects can focus on a variety of different questions related to social programs and their impact. Which question is asked will determine what research methods are used:  What is the level of need for the program?  Can the program be evaluated?  How does the program operate?  What is the program’s impact?  How efficient is the program?

25 Needs assessment. A needs assessment attempts, with systematic, credible evidence, to evaluate what needs exist in a population. Need may be assessed by social indicators such as the poverty rate or the level of home ownership, interviews with local experts such as school board members or team captains, surveys of populations potentially in need, or focus groups with community residents. In general, it is a good idea to use multiple indicators of need. There is no absolute definition of need in most projects.
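
As one illustration of using multiple indicators, the sketch below standardizes each indicator as a z-score and averages them into a rough need index. This is a simple convention rather than a prescribed method, and the communities, indicators, and figures are all invented.

```python
from statistics import mean, pstdev

# Invented data: two indicators of need for three hypothetical communities.
indicators = {
    "Northside": {"poverty_rate": 0.22, "renter_share": 0.70},
    "Riverview": {"poverty_rate": 0.09, "renter_share": 0.40},
    "Lakeside":  {"poverty_rate": 0.15, "renter_share": 0.55},
}

def z(value: float, values: list) -> float:
    """Standardize a value against all communities' values on that indicator."""
    return (value - mean(values)) / pstdev(values)

names = list(indicators)
need_index = {
    name: mean(z(indicators[name][k], [indicators[n][k] for n in names])
               for k in indicators[name])
    for name in names
}

# Rank communities from most to least need on the combined index.
for name, score in sorted(need_index.items(), key=lambda kv: -kv[1]):
    print(f"{name}: need index {score:+.2f}")
```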

26 Evaluability assessment. Some type of study is always possible, but to specifically identify the effects of a program may not be possible within the available time and resources. So researchers may conduct an evaluability assessment to learn this in advance, rather than expend time and effort on a fruitless project. Because they are preliminary studies to “check things out,” evaluability assessments often rely on qualitative methods. The knowledge gained can be used to refine evaluation plans.

27 Process evaluation. Process evaluation: Evaluation research that investigates the process of service delivery. Process evaluation is more important when more complex programs are evaluated. Many social programs comprise multiple elements and are delivered over an extended period of time, often by different providers in different areas.

28 Formative evaluation. Formative evaluation: Process evaluation that is used to shape and refine program operations. Evaluation may then lead to changes in recruitment procedures, program delivery, or measurement tools.

29 Impact analysis. The core questions of evaluation research are: Did the program work? Did it have the intended result? This kind of research is variously called impact analysis, impact evaluation, or summative evaluation. Impact analysis (also called summative evaluation) compares what happened after a program was implemented with what would have happened had there been no program at all.
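
In the simplest randomized design, the control group stands in for what would have happened with no program, so a bare-bones impact estimate is just the difference in mean outcomes between groups. A minimal sketch, with invented scores:

```python
from statistics import mean

# Invented outcome scores. With random assignment, the control group
# approximates the no-program counterfactual.
treatment = [72, 68, 75, 80, 77, 71]  # program participants
control   = [65, 70, 66, 72, 69, 64]  # randomly assigned comparison group

impact = mean(treatment) - mean(control)
print(f"Estimated program impact: {impact:+.1f} points")
```

A real impact analysis would add a significance test and attend to attrition and measurement, but the comparison logic is the same.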

30 Efficiency analysis. Finally, a program may be evaluated for how efficiently it provides its benefit; typically, financial measures are used. Cost-benefit analysis: a type of evaluation research that identifies the specific program costs and the procedures for estimating the economic value of specific program benefits. Cost-effectiveness analysis: a type of evaluation research that focuses attention directly on the program’s outcomes rather than on the economic value of those outcomes.
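
A small worked example, with all figures invented, shows how the two differ: cost-benefit analysis monetizes the outcome with an assumed dollar value per job, while cost-effectiveness analysis stops at cost per unit of outcome.

```python
# Invented figures for a hypothetical job-training program.
program_cost = 500_000   # total program spending ($)
jobs_obtained = 200      # outcome: participants placed in jobs
value_per_job = 3_500    # assumed economic value of one placement ($)

# Cost-benefit analysis: outcomes are translated into dollars.
benefit = jobs_obtained * value_per_job
print(f"Net benefit: ${benefit - program_cost:,}; "
      f"benefit-cost ratio: {benefit / program_cost:.2f}")

# Cost-effectiveness analysis: no monetizing, just cost per outcome unit.
print(f"Cost per job obtained: ${program_cost / jobs_obtained:,.0f}")
```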

31 Ethics in evaluation. Evaluation research can make a difference in people’s lives while the research is being conducted, as well as after the results are reported. Job opportunities, welfare requirements, housing options, treatment for substance abuse, and training programs are each potentially important benefits, and an evaluation research project can change both the type and availability of such benefits. This direct impact on research participants, and potentially their families, heightens the attention that evaluation researchers have to give to human subjects concerns.

32 There are many specific ethical challenges in evaluation research:  How can confidentiality be preserved when the data are owned by a government agency or are subject to discovery in a legal proceeding?  Who decides what burden an evaluation project can impose upon participants?  Can a research decision legitimately be shaped by political considerations?  Must findings be shared with all stakeholders, or only with policymakers?  Will a randomized experiment yield more defensible evidence than the alternatives?  Will the results actually be used?

33 Hopes for evaluation research are high: society could benefit from the development of programs that work well, accomplish their policy goals, and serve people who genuinely need them. Evaluation research can provide social scientists with rare opportunities to study complex social processes, with real consequences, and to contribute to the public good. Although they may face unusual constraints on their research designs, most evaluation projects can result in high-quality analysis and publications in reputable social science journals.
