EVAL 6000: Foundations of Evaluation Dr. Chris L. S. Coryn Nick Saxton Fall 2014

Agenda
– Quasi-evaluation studies
– Activity (if time allows)

Quasi-evaluation studies

– Address specific questions (often employing a wide range of methods) or advocate the use of a particular method
– Whether the questions or methods are appropriate for assessing merit and worth is a secondary consideration
– Both types are narrow in scope and often deliver less than a full assessment of merit and worth

Approach 7: Objectives-based studies
Advance organizers – Statement of program objectives
Purposes – To determine the extent to which a program achieved its objectives
Sources of questions – Objectives as defined by staff, funder, or evaluator
Questions – To what extent was each of the stated objectives met?

Objectives-based evaluation results from a national research center

Methods – Any relevant method for determining the extent to which operationally defined objectives were met (a toy tally is sketched below)
Pioneers – Ralph Tyler
Use considerations – Must have clear, supportable objectives
Strengths – Ease of application
Weaknesses – Narrowness and inability to identify positive and negative side effects
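
A minimal sketch of the core objectives-based question — what share of stated objectives were met — using invented objective names and met/not-met judgments (none of these values come from the slides):

```python
# Toy objectives-based tally: what share of operationally defined
# objectives were met? Objective names and statuses are invented.
objectives_met = {
    "Raise attendance by 5%": True,
    "Improve mean test scores": False,
    "Reduce dropout rate": True,
}

pct_met = 100 * sum(objectives_met.values()) / len(objectives_met)
print(f"{pct_met:.0f}% of stated objectives were met")
```

Note that this sketch also illustrates the approach's main weakness: a pass/fail tally says nothing about unintended side effects.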

Approach 8: The success case method
Advance organizers – Comparison between successful and unsuccessful instances
Purposes – To determine how well and in what respects a program is ‘working’
Sources of questions – Generally from program providers
Questions
– What are the noteworthy successes?
– How were successes produced?
– What factors contributed to success or failure?

Methods
1. Focus and plan the study
2. Create an impact model
3. Survey all participants
4. Interview a sample of success and nonsuccess cases (step 4 is sketched below)
5. Communicate findings, conclusions, and recommendations
Pioneers – Robert Brinkerhoff
Use considerations – Intended to assist service providers in increasing ‘successes’ and decreasing ‘nonsuccesses’
Strengths – Ease of application; use for improvement
Weaknesses – Narrowness of scope
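
Step 4 above — selecting extreme cases for follow-up interviews — can be illustrated with a short sketch. The participant names and survey scores below are hypothetical:

```python
# Toy success case selection: from (hypothetical) survey scores, pick the
# highest- and lowest-scoring participants for follow-up interviews.
survey_scores = {"Ana": 9, "Ben": 2, "Cai": 7, "Dee": 4, "Eli": 10, "Fay": 1}

ranked = sorted(survey_scores, key=survey_scores.get, reverse=True)
success_cases, nonsuccess_cases = ranked[:2], ranked[-2:]

print("Interview as success cases:", success_cases)
print("Interview as nonsuccess cases:", nonsuccess_cases)
```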

Approach 9: Outcome evaluation as value-added assessment
Advance organizers – System-wide measures of ‘growth’ or ‘gains’
Purposes – To determine the ‘value added’ by a program and its constituent parts
Sources of questions – Oversight bodies
Questions – What parts of a program contribute most to ‘growth’ or ‘gains’?

Methods – Gain score analysis, hierarchical linear modeling, etc. (a toy gain-score sketch follows)
Pioneers – Raudenbush, Sanders, Horn, Tymms, etc.
Use considerations – Can be used to make and/or support policy decisions
Strengths – Longitudinal rather than cross-sectional
Weaknesses – Potential misuse by policy makers in placing ‘blame’
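
As a rough illustration of gain-score analysis, the simplest of the methods listed above, here is a minimal sketch with invented pre- and post-test scores (a real value-added model, e.g., hierarchical linear modeling, would also adjust for student and school covariates):

```python
# Toy gain-score analysis: 'value added' summarized as the mean
# pre-to-post gain. All scores are hypothetical.
pre_scores  = [52.0, 61.0, 47.0, 70.0, 58.0]
post_scores = [60.0, 66.0, 55.0, 74.0, 65.0]

gains = [post - pre for pre, post in zip(pre_scores, post_scores)]
mean_gain = sum(gains) / len(gains)

print("Individual gains:", gains)
print(f"Mean gain (crude 'value added'): {mean_gain:.1f} points")
```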

Approach 10: Experimental and quasi-experimental studies
Advance organizers – Cause-and-effect hypotheses, competing treatments, etc.
Purposes – To determine causal relationships between independent and dependent variables
Sources of questions – Researchers, developers, policy makers, etc.
Questions – To what extent is one treatment superior to another?

Methods – Random or other method of assignment to conditions (a toy random-assignment sketch follows)
Pioneers – Campbell, Cook, Shadish
Use considerations – Addresses only one particular type of question
Strengths – Strong causal conclusions (if assumptions are met)
Weaknesses – Requires substantial expertise, time, money, etc.
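
A minimal sketch of the experimental logic — random assignment followed by a difference in group means — using simulated outcomes (the participant IDs, effect size, and noise level are all invented):

```python
import random

random.seed(42)  # reproducible toy example

# Randomly assign 20 hypothetical participants to treatment or control.
participants = list(range(20))
random.shuffle(participants)
treatment, control = set(participants[:10]), set(participants[10:])

# Simulate outcomes: baseline of 50, random noise, and an assumed
# +8-point treatment effect.
outcomes = {p: 50 + random.gauss(0, 5) + (8 if p in treatment else 0)
            for p in participants}

effect = (sum(outcomes[p] for p in treatment) / len(treatment)
          - sum(outcomes[p] for p in control) / len(control))
print(f"Estimated treatment effect: {effect:.1f} points")
```

Because assignment is random, the difference in means is an unbiased estimate of the treatment effect, which is the source of the ‘strong causal conclusions’ noted above.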

Approach 11: Cost studies
Advance organizers – Costs associated with program inputs, outputs, and outcomes
Purposes – To determine the costs and outcomes of one or more alternatives
Sources of questions – Policy makers, planners, taxpayers, etc.
Questions – What are the costs of obtaining desired outcomes?

Methods – Analysis of monetary and nonmonetary units (a toy cost-effectiveness sketch follows)
Pioneers – Levin, McEwan, Yates, etc.
Use considerations – Expertise required
Strengths – ‘Bottom line’ conclusions of interest to most decision makers
Weaknesses – Often difficult to execute validly
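
As a rough sketch of the ‘bottom line’ such studies produce, here is a cost-effectiveness comparison of two hypothetical programs (all cost and outcome figures are invented):

```python
# Toy cost-effectiveness comparison: cost per unit of outcome
# (e.g., dollars per point of achievement gain); lower is better
# when the outcome measure is the same.
programs = {
    "Program A": {"cost": 120_000.0, "outcome_units": 400.0},
    "Program B": {"cost":  90_000.0, "outcome_units": 250.0},
}

for name, p in programs.items():
    ratio = p["cost"] / p["outcome_units"]
    print(f"{name}: ${ratio:,.2f} per unit of outcome")
```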

Approach 12: Connoisseurship and criticism
Advance organizers – Specialized expertise, sensitivities, tacit knowledge, etc.
Purposes – To describe, appraise, and illuminate
Sources of questions – Determined by the ‘connoisseurs’ or ‘critics’
Questions – What are a program’s salient characteristics, strengths, and weaknesses?

Methods – Perceptual sensitivity, prior experience, refined insights, etc.
Pioneers – Elliot Eisner
Use considerations – Requires an audience willing to accept the approach
Strengths – Exploitation of refined expertise
Weaknesses – Questionable objectivity and reliability

Approach 13: Theory-based evaluation
Advance organizers – A carefully specified ‘theory’ of how a program is intended to operate
Purposes – To determine the extent to which a program is ‘theoretically sound’
Sources of questions – Determined by the guiding program theory
Questions – To what extent does the program theory ‘work’ or not?

Methods – Any method appropriate for testing the program theory
Pioneers – Chen, Donaldson, Weiss, Rogers, Rossi, etc.
Use considerations – Difficulty in applying the approach
Strengths – Useful for determining potential ‘measurement’ variables
Weaknesses – Few programs are grounded in validated ‘theories’

Approach 14: Meta-analysis
Advance organizers – Sufficient studies of the same or similar programs
Purposes – To assemble and (statistically) integrate findings from multiple studies of the same or similar programs
Sources of questions – Policy makers, ‘research repositories’, etc.
Questions – What is the average effect of a particular type of program?

Methods – Statistical methods for integrating study results (these vary widely; a toy pooling sketch follows)
Pioneers – Gene Glass
Use considerations – Major source of contemporary ‘best practices’ across a variety of domains
Strengths – Evidence of effectiveness over units, treatments, observations, and settings
Weaknesses – Exclusive emphasis on program outcomes
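
A minimal fixed-effect meta-analysis sketch: pool effect sizes from several hypothetical studies using inverse-variance weights, so that more precise studies count more toward the average effect (the effect sizes and variances below are invented):

```python
import math

# (effect size d, variance of d) for four hypothetical studies.
studies = [
    (0.30, 0.02),
    (0.45, 0.05),
    (0.10, 0.01),
    (0.25, 0.03),
]

weights = [1.0 / var for _, var in studies]
pooled = sum(w * d for (d, _), w in zip(studies, weights)) / sum(weights)
se = math.sqrt(1.0 / sum(weights))

print(f"Pooled effect: {pooled:.3f} (SE = {se:.3f})")
```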

Activity

We will split the class into two sections (1 and 2)
– Each section appoints one student to chair the group
– Each member of section 1 selects one of the approaches discussed today and explains why it is useful
– Members of section 2 listen and take notes
– Members of section 2 then outline weaknesses of the selected approaches
– Finally, the chair of each group discusses the potential strengths, weaknesses, and utility of the selected approaches

Encyclopedia Entries
– Bias
– Causation
– Checklists
– Chelimsky, Eleanor
– Conflict of Interest
– Countenance Model of Evaluation
– Critical Theory Evaluation
– Effectiveness
– Efficiency
– Empiricism
– Independence
– Evaluability Assessment
– Evaluation Use
– Fournier, Deborah
– Positivism
– Relativism
– Responsive Evaluation
– Stake, Robert
– Thick Description
– Utilization of Evaluation
– Weiss, Carol
– Wholey, Joseph