EVAL 6000: Foundations of Evaluation Dr. Chris L. S. Coryn Kristin A. Hobson Fall 2011

Agenda Question- and method-oriented approaches Questions and discussion

Evaluation Theory Tree

Question- and Method-Oriented Approaches Address specific questions (often employing a wide range of methods) Advocate use of a particular method Whether the questions or methods are appropriate for assessing merit and worth is a secondary consideration Both are narrow in scope and often deliver less than a full assessment of merit and worth

Objectives-Based Studies Some statement of objectives serves as the advance organizer Typically, an internal study conducted in order to determine if the evaluand’s objectives have been achieved Operationalize objectives, then collect and analyze information to determine how well each objective was met
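To make the operationalization step concrete, here is a minimal Python sketch (the objectives, targets, and observed values are hypothetical, not drawn from the course materials) of judging how well each stated objective was met:

# Minimal sketch: each objective is operationalized as a measurable indicator
# with a target, then judged against observed data. All values are hypothetical.
objectives = [
    {"objective": "Increase program completion rate", "target": 0.80, "observed": 0.74},
    {"objective": "Raise mean post-test score", "target": 75.0, "observed": 78.2},
    {"objective": "Serve at least 500 participants", "target": 500, "observed": 512},
]

for obj in objectives:
    attainment = obj["observed"] / obj["target"]  # proportion of target achieved
    status = "met" if obj["observed"] >= obj["target"] else "not met"
    print(f'{obj["objective"]}: {status} ({attainment:.0%} of target)')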

Objectives-based evaluation results from a national research center

Accountability, Particularly Payment by Results Narrows evaluation to questions about outcomes Stresses importance of obtaining external, impartial perspective Key components include pass-fail standards, payment for good results, and sanctions for unacceptable performance

Success Case Method Evaluator deliberately searches for and illuminates instances of success and contrasts them to what is not working Compares least successful instances to most successful instances Intended as a relatively quick and affordable means of gathering important information for use in improving an evaluand
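To illustrate the compare-the-extremes logic, the sketch below (hypothetical outcome scores; the one-standard-deviation cut-off is arbitrary) flags the most and least successful cases as candidates for in-depth follow-up:

import statistics

# Hypothetical post-program outcome scores for ten participants.
scores = {"P01": 92, "P02": 55, "P03": 78, "P04": 31, "P05": 88,
          "P06": 67, "P07": 95, "P08": 42, "P09": 73, "P10": 60}

mean = statistics.mean(scores.values())
sd = statistics.stdev(scores.values())

# Cases well above or well below the mean become candidate success and
# failure cases to be examined and contrasted in depth.
success_cases = [p for p, s in scores.items() if s >= mean + sd]
failure_cases = [p for p, s in scores.items() if s <= mean - sd]

print("Candidate success cases:", success_cases)
print("Candidate failure cases:", failure_cases)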

Standard normal distribution and location of ‘success’ and ‘failure’ cases

Objective Testing Programs Testing to assess the achievements of individual students and groups of students compared with norms, standards, or previous performance
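As a small illustration of norm comparison (the norm parameters and student scores below are hypothetical), raw scores can be expressed as z-scores and approximate percentile ranks against a norm group:

from statistics import NormalDist

# Hypothetical norm-group mean and standard deviation for a standardized test.
NORM_MEAN, NORM_SD = 250.0, 40.0
norm = NormalDist(mu=NORM_MEAN, sigma=NORM_SD)

students = {"A": 310, "B": 245, "C": 198}
for name, raw in students.items():
    z = (raw - NORM_MEAN) / NORM_SD   # standing relative to the norm group
    percentile = norm.cdf(raw) * 100  # approximate percentile rank
    print(f"Student {name}: raw={raw}, z={z:+.2f}, percentile ~ {percentile:.0f}")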

Outcome Evaluation as Value-Added Assessment Recurrent outcome and value-added assessment coupled with hierarchical gain score analysis Emphasis on assessing trends and partialling out effects of the different components of an educational system, including groups of schools, individual schools, and individual teachers The intent is to determine what value each is adding to the achievement of students
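A heavily simplified sketch of the gain-score idea follows (hypothetical data; the approaches referenced on the slide use hierarchical models rather than raw means): compute each student's gain, then aggregate by teacher and by school to see what each level appears to add.

from collections import defaultdict

# Hypothetical records: (school, teacher, pre-test score, post-test score).
records = [
    ("School 1", "Teacher A", 62, 74), ("School 1", "Teacher A", 55, 70),
    ("School 1", "Teacher B", 70, 75), ("School 1", "Teacher B", 66, 73),
    ("School 2", "Teacher C", 58, 61), ("School 2", "Teacher C", 61, 66),
]

gains_by_teacher = defaultdict(list)
gains_by_school = defaultdict(list)
for school, teacher, pre, post in records:
    gain = post - pre  # simple gain score for one student
    gains_by_teacher[(school, teacher)].append(gain)
    gains_by_school[school].append(gain)

for (school, teacher), gains in gains_by_teacher.items():
    print(f"{school} / {teacher}: mean gain = {sum(gains) / len(gains):.1f}")
for school, gains in gains_by_school.items():
    print(f"{school}: mean gain = {sum(gains) / len(gains):.1f}")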

Performance Testing Devices that require students (or others) to demonstrate their achievements by producing authentic responses to select tasks, such as written or spoken answers, musical or psychomotor presentations, portfolios of work products, or group solutions to defined problems Performance assessments are usually life-skill and content-related performance tasks so that achievement can be demonstrated in practice

Experimental and Quasi-Experimental Design Studies Random assignment to one or more experimental or control conditions and then contrasting outcomes Required assumptions can rarely be met As a methodology, addresses only a narrow set of issues (i.e., cause-and-effect) Done correctly, produces unbiased estimates of effect sizes
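To see why randomization supports unbiased effect estimates, here is a minimal simulation sketch (entirely hypothetical, simulated data): units are randomly assigned, and the treatment-control contrast is expressed as a standardized effect size.

import random
import statistics

random.seed(42)
TRUE_EFFECT = 5.0  # hypothetical true treatment effect in outcome points

# Randomly assign 200 units to treatment or control.
units = list(range(200))
random.shuffle(units)
treatment, control = units[:100], units[100:]

def outcome(treated):
    baseline = random.gauss(50, 10)  # unit-level variation
    return baseline + (TRUE_EFFECT if treated else 0.0)

y_t = [outcome(True) for _ in treatment]
y_c = [outcome(False) for _ in control]

diff = statistics.mean(y_t) - statistics.mean(y_c)
pooled_sd = ((statistics.variance(y_t) + statistics.variance(y_c)) / 2) ** 0.5
print(f"Estimated effect: {diff:.2f} points (true effect {TRUE_EFFECT})")
print(f"Standardized effect size (Cohen's d): {diff / pooled_sd:.2f}")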

Flow of units through a typical randomized experiment

Management Information Systems Supply information needed to conduct and report on an evaluand Typically organized around objectives, specified activities, projected milestones or events, and budget Government Performance and Results Act (GPRA) of 1993 and Program Assessment Rating Tool (PART)

Cost Studies Largely quantitative procedures designed to understand the full costs of an evaluand and to determine and judge what the investment returned in terms of objectives achieved and broader societal benefits Compares computed ratios to those of similar evaluands Can include cost-benefit, cost-effectiveness, cost-utility, return on investment, rate of economic return, etc.
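A worked toy example of the ratios such studies report (all figures hypothetical):

# Hypothetical program figures used to illustrate common cost-study ratios.
total_cost = 120_000.0          # full cost of the evaluand, in dollars
monetized_benefits = 168_000.0  # societal benefits expressed in dollars
outcomes_achieved = 300         # e.g., participants reaching a target outcome

benefit_cost_ratio = monetized_benefits / total_cost
cost_per_outcome = total_cost / outcomes_achieved     # cost-effectiveness ratio
roi = (monetized_benefits - total_cost) / total_cost  # return on investment

print(f"Benefit-cost ratio:        {benefit_cost_ratio:.2f}")
print(f"Cost per outcome achieved: ${cost_per_outcome:,.0f}")
print(f"Return on investment:      {roi:.0%}")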

Judicial and Advocate-Adversary Evaluation Essentially puts an evaluand on trial Role-playing evaluators implement a prosecution and defense Judge hears arguments within the framework of a jury trial Intended to provide balanced evidence on an evaluand’s strengths and weaknesses

Case Studies Focused, in-depth description, analysis, and synthesis Examines evaluand in context (e.g., geographical, cultural, organizational, historical, political) Mainly concerned with describing and illuminating an evaluand, not determining merit and worth Stake’s approach differs dramatically from Yin’s

Case study designs

Theory-Driven/Theory-Based Program evaluations based on a program theory often begin with either (1) a well-developed and validated theory of how programs of a certain type within similar settings operate to produce outcomes or (2) an initial stage to approximate such a theory within the context of a particular program evaluation The theory can then aid a program evaluator to decide what questions, indicators (i.e., manifest variables), and linkages (assumed to be causal) between and among program elements should be used to evaluate a program

Theory-Driven/Theory-Based The point of a theory development or selection effort is to identify advance organizers to guide the evaluation (e.g., in the form of a measurement model) Essentially these are the mechanisms by which program activities are understood to produce or contribute to program outcomes, along with the appropriate description of context, specification of independent and dependent variables, and portrayal of key linkages The main purposes of theory-based evaluation are to determine the extent to which the program of interest is theoretically sound, understand why it is succeeding or failing, and provide direction for program improvement
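One way to see how a program theory supplies advance organizers is to write a simple linear theory down explicitly. The sketch below (a hypothetical tutoring program with hypothetical indicators) lists the assumed causal chain and the indicator attached to each stage, which is the scaffold an evaluator would then measure against:

# A hypothetical linear program theory: each stage is assumed to lead to the
# next, and each stage carries an indicator (manifest variable) to measure.
program_theory = [
    {"stage": "Inputs", "indicator": "funding and staff hours logged"},
    {"stage": "Activities", "indicator": "tutoring sessions delivered"},
    {"stage": "Outputs", "indicator": "students completing the tutoring sequence"},
    {"stage": "Short-term outcomes", "indicator": "gain in reading fluency scores"},
    {"stage": "Long-term outcomes", "indicator": "on-time grade promotion rate"},
]

# Print each assumed causal link and the measures that would test it.
for earlier, later in zip(program_theory, program_theory[1:]):
    print(f'{earlier["stage"]} -> {later["stage"]}: '
          f'{earlier["indicator"]} vs. {later["indicator"]}')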

Linear program theory model

Ecological program theory model

Mixed-Method Studies Combines quantitative and qualitative techniques Less concerned with assessing merit and worth, more concerned with “mixing” methodological approaches A key feature is triangulation Aimed at depth, scope, and dependability of findings

Basic mixed-method designs

Meta-Analysis and Research Reviews Premised on the assumption that individual studies provide only limited information about the effectiveness of programs, each contributing to a larger base of knowledge Concentrate almost exclusively on the desired effects or outcomes of programs The principal ideology is that an evaluation should not be viewed in isolation, but rather as one of a set of tests of a program or intervention’s results across variations in persons, treatments, outcomes, contexts, and other variables
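To make the pooling idea concrete, here is a minimal fixed-effect, inverse-variance sketch (hypothetical effect sizes and standard errors; real syntheses would also examine heterogeneity and often use random-effects models):

import math

# Hypothetical study-level effect sizes (standardized mean differences) and standard errors.
studies = [
    ("Study 1", 0.30, 0.12),
    ("Study 2", 0.45, 0.20),
    ("Study 3", 0.10, 0.15),
    ("Study 4", 0.38, 0.10),
]

# Fixed-effect, inverse-variance weighting: each study's weight is 1 / SE^2.
weights = [1 / se ** 2 for _, _, se in studies]
pooled = sum(w * es for (_, es, _), w in zip(studies, weights)) / sum(weights)
pooled_se = math.sqrt(1 / sum(weights))

print(f"Pooled effect: {pooled:.2f} "
      f"(95% CI {pooled - 1.96 * pooled_se:.2f} to {pooled + 1.96 * pooled_se:.2f})")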

Meta-analysis forest plot