Evaluation Research and Problem Analysis

Chapter 10: Evaluation Research and Problem Analysis

Introduction
- Evaluation research: refers to a research purpose rather than a specific method; seeks to evaluate the impact of interventions and determine whether some intended result was produced.
- Problem analysis: designed to help public officials choose among alternative future actions.
- Policy intervention: an action taken for the purpose of producing some intended result.
- Evidence-based policy: the actions of justice agencies are linked to evidence used for planning and evaluation.

The Policy Process
- Begins with a demand: support for a new course of action or opposition to an existing policy.
- Policymakers consider ultimate goals and the actions needed to achieve those goals.
- Outputs: the means used to achieve desired goals.
- Impacts: address the basic question of what a policy seeks to achieve; if some policy action is taken, then we expect some result.

Linking the Process to Evaluation
- Are policies being implemented as planned? Are policies achieving their intended goals?
- Evaluation seeks to link the intended actions and goals of policy to empirical evidence that:
  - Policies are being carried out as planned (process evaluation)
  - Policies are having the desired effects (impact assessment)
- The two are often conducted together.

Getting Started
- Learning policy goals is a key first step in doing evaluation research.
- Evaluability assessment: a "pre-evaluation" in which the researcher determines whether the requisite conditions are present:
  - Support from relevant organizations
  - What the goals and objectives are, and how they are translated into program components
  - What kinds of records or data are available
  - Who has a direct or indirect stake in the program

Problem Formulation and Measurement
- Different stakeholders often have different goals and views as to how a program should actually operate.
- Clearly specify program goals (desired outcomes) and create objectives (operationalized statements of those goals).
- Define and measure the target/beneficiary population; decide between using existing measures or creating new ones.
- Measure program contexts, outcomes, and program delivery.

Designs for Program Evaluation
- Randomized evaluation designs: avoid selection bias and allow the assumption that groups created by random assignment are statistically equivalent; may not be suitable when the agency or its staff makes exceptions to random assignment.
- Case flow: the process through which subjects are accumulated into experimental and control groups.
- Treatment integrity: whether an experimental intervention is delivered as intended (roughly analogous to reliability); threatened by midstream changes in the program.
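
To make the logic of random assignment concrete, here is a minimal sketch (not from the chapter; the case records, the prior-arrests variable, and the 50/50 assignment probability are illustrative assumptions). It randomly assigns cases to experimental and control groups as they flow in, then compares a baseline characteristic to check that the groups look statistically equivalent.

```python
import random
from statistics import mean

random.seed(42)  # reproducible assignment for the illustration

# Hypothetical case flow: each incoming case has a prior-arrests count
# we can use to check baseline equivalence after assignment.
cases = [{"case_id": i, "prior_arrests": random.randint(0, 6)} for i in range(200)]

experimental, control = [], []
for case in cases:
    # Random assignment: every eligible case has the same chance of receiving
    # the intervention, which is what removes selection bias.
    (experimental if random.random() < 0.5 else control).append(case)

# Baseline equivalence check: group means should be close before treatment.
print("n experimental:", len(experimental), " n control:", len(control))
print("mean prior arrests (experimental):",
      round(mean(c["prior_arrests"] for c in experimental), 2))
print("mean prior arrests (control):",
      round(mean(c["prior_arrests"] for c in control), 2))
```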

Conditions Requisite for Randomized Experiments
- Staff must accept random assignment and agree to minimize exceptions to randomization.
- Case flow must be adequate to produce enough subjects in each group that statistical tests can detect significant differences in outcome measures.
- Experimental interventions must be consistently applied to treatment groups and withheld from control groups.
- Groups must be equivalent prior to the intervention, and differences in outcome measures must be detectable after the intervention.
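
The "adequate case flow" condition is really a statistical power question. The sketch below is an illustration, not the chapter's analysis; the assumed recidivism rates (30% control vs. 20% treatment) and the 0.05 significance level are made-up parameters. It uses a simple Monte Carlo simulation to estimate how often a two-proportion z-test would detect that difference at several group sizes.

```python
import math
import random

random.seed(0)

def detects_difference(n_per_group, p_control=0.30, p_treatment=0.20):
    """Simulate one study; return True if a two-proportion z-test is significant at 0.05."""
    control_events = sum(random.random() < p_control for _ in range(n_per_group))
    treated_events = sum(random.random() < p_treatment for _ in range(n_per_group))
    p1, p2 = control_events / n_per_group, treated_events / n_per_group
    pooled = (control_events + treated_events) / (2 * n_per_group)
    se = math.sqrt(2 * pooled * (1 - pooled) / n_per_group)
    if se == 0:
        return False
    return abs((p1 - p2) / se) > 1.96  # two-sided critical value for alpha = 0.05

# Estimated power (chance of detecting the assumed difference) at several case flows.
for n in (50, 100, 200, 400):
    power = sum(detects_difference(n) for _ in range(2000)) / 2000
    print(f"n per group = {n:4d}  estimated power = {power:.2f}")
```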

Home Detention: Two Randomized Studies
- Both studies combined home detention with electronic monitoring (ELMO).
- The juvenile program paid less attention to delivering program elements and using ELMO information than the adult program did.
- It is difficult to maintain the desired level of control over experimental conditions, especially when more than one organization is involved.
- Randomization does not control for variation in treatment integrity and program delivery; other methods must be used for that.

Quasi-Experimental Designs
- No random assignment to experimental and control groups.
- Often "nested" in experimental designs as backups.
- Lack built-in controls for selection and other internal validity threats.
- The researcher must construct experimental and comparison groups that are as similar as possible.

Quasi-Experimental Designs (continued)
- Ex post evaluation: conducted after the experimental program has gone into effect.
- Full-coverage programs (e.g., sentencing guidelines) and larger treatment units (e.g., a neighborhood crime prevention program): situations where random assignment is not possible.
- Interrupted time-series designs: require attention to different issues because researchers cannot normally control how reliably the experimental treatment is actually implemented; threats include instrumentation, history, and construct validity.
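
An interrupted time-series analysis is usually estimated as a segmented regression that lets the level (and sometimes the slope) of the series change at the point where the full-coverage policy takes effect. The sketch below is illustrative only: the simulated monthly counts, the intervention month, and the use of ordinary least squares are assumptions, not the chapter's data.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical monthly offense counts: 36 months pre-intervention, 24 post,
# with an assumed drop of about 15 offenses per month after the policy starts.
months = np.arange(60)
post = (months >= 36).astype(float)   # 1 after the intervention month
trend = 200 - 0.5 * months            # gradual pre-existing decline
counts = trend - 15 * post + rng.normal(0, 8, size=60)

# Segmented regression: intercept, underlying trend, and post-intervention level shift.
X = np.column_stack([np.ones_like(months, dtype=float), months, post])
coef, *_ = np.linalg.lstsq(X, counts, rcond=None)
intercept, slope, level_shift = coef

print(f"estimated monthly trend:           {slope:6.2f}")
print(f"estimated post-policy level shift: {level_shift:6.2f}")
```

A level shift estimated this way is still vulnerable to the threats named above (for example, a history effect such as another program starting in the same month), which is why these designs require extra attention.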

Problem Analysis and Scientific Realism
- Problem analysis, coupled with scientific realism, helps public officials use research to select and assess alternative courses of action.
- Realists suggest that similar interventions will have different outcomes in different contexts.
- Evaluators should search for mechanisms (independent variables) acting in context (assorted intervening variables) to explain outcomes (dependent variables).
- This approach is appropriate for small-scale evaluations directed toward solving a particular problem in a specific context.

Problem Analysis and Problem-Oriented Policing
- Problem solving: a fundamental tool in problem-oriented policing.
- How-to-do-it guides: general guides to crime analysis in support of problem-oriented policing.
- Problem and response guides: describe how to analyze very specific types of problems and which responses are known to be effective or ineffective.

Auto Theft in Chula Vista
- Nanci Plouffe and Rana Sampson (2004) began their analysis of vehicle theft by comparing Chula Vista to other southern California cities; theft rates tended to be higher for cities closer to the border.
- Ten parking lots accounted for 25% of thefts and 20% of break-ins in the city; 6 of the 10 lots were among the top 10 calls-for-service locations in Chula Vista.
- Auto theft hot spots also tended to be hot spots for other kinds of incidents.
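
The finding that 10 parking lots accounted for roughly a quarter of thefts is a concentration (hot spot) analysis. A minimal sketch of that calculation follows; the incident records and location names are invented for illustration, not drawn from the Chula Vista data.

```python
from collections import Counter

# Hypothetical incident records: (location, incident_type)
incidents = [
    ("Lot A", "theft"), ("Lot A", "break-in"), ("Lot B", "theft"),
    ("Lot A", "theft"), ("Lot C", "theft"), ("Street 5", "theft"),
    ("Lot B", "break-in"), ("Lot A", "theft"), ("Street 9", "theft"),
    ("Lot C", "theft"), ("Lot B", "theft"), ("Street 2", "break-in"),
]

thefts = [loc for loc, kind in incidents if kind == "theft"]
by_location = Counter(thefts)

top_k = 2  # in the Chula Vista analysis this was the top 10 parking lots
top_locations = [loc for loc, _ in by_location.most_common(top_k)]
share = sum(by_location[loc] for loc in top_locations) / len(thefts)

print("top locations:", top_locations)
print(f"share of all thefts at the top {top_k} locations: {share:.0%}")
```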

Other Applications of Policy Analysis
- Space- and time-based analysis: increasingly common because of technological advances.
- Crime maps usually represent at least four different things: one or more crime types; space or area; some time period; and some dimension of land use, usually streets.
- Problem-solving tools and processes
- Strategic Approaches to Community Safety Initiatives (SACSI)
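
Those four dimensions (crime type, area, time period, land use) translate directly into how crime-mapping data are usually structured. The sketch below uses hypothetical records and field names, not a real GIS layer, to show one "map layer" per crime type and month, with incident counts per area.

```python
from collections import defaultdict

# Hypothetical geocoded incidents: crime type, area (block group), month, land use.
incidents = [
    {"type": "auto theft", "area": "BG-101", "month": "1998-03", "land_use": "commercial"},
    {"type": "auto theft", "area": "BG-101", "month": "1998-03", "land_use": "commercial"},
    {"type": "burglary",   "area": "BG-102", "month": "1998-03", "land_use": "residential"},
    {"type": "auto theft", "area": "BG-102", "month": "1998-04", "land_use": "residential"},
]

# One "map layer" per (crime type, month); each cell counts incidents in an area.
layers = defaultdict(lambda: defaultdict(int))
for inc in incidents:
    layers[(inc["type"], inc["month"])][(inc["area"], inc["land_use"])] += 1

for (crime_type, month), cells in layers.items():
    print(crime_type, month)
    for (area, land_use), count in cells.items():
        print(f"  {area} ({land_use}): {count}")
```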

[Figure: maps of block groups with > 13% abandoned buildings, 1995-1998]

Political Context of Applied Research
- Different stakeholder interests can produce conflicting perspectives on evaluations.
- The researcher must identify stakeholders and their perspectives, educate stakeholders on why the evaluation should be conducted, and explain that applied research is used to determine what works and what does not.
- Political concerns and ideology may color an evaluation; researchers should be alert to this.

Why Results Are Ignored
- Implications may not be presented in a way that non-researchers can understand.
- Results sometimes contradict deeply held beliefs.
- Stakeholders may have a vested interest in a program.