Research Design and Outcomes


Research Design and Outcomes PBAF 526

Today
- Zinc recap
- Where are we?
- Outcomes and comparisons
- Intimate Partner Violence Judicial Oversight Evaluation
- Projects: Program Theory and Evaluation Questions

Zinc lessons about process evaluation
- We need description, data, and analysis to understand what activities took place
- To understand why a program works or doesn't work, we need to connect activities to outcomes
- Process evaluation assesses the early causal links in program theory

Next Stop: Impact Evaluation
Start on the left: what outcomes do we want, and what should we do? Then: what are we doing? Finally: what did we accomplish, and what was the impact?

Comparison: We need to disentangle the effects of the program from other influences
- Comparisons are needed for both impact and process evaluation
- We want to know the outcomes or activities of the program group IF they had not received the program: the "counterfactual"
- Use a comparison group as similar as possible to the intervention group
- Experimental comparison groups are randomized (by individual or other unit)
- Quasi-experimental research compares to a non-random group without the intervention
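Why randomization approximates the counterfactual can be shown with a quick simulation. This is a hedged sketch with made-up numbers (it is not part of the JOD evaluation): because a coin flip decides who is treated, independent of every participant characteristic, the control group's average outcome stands in for the counterfactual, and the treatment-control difference in means recovers the true effect.

```python
import random

random.seed(42)
TRUE_EFFECT = 2.0  # hypothetical program impact, chosen for illustration

# Each person has an unobserved baseline propensity for the outcome.
baselines = [random.gauss(10, 3) for _ in range(10_000)]

# Random assignment: a coin flip, independent of baseline characteristics.
assignment = [random.random() < 0.5 for _ in baselines]

# Observed outcome = baseline, plus the effect if (and only if) treated.
outcomes = [b + (TRUE_EFFECT if t else 0.0)
            for b, t in zip(baselines, assignment)]

treated = [y for y, t in zip(outcomes, assignment) if t]
control = [y for y, t in zip(outcomes, assignment) if not t]

# The control mean stands in for the counterfactual; the difference in
# means estimates the program's impact.
impact_estimate = sum(treated) / len(treated) - sum(control) / len(control)
print(round(impact_estimate, 2))  # close to TRUE_EFFECT with a large sample
```

With a quasi-experimental (non-random) comparison group, the same difference in means would also absorb any pre-existing differences between the groups, which is exactly the selection problem randomization removes.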

Comparisons may be:
- over time
- across intervention groups: with and without the program, and across levels of intervention ("dosage")

What types of comparisons were used for these impact evaluations?
- School-based mentoring program
- Jamaica PATH program
- Zinc in Nepal
- Intimate Partner Violence Judicial Oversight Demonstration

Outcomes reflect more than program influence
- Characteristics of participants
- Other pre-existing differences (e.g., other services)
- Changes over time (maturation)
- Changes in social or economic context
- Impact of the evaluation itself (e.g., testing, instrumentation)

Properties of Comparisons
- Internal validity: Does the measured impact reflect only the contribution of the program?
- External validity: Does the measured impact reflect what we could expect from a similar program elsewhere?

Outcomes
- Capture key conceptual results: How would you know if the program worked? What would you see? What would change for participants?
- Include both shorter-run and longer-run outcomes
- Come from program theory, previous evaluations/research, and stakeholders

Indicators
- Measures of the outcomes
- Use multiple indicators to: triangulate findings; assess multiple dimensions of an outcome; prevent distortion of a single measure
- Come from available data, previous research and evaluation, theory of how best to capture the outcome, and stakeholders

Properties of Indicators
- Validity: Does the indicator measure what we say it does?
- Reliability: Does the indicator give the same reading when the outcome is the same (with different evaluators, or over time with no change in the outcome)?
- Sensitivity: Can the indicator detect changes due to our program?
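Reliability can be checked directly when the data permit. The sketch below uses hypothetical ratings (not JOD data) to compute simple percent agreement between two evaluators scoring the same cases; a reliable indicator should yield similar readings from both.

```python
# Hypothetical ratings of the same 8 cases by two evaluators (1-5 scale).
rater_a = [3, 4, 2, 5, 4, 3, 2, 4]
rater_b = [3, 4, 2, 4, 4, 3, 2, 4]

# Percent agreement: a crude reliability measure (more formal options,
# such as Cohen's kappa, also correct for chance agreement).
agreement = sum(a == b for a, b in zip(rater_a, rater_b)) / len(rater_a)
print(agreement)  # 0.875
```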

Judicial Oversight Demonstration: Coordinated response to intimate partner violence
- Program theory
- Outcomes and indicators
- Comparison

Judicial Oversight Demonstration: Program Theory
- Collaboration between law enforcement, courts, and service providers
- Consistent responses to IPV offenses
- Coordinated victim advocacy and services
- Offender accountability and oversight
These lead to the goals of victim safety, accountability for offenders, and reduced repeat IPV.

Judicial Oversight Demonstration: Comparisons to assess outcomes
- Three sites chosen based on capacity and interest in the demonstration (MA, MI, and WI)
- Goals of the evaluation: test impact; learn from implementation analysis
- Two sites compared JOD cases to cases in a similar comparison county
- One site compared JOD cases to cases prior to JOD

Are JOD samples similar to comparisons?

Judicial Oversight Demonstration: Research tools
IMPACT (in JOD and comparison sites):
- Interviews with victims and offenders, conducted in person via computer, early on and 9 months later
- Administrative data from courts, the probation system, and service providers
IMPLEMENTATION:
- Site observation and interviews with service providers
- Focus groups with victims and offenders

Judicial Oversight Demonstration: Outcomes and Indicators

Victim Safety
- Perceptions of quality of treatment and safety/well-being
  - Satisfaction ratings on surveys
  - Safety and well-being rating on survey
  - Number of contacts with probation officials
- Safety (recidivism)
  - Victim reports of new IPV
  - % of offenders re-arrested for IPV

Offender Accountability
- Accountability
  - Number of probation requirements
  - % convicted and sentenced
  - % with public defender or defense attorney
  - % reported to Batterer Intervention Programs by follow-up
- Perceptions
  - Rating of clarity of the legal process
  - Rating of fairness of judges and probation officials
  - Rating of perceived certainty and severity of future penalties

Example comparison in outcomes: chi-square tests compare the JOD sample to the comparison sample
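A comparison like this typically reduces to a 2x2 table, e.g. re-arrest (yes/no) by group (JOD vs. comparison), and the chi-square statistic can be computed by hand. The counts below are hypothetical illustrations, not the study's actual figures.

```python
def chi_square_2x2(a, b, c, d):
    """Chi-square statistic for the 2x2 table [[a, b], [c, d]]."""
    n = a + b + c + d
    row1, row2 = a + b, c + d
    col1, col2 = a + c, b + d
    observed = [a, b, c, d]
    # Expected count in each cell under independence: row total * column
    # total / grand total.
    expected = [row1 * col1 / n, row1 * col2 / n,
                row2 * col1 / n, row2 * col2 / n]
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

# Hypothetical counts: 40 of 200 JOD offenders re-arrested vs. 60 of 200
# comparison offenders.
stat = chi_square_2x2(40, 160, 60, 140)
print(round(stat, 3))  # 5.333, above the 3.84 critical value (df = 1, p = .05)
```

A statistic above the critical value would lead us to reject the hypothesis that re-arrest rates are the same in the two groups; with a non-random comparison group, that difference still needs the internal-validity scrutiny discussed above.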

Judicial Oversight Demonstration: Results
- JOD is feasible and may benefit the justice system
- JOD did not increase victim perceptions of safety or well-being
- JOD increased offender accountability, but not the key perceptions related to future offending
- JOD reduced repeat IPV where offenders were incarcerated
- Batterer intervention programs did not reduce IPV

Judicial Oversight Demonstration What can we say about internal validity? What can we say about external validity?

Projects
- What does your program do?
- By what pathways do (or will) those activities affect outcomes?
- What questions will your evaluation address?
Your job is to help each project in your group move forward.