
Agenda:
- Zinc recap
- Where are we?
- Outcomes and comparisons
- Intimate Partner Violence: Judicial Oversight Evaluation
- Projects: Program Theory and Evaluation Questions

Zinc lessons about process evaluation:
- We need description, data, and analysis to understand what activities took place
- To understand why a program works or doesn't work, we need to connect activities to outcomes
- Process evaluation assesses the early causal links in program theory

Next stop: Impact Evaluation

Comparison:
- We need to disentangle the effects of the program from other influences
- Comparisons are needed for both impact and process evaluation
- We want to know the outcomes or activities of the program group IF they had not received the program: the "counterfactual"
- Use a comparison group as similar as possible to the intervention group
- Experimental comparison groups are randomized (by individual or other unit)
- Quasi-experimental research compares to a non-random group without the intervention
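To make the counterfactual logic concrete, here is a minimal simulation sketch (not from the lecture; the population, the 10-point effect, and the selection rule are all invented). It contrasts a randomized comparison group with a self-selected one: randomization balances baseline risk across groups, so the treatment-comparison difference recovers the true effect, while self-selection mixes the effect with pre-existing differences.

```python
# Hypothetical illustration: why randomization supports causal claims.
import random

random.seed(0)

def simulate_randomized(n=10_000, effect=-0.10):
    # Each person has a baseline risk of the bad outcome (e.g., re-offense).
    treated, control = [], []
    for _ in range(n):
        risk = random.uniform(0.2, 0.8)
        if random.random() < 0.5:          # coin-flip assignment
            treated.append(risk + effect)  # program lowers risk by `effect`
        else:
            control.append(risk)
    return sum(treated) / len(treated) - sum(control) / len(control)

def simulate_self_selected(n=10_000, effect=-0.10):
    # Quasi-experimental pitfall: higher-risk people enroll more often,
    # so the raw gap mixes the program effect with baseline differences.
    treated, control = [], []
    for _ in range(n):
        risk = random.uniform(0.2, 0.8)
        if random.random() < risk:         # selection on baseline risk
            treated.append(risk + effect)
        else:
            control.append(risk)
    return sum(treated) / len(treated) - sum(control) / len(control)

print(f"Randomized comparison:    {simulate_randomized():+.3f}")    # close to -0.10
print(f"Self-selected comparison: {simulate_self_selected():+.3f}") # badly biased
```

With coin-flip assignment the two groups have the same expected baseline risk; with selection on risk, the treated group starts out worse off and the naive difference can even flip sign.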

Comparisons may be:
- over time
- across intervention groups with and without the program, or with different levels of intervention ("dosage")
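As a worked illustration of combining these two comparison types, here is a difference-in-differences sketch. All numbers are hypothetical, not from any study:

```python
# Hypothetical average outcome (e.g., re-offense rate) before and after
# the program, for a program group and a comparison group.
program = {"before": 0.40, "after": 0.25}
comparison = {"before": 0.42, "after": 0.35}

change_program = program["after"] - program["before"]          # -0.15
change_comparison = comparison["after"] - comparison["before"] # -0.07

# Subtracting the comparison group's change removes trends that would
# have happened anyway (maturation, changing context), leaving an
# estimate of the program effect.
impact = change_program - change_comparison
print(f"Estimated impact: {impact:+.2f}")  # -0.08
```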

Impact here!

What types of comparisons were used for these impact evaluations?
- School-based mentoring program
- Jamaica PATH program
- Intimate Partner Violence Judicial Oversight Demonstration

Outcomes reflect more than program influence:
- Characteristics of participants
- Other pre-existing differences (e.g., other services)
- Changes over time (maturation)
- Changes in social or economic context
- Impact of the evaluation itself (e.g., testing effects, instrumentation)

Properties of Comparisons:
- Internal Validity: Does the measured impact reflect only the contribution of the program?
- External Validity: Does the measured impact reflect what we could expect from a similar program elsewhere?

Outcomes:
- Capture key conceptual results:
  - How would you know if the program worked?
  - What would you see? What would change for participants?
- Include both shorter-run and longer-run outcomes
- Come from program theory, previous evaluations/research, and stakeholders

Indicators:
- Measures of the outcomes
- Use multiple indicators to:
  - Triangulate findings
  - Assess multiple dimensions of an outcome
  - Prevent distortion from relying on a single measure
- Come from available data, previous research and evaluation, theory of how best to capture the outcome, and stakeholders
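A small sketch of what "multiple indicators per outcome" can look like in practice. The outcome name, indicator names, and values below are hypothetical; the z-score step simply puts differently scaled measures on a common footing so they can be compared or combined:

```python
# Hypothetical mapping of one conceptual outcome to several indicators.
from statistics import mean, stdev

indicators = {
    "victim_safety": {
        "self_reported_safety_rating": [3.1, 4.0, 2.5, 3.8],  # survey, 1-5 scale
        "new_ipv_reports": [0, 1, 0, 0],                      # administrative data
        "re_arrest_for_ipv": [0, 0, 1, 0],                    # court records
    }
}

def standardize(values):
    """Convert raw values to z-scores so indicators on different
    scales can be compared or averaged into a composite."""
    m, s = mean(values), stdev(values)
    return [(v - m) / s for v in values]

for outcome, measures in indicators.items():
    print(outcome)
    for name, values in measures.items():
        print(f"  {name}: z-scores {[round(z, 2) for z in standardize(values)]}")
```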

Properties of Indicators:
- Validity: Does the indicator measure what we say it does?
- Reliability: Does the indicator give the same reading when the outcome is the same (with different evaluators, or over time with no change in the outcome)?
- Sensitivity: Can the indicator detect changes due to our program?
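Two of these properties lend themselves to quick checks. A hedged sketch with simulated ratings: reliability assessed as agreement between two evaluators scoring the same cases, and sensitivity framed as the minimum detectable effect for a given sample size (the 2.8 multiplier is the standard rule of thumb for 80% power with a 5% two-sided test):

```python
# Simulated reliability and sensitivity checks for an indicator.
from math import sqrt
from statistics import mean

# Reliability: the same cases rated by two evaluators; a reliable
# indicator should correlate highly across raters.
rater_a = [3, 4, 2, 5, 3, 4, 2, 5]
rater_b = [3, 4, 3, 5, 2, 4, 2, 4]

def pearson(x, y):
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    return cov / sqrt(sum((a - mx) ** 2 for a in x) *
                      sum((b - my) ** 2 for b in y))

print(f"Inter-rater reliability (r): {pearson(rater_a, rater_b):.2f}")

# Sensitivity: smallest effect a two-group design can reliably detect,
# in standard-deviation units (80% power, 5% two-sided significance).
def minimum_detectable_effect(n_per_group):
    return 2.8 * sqrt(2 / n_per_group)

for n in (50, 200, 1000):
    print(f"n={n:>4} per group -> MDE ~ {minimum_detectable_effect(n):.2f} SD")
```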

Judicial Oversight Demonstration: A coordinated response to intimate partner violence
- Program theory
- Outcomes and indicators
- Comparison

Judicial Oversight Demonstration: Program Theory
- Collaboration between law enforcement, courts, and service providers
- Consistent responses to IPV offenses
- Coordinated victim advocacy and services
- Offender accountability and oversight
These lead to the goals of victim safety, offender accountability, and reduced repeat IPV.

Judicial Oversight Demonstration: Comparisons to assess outcomes
- Three sites chosen based on capacity and interest in the demonstration (MA, MI, and WI)
- Goals of the evaluation:
  - Test impact
  - Learn from implementation analysis
- Two sites: JOD cases compared to cases in a similar comparison county
- One site: JOD cases compared to cases prior to JOD

Are JOD samples similar to comparisons?

Judicial Oversight Demonstration: Research tools
- IMPACT (in JOD and comparison sites):
  - Interviews with victims and offenders, conducted in person via computer, early on and 9 months later
  - Administrative data from courts, the probation system, and service providers
- IMPLEMENTATION:
  - Site observation and interviews with service providers
  - Focus groups with victims and offenders

Judicial Oversight Demonstration: Outcomes and Indicators
- Victim Safety
  - Perceptions of quality of treatment and safety/well-being:
    - Satisfaction ratings on surveys
    - Safety and well-being ratings on surveys
    - Number of contacts with probation officials
  - Safety:
    - Victim reports of new IPV
    - % of offenders re-arrested for IPV
- Offender Accountability
  - Accountability:
    - Number of probation requirements
    - % convicted and sentenced
    - % with a public defender or defense attorney
    - % reported to Batterer Intervention Programs by follow-up
  - Perceptions:
    - Rating of clarity of the legal process
    - Rating of fairness of judges and probation officials
    - Rating of perceived certainty and severity of future penalties
- Recidivism
  - Victim reports of new IPV
  - % of offenders re-arrested for IPV
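For a single indicator such as "% of offenders re-arrested for IPV", the JOD-versus-comparison contrast boils down to a difference in proportions. A minimal sketch with hypothetical counts (not JOD's actual data):

```python
# Hypothetical comparison of re-arrest rates between a program site
# and its comparison site, with a two-proportion z-test.
from math import sqrt, erf

def two_proportion_z(x1, n1, x2, n2):
    """Difference in proportions with a pooled two-sided z-test."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return p1 - p2, z, p_value

# 48 of 300 program cases vs. 72 of 300 comparison cases re-arrested.
diff, z, p = two_proportion_z(x1=48, n1=300, x2=72, n2=300)
print(f"Re-arrest rate difference: {diff:+.1%} (z={z:.2f}, p={p:.3f})")
```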

Example comparison in outcomes:

Judicial Oversight Demonstration: Results
- JOD is feasible and may benefit the justice system
- JOD did not increase victim perceptions of safety or well-being
- JOD increased offender accountability, but not key perceptions related to future offending
- JOD reduced repeat IPV where offenders were incarcerated
- Batterer intervention programs did not reduce IPV

Judicial Oversight Demonstration:
- What can we say about internal validity?
- What can we say about external validity?

Projects:
- What does your program do?
- By what pathways do/will those activities affect outcomes?
- What questions will your evaluation address?
Your job is to help each project in your group move forward.