
1 Agenda:
- Zinc recap
- Where are we?
- Outcomes and comparisons
- Intimate Partner Violence Judicial Oversight Evaluation
- Projects: Program Theory and Evaluation
- Questions

2 Zinc lessons about process evaluation:
- We need description, data, and analysis to understand what activities took place
- To understand why a program works or doesn’t work, we need to connect activities to outcomes
- Process evaluation assesses early causal links in program theory

3 Next stop: Impact Evaluation

4 Comparison:
- Need to disentangle the effects of the program from other influences
- Need comparisons for both impact and process evaluation
- We want to know the outcomes or activities for the program group IF they had not received the program (the “counterfactual”)
- Use a comparison group as similar as possible to the intervention group (see the sketch below)
- Experimental comparison groups are randomized (by individual or other unit)
- Quasi-experimental research compares to a non-random group without the intervention
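A minimal sketch of the counterfactual logic above, in Python with entirely hypothetical outcome scores: the impact estimate is the difference in mean outcomes between the program group and a comparison group standing in for the counterfactual. The group names and numbers are illustrative, not from any evaluation discussed here.

# Hypothetical data: estimating impact as the difference in mean
# outcomes between the program group and a comparison group that
# stands in for the counterfactual.
from statistics import mean

program_outcomes = [72, 68, 75, 70, 74]     # hypothetical scores for program participants
comparison_outcomes = [65, 66, 63, 68, 64]  # hypothetical scores for a similar non-program group

impact_estimate = mean(program_outcomes) - mean(comparison_outcomes)
print(f"Estimated impact: {impact_estimate:.1f}")
# The estimate is credible only to the extent that the comparison group
# resembles the program group on everything except receiving the program.

Randomized (experimental) assignment makes the two groups similar by design; quasi-experimental comparisons have to argue for that similarity.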

5 Comparisons may be:
- over time
- across intervention groups with and without the program, or with different levels of intervention (“dosage”)

6 Impact here!

7 What types of comparisons were used for these impact evaluations?
- School-based mentoring program
- Jamaica PATH program
- Intimate Partner Violence Judicial Oversight Demonstration

8 Outcomes reflect more than program influence:
- Characteristics of participants
- Other pre-existing differences (e.g., other services)
- Changes over time (maturation)
- Changes in social or economic context
- Impact of the evaluation itself (e.g., testing, instrumentation)

9 Properties of Comparisons:
- Internal validity: Does the measured impact reflect only the contribution of the program?
- External validity: Does the measured impact reflect what we could expect from a similar program elsewhere?

10 Outcomes:
- Capture key conceptual results:
  – How would you know if the program worked?
  – What would you see? What would change for participants?
- Include both shorter-run and longer-run outcomes
- Come from program theory, previous evaluations/research, and stakeholders
Indicators: Measures of the outcomes
- Use multiple indicators to:
  – Triangulate findings
  – Assess multiple dimensions of the outcome
  – Prevent distortion from relying on a single measure
- Come from available data, previous research and evaluation, theory of how best to capture the outcome, and stakeholders

11 Properties of Indicators:
- Validity: Does the indicator measure what we say it does?
- Reliability: Does the indicator give the same reading when the outcome is the same (with different evaluators, or over time with no change in the outcome)? (see the sketch below)
- Sensitivity: Is the indicator able to detect changes due to our program?
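A small, hypothetical sketch of one common reliability check: take two readings of the same indicator when the underlying outcome has not changed and correlate them. The ratings below are invented for illustration, not data from any program discussed here.

# Hypothetical test-retest reliability check: the same indicator is
# measured twice with no real change in the outcome; a correlation
# near 1.0 suggests the indicator is reliable.
from statistics import correlation  # available in Python 3.10+

first_reading = [3.0, 4.5, 2.0, 5.0, 3.5, 4.0]
second_reading = [3.2, 4.4, 2.1, 4.8, 3.6, 4.1]

r = correlation(first_reading, second_reading)
print(f"Test-retest correlation: {r:.2f}")

Validity and sensitivity generally require different evidence, such as comparison against a trusted measure or observed change when the program is known to have had an effect.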

12 Judicial Oversight Demonstration:
- Coordinated response to intimate partner violence
- Program theory
- Outcomes and Indicators
- Comparison

13 Judicial Oversight Demonstration: Program Theory
- Collaboration between law enforcement, courts, and service providers
- Consistent responses to IPV offenses
- Coordinated victim advocacy and services
- Offender accountability and oversight
These lead to the goals of victim safety, offender accountability, and reduced repeat IPV

14 Judicial Oversight Demonstration: Comparisons to assess outcomes
- Three sites chosen based on capacity and interest in the demonstration (MA, MI, and WI)
- Goals of the evaluation:
  – Test impact
  – Learn from implementation analysis
- Two sites: JOD cases compared to cases in a similar comparison county
- One site: JOD cases compared to cases prior to JOD (both designs are sketched below)
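A hypothetical sketch of the two comparison designs just listed, using an invented binary re-arrest indicator (1 = re-arrested for IPV). The numbers are made up to show the mechanics of each comparison, not to reproduce JOD results.

# Hypothetical data illustrating the two designs:
# (a) JOD cases vs. cases in a similar comparison county
# (b) JOD cases vs. cases handled before JOD in the same site
from statistics import mean

jod_rearrest = [0, 1, 0, 0, 1, 0, 0, 0]       # hypothetical JOD cases
comparison_county = [1, 0, 1, 0, 1, 0, 1, 0]  # hypothetical comparison-county cases
pre_jod_same_site = [1, 1, 0, 1, 0, 0, 1, 0]  # hypothetical pre-JOD cases, same site

print(f"(a) JOD vs. comparison county: {mean(jod_rearrest) - mean(comparison_county):+.2f}")
print(f"(b) JOD vs. pre-JOD period:    {mean(jod_rearrest) - mean(pre_jod_same_site):+.2f}")
# Each design guards against different threats: (a) holds the time period
# constant, (b) holds the county constant; neither is randomized.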

15 Are JOD samples similar to comparisons?

16 Judicial Oversight Demonstration: Research tools
IMPACT (in JOD and comparison sites):
– Interviews with victims and offenders (in person, via computer), early and 9 months later
– Administrative data from courts, the probation system, and service providers
IMPLEMENTATION:
– Site observation and interviews with service providers
– Focus groups with victims and offenders

17

18 Judicial Oversight Demonstration: Outcomes and Indicators
Victim Safety
– Perceptions of quality of treatment and safety/well-being
  - Satisfaction ratings on surveys
  - Safety and well-being rating on the survey
  - Number of contacts with probation officials
– Safety
  - Victim reports of new IPV
  - % of offenders re-arrested for IPV
Offender Accountability
– Accountability
  - Number of probation requirements
  - % convicted and sentenced
  - % with a public defender or defense attorney
  - % reported to Batterer Intervention Programs by follow-up
– Perceptions
  - Rating of clarity of the legal process
  - Rating of fairness of judges and probation officials
  - Rating of perceived certainty and severity of future penalties
Recidivism
– Victim reports of new IPV
– % of offenders re-arrested for IPV

19 Example comparison in outcomes:

20 Judicial Oversight Demonstration: Results
- JOD is feasible and may benefit the justice system
- JOD did not increase victim perceptions of safety or well-being
- JOD increased offender accountability, but not key perceptions related to future offending
- JOD reduced repeat IPV where offenders were incarcerated
- Batterer intervention programs did not reduce IPV

21 Judicial Oversight Demonstration: What can we say about internal validity? What can we say about external validity?

22 Projects:
- What does your program do?
- By what pathways do/will those activities affect outcomes?
- What questions will your evaluation address?
Your job is to help each project in your group move forward.

