INTRODUCTION TO EVIDENCE-BASED PRACTICES (EBPs)
Prepared by the Justice Research and Statistics Association

History of EBPs
 Mid-1800s: Scientific methods first used to establish the efficacy of medical treatments
 1938: Federal Food, Drug, and Cosmetic (FDC) Act required that the safety of new drugs be scientifically demonstrated
 1962: FDC Act amended to require demonstrated efficacy as well as safety
 1976: Office of Technology Assessment report found that few medical procedures were supported by clinical trials, sparking the modern EBP movement in medicine

History of EBPs in Criminal Justice
 1975: Robert Martinson and colleagues concluded that “nothing works” in corrections because there was insufficient scientific evidence supporting correctional interventions. Their claim led to discussion of, and research on, how to demonstrate effectiveness in criminal justice programming.
 1996: Congress required a “comprehensive evaluation of the effectiveness” of Department of Justice crime prevention grants. The resulting report by Dr. Lawrence Sherman and colleagues was an early effort to identify EBPs in criminal justice by reviewing research and evaluation studies.

Where Does Evidence Come From?
 Two key elements of the Office of Justice Programs’ (OJP) definition of “evidence-based” programs and practices:
  Effectiveness has been demonstrated by causal evidence, generally obtained through high-quality outcome evaluations.
  Causal evidence depends on the use of scientific methods to rule out, to the extent possible, alternative explanations for the documented change.

Why Focus on EBPs?
 Without evidence of effectiveness, we cannot ensure that resources are being used properly:
  Money may be wasted on ineffective interventions
  Opportunities to change lives (of victims and offenders) may be missed
  Some non-evidence-based interventions may actually cause harm (e.g., increase recidivism)

What About Innovation?
 An evidence-based approach still leaves room for new, untested programs, provided that:
  Programs are grounded in theory or evidence about “what works” in a particular area
  Programs incorporate “logic models” that identify program goals and objectives and indicate how program activities will lead to them (a minimal sketch of a logic model follows this list)
  Resources are available to evaluate new programs
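As a rough illustration, a logic model can be written down as a simple data structure that forces each activity and objective to trace back to a goal. The Python sketch below is a minimal example; the program name, goals, activities, and measures are hypothetical and not drawn from this presentation.

    # A hypothetical program logic model captured as a data structure.
    logic_model = {
        "goal": "Reduce recidivism among high-risk probationers",
        "objectives": [
            "Enroll 100 high-risk probationers in cognitive-behavioral treatment",
            "Achieve a 70% program completion rate",
        ],
        "activities": ["Risk/needs assessments", "CBT group sessions", "Case management"],
        "outputs": ["Participants enrolled", "Sessions delivered"],
        "outcomes": ["Lower 12-month rearrest rate than a comparison group"],
    }

    # Making the chain explicit shows how program activities are expected
    # to lead to the program's goals and objectives.
    for objective in logic_model["objectives"]:
        print(f"{objective} -> {logic_model['goal']}")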

What is Effectiveness?
 Reducing crime (policing interventions)
 Reducing recidivism (correctional interventions)
 Reducing victimization/revictimization (prevention and victim-based interventions)

What are Scientific Methods?
 Scientific evidence is:
  Objective: observable by others, based on facts, and free of bias or prejudice
  Replicable: can be observed by others using the same methods that were used to produce the original evidence
  Generalizable: applicable to individuals and circumstances beyond those used to produce the original evidence

Randomized Controlled Trials (RCTs)
 Compare a group that receives a treatment/intervention (the experimental group) with a group that does not (the control group).
 To attribute observed outcomes to the intervention, the two groups must be equivalent.
 The best way to ensure equivalency is to randomly assign individuals to the two groups; a study that does so is a randomized controlled trial.

RCT Example: Drug Court Assessment
 All offenders eligible for drug treatment are randomly assigned to one of two groups:
  TREATMENT group: offenders assigned to drug court
  CONTROL group: offenders assigned to traditional criminal court
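A minimal sketch of how the random assignment above might be carried out in Python; the offender IDs, group size, and random seed are hypothetical.

    # Randomly assign eligible offenders to drug court (treatment) or
    # traditional criminal court (control). IDs here are placeholders.
    import random

    eligible = [f"offender_{i}" for i in range(1, 201)]

    rng = random.Random(42)  # fixed seed makes the assignment reproducible/auditable
    rng.shuffle(eligible)

    midpoint = len(eligible) // 2
    treatment = eligible[:midpoint]  # assigned to drug court
    control = eligible[midpoint:]    # assigned to traditional criminal court

    # Random assignment makes the groups equivalent on average, so later
    # differences in outcomes can be attributed to the drug court itself.
    print(len(treatment), len(control))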

Quasi-Experiments
 Quasi-experimental designs can be used to control for some group differences.
 Example: using a “wait list” of eligible program participants as a comparison group for the treatment group (a sketch of such a comparison follows).
 Because they do not involve random assignment, quasi-experiments are not as powerful as RCTs: group differences other than the intervention might affect outcomes.
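One way such a comparison might be analyzed is with a simple two-proportion test of rearrest rates, as sketched below. All counts are invented for illustration, and the statsmodels function is one common choice, not a requirement of the design.

    # Compare rearrest rates between a treatment group and a wait-list
    # comparison group. Counts are invented for illustration.
    from statsmodels.stats.proportion import proportions_ztest

    rearrests = [18, 31]      # rearrested in [treatment, wait list]
    group_sizes = [100, 100]  # offenders in each group

    z_stat, p_value = proportions_ztest(rearrests, group_sizes)
    print(f"z = {z_stat:.2f}, p = {p_value:.3f}")

    # Caveat: even a significant difference may reflect pre-existing group
    # differences (e.g., motivation), because assignment was not random.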

Non-Experiments
 Do not involve comparisons between groups.
 Example: assessing a rape awareness campaign by measuring the knowledge of women in the community at the end of the campaign.
 Evidence of effectiveness is weak: factors other than the campaign might have produced the women’s knowledge.

What is Not Scientific Evidence?
 Scientific evidence does not include:
  Opinions
  Testimonials
  Anecdotes
 Example: positive attitudes about a program among staff or participants ≠ evidence of effectiveness.

Systematic Reviews and Meta-Analysis
 Systematic review: experts examine a large number of studies, using standardized criteria to assess effectiveness.
 Meta-analysis: a statistical method that combines the results of multiple evaluations to determine whether, taken together, they show positive program outcomes.
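As a worked illustration, the sketch below pools several evaluations using a fixed-effect, inverse-variance meta-analysis, one standard approach; the effect sizes and standard errors are invented.

    # Fixed-effect, inverse-variance meta-analysis of hypothetical evaluations.
    import math

    # (effect size, standard error); negative values mean less recidivism
    studies = [(-0.20, 0.10), (-0.05, 0.08), (-0.15, 0.12)]

    weights = [1 / se**2 for _, se in studies]  # weight each study by 1/variance
    pooled = sum(w * es for (es, _), w in zip(studies, weights)) / sum(weights)
    pooled_se = math.sqrt(1 / sum(weights))

    print(f"pooled effect = {pooled:.3f} (SE = {pooled_se:.3f})")
    # A pooled estimate across studies is stronger evidence than any single
    # evaluation, since chance findings tend to wash out.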

Key Resources for Identifying EBPs
 OJP’s CrimeSolutions.gov: rates 270 programs as “effective,” “promising,” or “no evidence”
 OJJDP’s Model Programs Guide: rates over 200 juvenile justice programs as “exemplary,” “effective,” or “promising”
 Both are based on expert reviews using standardized criteria.

Key Resources (cont’d)
 What Works in Reentry Clearinghouse
  A BJA-funded initiative maintained by the Council of State Governments
  56 reentry initiatives rated by experts using standardized coding instruments:
   Strong evidence of a beneficial effect
   Modest evidence of a beneficial effect
   No statistically significant findings
   Strong evidence of a harmful effect
   Modest evidence of a harmful effect

Key Resources (cont’d)
 National Registry of Evidence-based Programs and Practices (NREPP)
  Developed by the Substance Abuse and Mental Health Services Administration (SAMHSA)
  Rates almost 300 mental health and substance abuse interventions based on expert reviews of quality and dissemination readiness

Illinois: Smarter Solutions for Crime Reduction
 An online resource from the Illinois Criminal Justice Information Authority (ICJIA) for policymakers and practitioners, offering:
  A definition of EBP
  A list of effective strategies/program components
  Reports and resources

Smarter Solutions for Crime Reduction
 There are many definitions of “evidence-based” and multiple strategies for assessing effectiveness.
 Implementing evidence-based strategies under the exact conditions necessary for program fidelity poses challenges and limitations.
 The Authority therefore endorses incorporating specific evidence-based principles within programs.

ICJIA Effective Planning Activities/Processes
 Assessment of existing services and gaps using available data
 Community engagement in planning new initiatives and supporting existing strategies
 Strategic planning to assess agency or system capacity and to identify appropriate interventions
 Adoption of promising or evidence-based practices or programs wherever possible
 Creation of logic models to guide the direction of the practice/program
 Development of programmatic and performance measures to assess implementation and effectiveness (a minimal sketch of such measures follows this list)
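For concreteness, the sketch below computes two simple performance measures from grantee service records, one for implementation and one for outcomes; the field names and records are hypothetical examples, not ICJIA requirements.

    # Compute simple performance indicators from hypothetical grantee data.
    participants = [
        {"id": 1, "completed": True,  "rearrested_12mo": False},
        {"id": 2, "completed": False, "rearrested_12mo": True},
        {"id": 3, "completed": True,  "rearrested_12mo": False},
        {"id": 4, "completed": True,  "rearrested_12mo": True},
    ]

    enrolled = len(participants)
    completion_rate = sum(p["completed"] for p in participants) / enrolled
    rearrest_rate = sum(p["rearrested_12mo"] for p in participants) / enrolled

    # Completion rate is an implementation (process) measure; the rearrest
    # rate is an outcome (effectiveness) measure.
    print(f"completion: {completion_rate:.0%}, 12-month rearrest: {rearrest_rate:.0%}")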

ICJIA Effective Components/Strategies*
 Principle 1: Assess Actuarial Risk/Needs
 Principle 2: Enhance Intrinsic Motivation
 Principle 3: Target Interventions
  Risk principle: prioritize supervision and treatment resources for higher-risk offenders
  Need principle: target interventions to criminogenic needs
  Responsivity principle: be responsive to temperament, learning style, motivation, culture, and gender when assigning programs
  Dosage: structure 40-70% of high-risk offenders’ time for 3-9 months
  Treatment principle: integrate treatment into the full sentence/sanction requirements

ICJIA Effective Components/Strategies*
 Principle 4: Skill Train with Directed Practice (use cognitive behavioral treatment methods)
 Principle 5: Increase Positive Reinforcement
 Principle 6: Engage Ongoing Support in Natural Communities
 Principle 7: Measure Relevant Processes/Practices
 Principle 8: Provide Measurement Feedback
* These principles are taken from the National Institute of Corrections’ Implementing Evidence-Based Practice in Community Corrections: The Principles of Effective Intervention.

ICJIA Program Goals, Objectives and Performance Indicators
 Why focus on goals, objectives, and performance measures?
  To strengthen grant proposals
  To strengthen a program, regardless of funding source
 For more information, see ICJIA’s guide Program Goals, Objectives, and Performance Indicators.

ICJIA Grantee Data Reports
 Collect standard performance metrics required by the federal funding source
 Collect project-specific performance measures drawn from the program description
 Program description templates are structured to capture the program’s logic model

ICJIA Grantee Narrative Information
 Highlights program achievements
 Describes barriers to program implementation
 Describes efforts to address barriers
 Gives context to the data
 Provides examples of program activities
 Documents challenges

How ICJIA Uses Data Reports
 ICJIA uses data reports to:
  Document the work of the program
  Ensure the project is being implemented as intended
  Provide feedback on program impact to the Authority Budget Committee and Board
  Identify needs and barriers to implementation
  Compile information required for ICJIA’s reports to federal funders

Federal Technical Assistance Resources
 BJA NTTAC
 OJJDP NTTAC
 OVC TTAC
 All provide web-based training and resources and broker one-on-one technical assistance.

Grant Technical Assistance
 Available on the Authority website:
  Illinois Criminal Justice Information Authority Federal & State Grants Unit: A guide for grantees
  Program Goals, Objectives, and Performance Indicators: A guide for grant and program development
  How to Successfully Obtain Grant Funding -- And Be Glad You Did: Keys to successful grant applications
  Neighborhood Recovery Initiative Grant Materials and Reporting Training Webinar
  A Grant Proposal Guidebook: Planning, Writing and Submitting a Grant Proposal

Authority Contacts
 Federal and State Grants Unit (FSGU)
 Research and Analysis Unit (Statistical Analysis Center)