RSAT Program Performance Measures National Workshop – Chicago 2014


RSAT Program Performance Measures National Workshop – Chicago 2014 Jimmy Steyee Deputy Project Manager at CSR, Incorporated jsteyee@csrincorporated.com James.D.Steyee@ojp.usdoj.gov

Overview
- Performance measurement overview
- Data collection process
- RSAT grant program and FY2013 accomplishments

Why report “performance” data?
- Satisfy the Government Performance and Results Act of 1993 and the Government Performance and Results Modernization Act of 2010 (collectively referred to as GPRA)
- Accountability
- Transparency
- Inform budgets
- Fulfill ad hoc data requests
- Draft annual and quarterly reports and GrantStat
- Inform targeted TTA strategy

What is performance measurement?
- Performance measurement focuses on whether a program is achieving its objectives
- Performance can be defined and characterized by both quantitative (numeric) and qualitative (narrative) metrics
- Asks “What happened?” and “What activities occurred?”
- Can be used to demonstrate activity and accomplishments
- Differs from program evaluation, which comes in multiple types (e.g., program, outcome, process) and asks “Does this work?”, “How well is it working?”, and “Why is it working?”

We’re counting on your data to tell the uses and stories of BJA funds. Your good work is what we’ll translate into best practices and success stories, but please don’t interpret that to mean we only want to see perfect numbers and exaggerated figures. The reporting you do is the feedback loop we need to create a better program. If your program isn’t working out the way you thought it would, we want to know about it. Without that honest reporting of successes and failures, we won’t know how to best target TTA resources and get you the help you need. Each of the BJA programs has mandatory performance measure requirements: all have the same seven narrative questions that comprise your progress report, and most programs have additional, program-specific questions. Programs without demonstrated performance will face more scrutiny under a new administration. One of BJA’s key objectives is communicating the value of justice initiatives to stakeholders, and the results of these measures let us do just that. We’re not collecting the data for a “gotcha” exercise; we’re trying to use the information grantees gather to make better decisions, which will affect you at the state and local level. It’s about compiling information for reports to superiors and to those who make funding and programmatic decisions. Your success is our success. We want information to make good decisions that help you succeed. We’re looking for the good, the bad, and the ugly; accuracy is most important.

We may have to make some course corrections if the outcomes of the program aren’t what we expected: maybe grantees need more TA, or the program period needs extending. We need accurate information before we can best help you. We understand that when you’re in the field doing the work of the grant, it’s not always easy to see the connection between reporting information and how that affects the program. It may seem like a circuitous route, but I’d like to explain that connection to you. Your success is our success.

BJA PM Development Process
- Measures developed collaboratively by CSR, BJA, and other stakeholders
- Process starts with the development of a logic model, which identifies expected program objectives, activities, and outcomes
- Next step is developing draft measures, which are vetted internally and by grantees
- Final revisions made and final version released
- Measures are dynamic: reviewed and modified periodically to ensure appropriateness

BJA PM Development Process: For Example
- Objective: Serve high risk/high need individuals
- Activity: Risk and needs assessment screening
- Outputs: # individuals screened, # individuals found to be high risk/high need
- Outcome: Increase in the number and percentage of high risk/high need individuals
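The outputs and outcome in this example are simple counts and percentages derived from screening results. A minimal sketch of that arithmetic (the records and field names here are hypothetical, purely for illustration, and not drawn from the PMT system):

```python
# Hypothetical screening results: True means the individual was
# assessed as high risk/high need. Illustrative data only.
screenings = [True, True, False, True, False, True, True, False]

num_screened = len(screenings)                      # output: # individuals screened
num_high_risk = sum(screenings)                     # output: # high risk/high need
pct_high_risk = 100 * num_high_risk / num_screened  # share of those screened

print(num_screened, num_high_risk, pct_high_risk)   # 8 5 62.5
```

Tracking the percentage quarter over quarter is what would show the outcome (an increase in the share of high risk/high need individuals).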

Data Collection Process
- Data for about 21 programs are collected through the Performance Measurement Tool (PMT), accessible at www.bjaperformancetools.org
- Program-specific resources are available under “Info & Resources,” including webinars, FAQs, the PMT user guide, and questionnaires
- Regular submission of performance data is required as a condition of all BJA grant awards
- PMT data and reports are regularly reviewed by staff; incomplete or delinquent reporting can result in the freezing of grant funds
- This requirement is separate from all other BJA grant-related reporting (e.g., GMS)

RSAT Grantee Accomplishments: PMT Completion Rate
- RSAT: 97% average; total awards (not including sub-awards): 176 as of April 2014
- PMT (all programs): 91%; total awards (not including sub-awards): 3,627

RSAT Grantee Accomplishments
Top Ten States—Total Enrolled (as of December 31, 2013)

Rank  State  Total Enrolled  % of Total Enrolled
1     WV     1240            8.15
2     TX     1136            7.47
3     CA     1083            7.12
4     IN     983             6.46
5     IL     787             5.17
6     MA     728             4.79
7     AL     710             4.67
8     OK     661             4.35
9     LA     646             4.25
10    GA     555             3.65

RSAT Grantee Accomplishments
Top Ten States—Successful Individuals Completing the Program (January–December 31, 2013)

Rank  State  # Individuals Completing Program  Percentage
1     TX     735                               6.22
2     VA     715                               6.06
3     IL     709                               6.00
4     MA     665                               5.63
5     CA     604                               5.12
6     LA     597                               5.06
7     IN     586                               4.96
8     GA     542                               4.59
9     MI     466                               3.95
10    AL     456                               3.86
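The percentage columns in these tables are each state's count divided by the national total. A quick sketch of that calculation; note that the national total used here is hypothetical, since the slides report only state-level counts:

```python
def percent_of_total(state_count, national_total):
    """A state's share of the national figure, as a percentage (2 decimals)."""
    return round(100 * state_count / national_total, 2)

# Hypothetical national enrollment total, for illustration only.
national_total = 15000
print(percent_of_total(1240, national_total))  # WV's enrollment count from the table
```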

Residential- and Jail-Based Programs

RSAT Residential/Jail Program Enrollment

*High Risk/High Need Participants

Residential/Jail-Based Program Completion Rate

Residential/Jail-Based Program Completions and Exits: January–December 2013

Residential/Jail-Based Program Completions: January–December 2013
Participants Who Completed the Program:

Time Frame         Jail N   Jail %   Residential N   Residential %
0 to 3 Months      1457     48       1233            15
4 to 6 Months      1258     42       3366            41
7 to 9 Months      246      8        2735            33
10 or More Months  47       2        872             11
Total              3008     100      8206            100

Residential/Jail-Based Program Unsuccessful Exits: January–December 2013
Participants Who Did Not Complete the Program, by Time Frame:

Time Frame         Jail N   Jail %   Residential N   Residential %
0 to 3 Months      828      85       1989            49
4 to 6 Months      121      12       1563            38
7 to 9 Months      18       2        347             9
10 or More Months  4        0        168             4
Total              971      100      4067            100

Residential/Jail-Based Program Unsuccessful Exits: January–December 2013
Participants Who Did Not Complete the Program, by Reason:

Reason                                   Jail N   Jail %   Residential N   Residential %
Termination for a New Charge             77       8        66              2
Release or Transfer to Another Facility  261      27       852             21
Death or Serious Illness                 26       3        85              2
Voluntary Drop Out                       149      15       420             10
Failure to Meet Program Requirements     152      16       1173            29
Violation of Institutional Rules         258      27       1225            30
Other                                    48       5        246             6
Total                                    971      100      4067            100
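The completion rate shown a few slides earlier can be derived from the totals in the completion and unsuccessful-exit tables, assuming it is defined as completions divided by all program exits (completions plus unsuccessful exits); the official PMT definition may differ:

```python
def completion_rate(completed, not_completed):
    """Share of all exiting participants who completed the program, in percent."""
    return 100 * completed / (completed + not_completed)

# January–December 2013 totals from the tables above.
jail_rate = completion_rate(3008, 971)
residential_rate = completion_rate(8206, 4067)
print(round(jail_rate, 1), round(residential_rate, 1))  # 75.6 66.9
```

On these figures, jail-based programs retained a somewhat higher share of participants through completion than residential programs did.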

Residential/Jail-Based Program Services: October–December 2013

Aftercare Programs

Aftercare Programs: Enrollment

Aftercare Programs: Case Planning and High Risk/High Need Participants

Aftercare Programs: Program Completions and Program Exits

Aftercare Programs: Program Completions and Program Exits—January-December 2013

Aftercare Program Services: October–December 2013

Aftercare Programs: Program Completions—January–December 2013
Aftercare Participants Who Completed the Program:

Time Frame         N     %
0 to 3 Months      597   77
4 to 6 Months      122   16
7 to 9 Months      26    3
10 or More Months  34    4
Total              779   100

Aftercare Programs: Program Exits—January–December 2013
Participants Who Did Not Complete the Program:

Time Frame         N     %
0 to 3 Months      828   85
4 to 6 Months      121   12
7 to 9 Months      18    2
10 or More Months  4     0
Total              971   100

Aftercare Programs: Program Unsuccessful Exits—January–December 2013
Participants Who Did Not Complete the Program, by Reason:

Reason                                   N     %
Termination for a New Charge             137   18
Release or Transfer to Another Facility  32    4
Death or Serious Illness                 2     0
Voluntary Drop Out                       141   18
Failure to Meet Program Requirements     374   48
Absconded                                47    6
Other                                    46    6
Total                                    779   100

Summary
- Risk and Needs Assessment Screening and Treatment Planning
- Substance Abuse Testing
- Program Services: substance abuse treatment; cognitive, behavioral, social, and vocational services
- Program Length: residential programs, 6–12 months; jail-based, at least 3 months

Contact Information
If you have any questions about program performance measures or issues related to the PMT system, contact the PMT Help Desk.
Hours: 8:30 a.m. to 5:30 p.m. ET
Email: bjapmt@csrincorporated.com
Phone: 1-888-252-6867
For training and technical assistance, contact AHP. For other grant-related issues, contact your BJA grant manager.

Q&A Jimmy Steyee Deputy Project Manager at CSR, Incorporated jsteyee@csrincorporated.com James.D.Steyee@ojp.usdoj.gov