RSAT Program Performance Measures National Workshop – Chicago 2014

Presentation transcript:

1 RSAT Program Performance Measures National Workshop – Chicago 2014
Jimmy Steyee, Deputy Project Manager, CSR, Incorporated

2 Overview
Performance measurement overview
Data collection process
RSAT grant program and FY2013 accomplishments

3 Why report “performance” data?
Satisfy the Government Performance and Results Act of 1993 and the Government Performance and Results Modernization Act of 2010 (collectively referred to as GPRA)
Accountability
Transparency
Inform budgets
Fulfill ad hoc data requests
Draft annual and quarterly reports and GrantStat
Inform targeted TTA strategy

4 What is performance measurement?
Performance measurement focuses on whether a program is achieving its objectives.
Performance can be defined and characterized by both quantitative (numeric) and qualitative (narrative) metrics: "What happened?" "What activities occurred?"
Can be used to demonstrate activity and accomplishments.
Differs from program evaluation, which comes in multiple types (e.g., program, outcome, process) and asks "Does this work?" "How well is it working?" "Why is it working?"

We're counting on your data to tell the uses and stories of BJA funds. Your good work is what we'll translate into best practices and success stories, but please don't interpret that to mean we only want to see perfect numbers and exaggerated figures. The reporting you do is the feedback loop we need to create a better program. If your program isn't working out the way you thought it would, we want to know about it. Without that honest reporting of successes and failures, we won't know how to best target TTA resources and get you the help you need. Each of the BJA programs has mandatory performance measure requirements: all have the same 7 narrative questions that comprise your progress report, and most programs have additional, program-specific questions. Programs without demonstrated performance will face more scrutiny under a new administration. One of BJA's key objectives is communicating the value of justice initiatives to stakeholders, and the results of these measures let us do just that. We're not collecting the data for a "gotcha" exercise. We're trying to use the information grantees gather to make better decisions, which will affect you at the state and local level. It's about compiling information for reports to superiors and to those who make funding and programmatic decisions. Your success is our success. We want information that lets us make good decisions to help you succeed. We're looking for the good, the bad, and the ugly; accuracy is most important.
We may have to make some course corrections if the outcomes of the program aren't what we expected: maybe grantees need more TA, the program period needs extending, etc. We need accurate information before we can best help you. We understand that when you're in the field doing the work of the grant, it's not always easy to see the connection between reporting information and how that affects the program. It may seem like a circuitous route, but the connection is real: your success is our success.

5 BJA PM Development Process
Measures developed collaboratively by CSR, BJA, and other stakeholders
Process starts with the development of a logic model, which identifies expected program objectives, activities, and outcomes
Next step is developing draft measures, which are vetted internally and by grantees
Final revisions made and final version released
Measures are dynamic: reviewed and modified periodically to ensure appropriateness

6 BJA PM Development Process
For example:
Objective: Serve high risk/high need individuals
Activity: Risk and needs assessment screening
Outputs: # of individuals screened; # of individuals found to be high risk/high need
Outcome: Increase in the number and percentage of high risk/high need individuals served

7 Data Collection Process
Data for about 21 programs are collected through the Performance Measurement Tool (PMT)
Access at
Program-specific resources are available under "Info & Resources," including webinars, FAQs, the PMT user guide, and questionnaires
Regular submission of performance data is required as a condition of all BJA grant awards
PMT data and reports are regularly reviewed by staff; incomplete or delinquent reporting can result in the freezing of grant funds
This requirement is separate from all other BJA grant-related reporting (e.g., GMS)

8 RSAT Grantee Accomplishments
PMT Completion Rate
RSAT: 97% average; total awards (not including sub-awards): 176 as of April 2014
PMT (all programs): 91%; total awards (not including sub-awards): 3,627

9 RSAT Grantee Accomplishments
Top Ten States—Total Enrolled (as of December 31, 2013)
#  | State | Total Enrolled | % of Total Enrolled
1  | WV    | 1,240 | 8.15
2  | TX    | 1,136 | 7.47
3  | CA    | 1,083 | 7.12
4  | IN    | 983   | 6.46
5  | IL    | 787   | 5.17
6  | MA    | 728   | 4.79
7  | AL    | 710   | 4.67
8  | OK    | 661   | 4.35
9  | LA    | 646   | 4.25
10 | GA    | 555   | 3.65

10 RSAT Grantee Accomplishments
Top Ten States—Successful Individuals Completing the Program (January–December 31, 2013)
#  | State | # Individuals Completing Program | Percentage
1  | TX    | 735 | 6.22
2  | VA    | 715 | 6.06
3  | IL    | 709 | 6.00
4  | MA    | 665 | 5.63
5  | CA    | 604 | 5.12
6  | LA    | 597 | 5.06
7  | IN    | 586 | 4.96
8  | GA    | 542 | 4.59
9  | MI    | 466 | 3.95
10 | AL    | 456 | 3.86

11 Residential- and Jail-Based Programs

12 RSAT Residential/Jail Program Enrollment

13 *High Risk/High Need Participants

14 Residential/Jail-Based Program Completion Rate

15 Residential/Jail-Based Program Completions and Exits: January–December 2013

16 Residential/Jail-Based Program Completions: January–December 2013
Participants Who Completed the Program, by Time Frame:
Time Frame        | Jail N | Jail % | Residential N | Residential %
0 to 3 Months     | 1,457  | 48     | 1,233         | 15
4 to 6 Months     | 1,258  | 42     | 3,366         | 41
7 to 9 Months     | 246    | 8      | 2,735         | 33
10 or More Months | 47     | 2      | 872           | 11
Total             | 3,008  | 100    | 8,206         | 100

17 Residential/Jail-Based Program Unsuccessful Exits: January–December 2013
Participants Who Did Not Complete the Program, by Time Frame:
Time Frame        | Jail N | Jail % | Residential N | Residential %
0 to 3 Months     | 828    | 85     | 1,989         | 49
4 to 6 Months     | 121    | 12     | 1,563         | 38
7 to 9 Months     | 18     | 2      | 347           | 9
10 or More Months | 4      | <1     | 168           | 4
Total             | 971    | 100    | 4,067         | 100

18 Residential/Jail-Based Program Unsuccessful Exits: January–December 2013
Participants Who Did Not Complete the Program, by Reason:
Reason                                  | Jail N | Jail % | Residential N | Residential %
Termination for a New Charge            | 77     | 8      | 66            | 2
Release or Transfer to Another Facility | 261    | 27     | 852           | 21
Death or Serious Illness                | 26     | 3      | 85            | 2
Voluntary Drop Out                      | 149    | 15     | 420           | 10
Failure to Meet Program Requirements    | 152    | 16     | 1,173         | 29
Violation of Institutional Rules        | 258    | 27     | 1,225         | 30
Other                                   | 48     | 5      | 246           | 6
Total                                   | 971    | 100    | 4,067         | 100

19 Residential/Jail-Based Program Services: October–December 2013

20 Aftercare Programs

21 Aftercare Programs: Enrollment

22 Aftercare Programs: Case Planning and High Risk/High Need Participants

23 Aftercare Programs: Program Completions and Program Exits

24 Aftercare Programs: Program Completions and Program Exits—January-December 2013

25 Aftercare Program Services: October–December 2013

26 Aftercare Programs: Program Completions—January-December 2013
Aftercare Participants Who Completed the Program, by Time Frame:
Time Frame        | N   | %
0 to 3 Months     | 597 | 77
4 to 6 Months     | 122 | 16
7 to 9 Months     | 26  | 3
10 or More Months | 34  | 4
Total             | 779 | 100

27 Aftercare Programs: Program Exits—January-December 2013
Participants Who Did Not Complete the Program, by Time Frame:
Time Frame        | N   | %
0 to 3 Months     | 828 | 85
4 to 6 Months     | 121 | 12
7 to 9 Months     | 18  | 2
10 or More Months | 4   | <1
Total             | 971 | 100

28 Aftercare Programs: Program Unsuccessful Exits—January-December 2013
Participants Who Did Not Complete the Program, by Reason:
Reason                                  | N   | %
Termination for a New Charge            | 137 | 18
Release or Transfer to Another Facility | 32  | 4
Death or Serious Illness                | 2   | <1
Voluntary Drop Out                      | 141 | 18
Failure to Meet Program Requirements    | 374 | 48
Absconded                               | 47  | 6
Other                                   | 46  | 6
Total                                   | 779 | 100

29 Summary
Risk and needs assessment screening and treatment planning
Substance abuse testing
Program services: substance abuse treatment; cognitive, behavioral, social, and vocational services
Program length: residential programs, 6–12 months; jail-based programs, at least 3 months

30 Contact Information
If you have any questions about program performance measures or issues related to the PMT system, contact the PMT Help Desk.
Hours: 8:30 a.m. to 5:30 p.m. EST
Phone:
For training and technical assistance, contact AHP.
For other grant-related issues, contact your BJA grant manager.

31 Q&A
Jimmy Steyee, Deputy Project Manager, CSR, Incorporated

