Define → Develop → Achieve → Measure → Improve
Data are good… …Results are better
“When you have mastered numbers, you will in fact no longer be reading numbers, any more than you read words when reading books. You will be reading meanings.” (Harold S. Geneen)

“Statistics are used like a drunk uses a lamp post: for support, not illumination.” (Vin Scully)
“Busy work”: activity meant to take up time but not necessarily yield productive results. (The American Heritage® Dictionary of the English Language, Fourth Edition, © 2000.) By extension: “busy data”…
Many pretrial programs don’t know how to define “success” or measure progress toward strategic outcomes. A focus on “busy data” prevents leaders from measuring what really matters to their programs, systems, defendants and communities. BOTTOM LINE: pretrial leaders must move from “data building” to “results management.”
o Achieve better results in public safety, court appearance and defendant accountability
o Create a results-oriented culture that values mission-related strategic functions
o Allow pretrial programs to demonstrate, through solid performance-oriented information, their value in a high-functioning criminal justice system
o Allow better decisions on program resources, quality and effectiveness
From this… inputs and outputs. To this… outcomes and efficiencies.
o Shifts the organization’s focus from activities to results, from how a program operates to the good it accomplishes
o Frees leaders to lead
o Focuses and motivates management and staff on common goals and purposes
o Identifies what works and what’s promising
o Positions the organization within the system and community as successful, increasing support and resources
Measures are numeric (rates, totals) or qualitative (perception, feedback) indicators of how well an organization performs its mission-related and strategic functions. Outcome measures focus on the mission; performance measures gauge the operational goals that support it. Performance measurement is the process of tracking progress toward achieving mission and goals, including information on how well resources are transformed into goods and services, the quality of outputs and outcomes, and the effectiveness of operations in meeting program objectives.
Inputs: resources used to produce goods or services, including human, financial, facility or material resources. Example: the number of defendants supervised, or the number of defendants needing treatment or community service placements.

Outputs: indicators that count an organization’s services and goods. Example: the number of assessments, program placements or sanctions imposed.
Efficiencies: indicators of how well an organization achieves a stated target. Example: the percentage of defendants sanctioned for noncompliance.

Outcomes: indicators of the actual impact of an organization’s actions. An outcome measure allows a quantified comparison between the actual result and the intended result. Examples: appearance, safety, success and recidivism.
In 2011, NIC published Measuring What Matters: Outcome and Performance Measures for the Pretrial Field, a compilation of the Network’s suggested outcome and performance measures and mission critical data. www.nicic.gov/library/025172
NIC believes the suggested measures are appropriate for any pretrial release program whose mission includes reducing the likelihood of future missed court appearances and/or new arrests or filed charges during pretrial supervision. Select outcome measures are appropriate for programs that have jail population management as a mission critical objective.
Appearance rate measures the percentage of supervised defendants who make all scheduled court appearances. This is the central outcome measure for pretrial services programs: nearly all programs have maximizing appearance rates as part of their mission, enabling legislation or local rules for pretrial programs usually cite court appearance as a chief objective, and national pretrial release standards identify minimizing failures to appear as a central function for pretrial programs.

Recommended data: cases with a verified pretrial release and/or placement to the pretrial program, and the subset of this population with no bench warrants/capiases issued for missed scheduled court appearances. The appearance rate may also be tracked for various defendant populations, although the primary group targeted should be defendants released to the agency’s supervision.
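The appearance rate reduces to a simple proportion over the recommended data. The sketch below is illustrative only; the field names (`released_to_supervision`, `bench_warrant_issued`) are assumptions, not fields prescribed by the NIC report:

```python
def appearance_rate(cases):
    """Percent of defendants released to supervision who had no bench
    warrant/capias issued for a missed court appearance.
    `cases` is a list of dicts with illustrative field names."""
    released = [c for c in cases if c["released_to_supervision"]]
    if not released:
        return 0.0
    appeared = [c for c in released if not c["bench_warrant_issued"]]
    return 100.0 * len(appeared) / len(released)

# Three released defendants, one of whom missed a court date; the
# fourth was never released to supervision and is excluded.
cases = [
    {"released_to_supervision": True, "bench_warrant_issued": False},
    {"released_to_supervision": True, "bench_warrant_issued": True},
    {"released_to_supervision": True, "bench_warrant_issued": False},
    {"released_to_supervision": False, "bench_warrant_issued": False},
]
print(round(appearance_rate(cases), 1))  # 66.7
```

The same function can be run over subsets of cases (for example, by risk level or charge type) to track the rate for different defendant populations.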
Safety rate tracks the percentage of supervised defendants who are not charged with a new offense during case adjudication. A new offense is one: whose offense date falls during the defendant’s period of pretrial release; that includes a prosecutorial decision to charge; and that carries the potential of incarceration or community supervision upon conviction. This excludes arrest warrants executed during the pretrial period for offenses committed before the defendant’s case filing.

Recommended data: the number of defendants with a verified pretrial release or placement to the pretrial program, and the subset of this population with no rearrests for a new offense. Programs may also track separate safety rates by charge type (for example, misdemeanors, felonies or local ordinance offenses), severity (violent crimes, domestic violence offenses or property crimes) or defendant population.
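The three-part definition of a qualifying new offense can be encoded directly. A minimal sketch, with assumed (not prescribed) field names:

```python
from datetime import date

def is_new_offense(offense, release_start, release_end):
    """Apply the measure's three conditions; field names are illustrative."""
    return (release_start <= offense["offense_date"] <= release_end
            and offense["charged_by_prosecutor"]
            and offense["carries_confinement_or_supervision"])

def safety_rate(defendants):
    """Percent of supervised defendants with no qualifying new offense."""
    if not defendants:
        return 0.0
    safe = sum(
        1 for d in defendants
        if not any(is_new_offense(o, d["release_start"], d["release_end"])
                   for o in d["arrests"]))
    return 100.0 * safe / len(defendants)

window = {"release_start": date(2024, 1, 1), "release_end": date(2024, 6, 1)}
defendants = [
    {**window, "arrests": []},  # no arrests: safe
    {**window, "arrests": [{"offense_date": date(2024, 3, 1),  # new offense
                            "charged_by_prosecutor": True,
                            "carries_confinement_or_supervision": True}]},
    {**window, "arrests": [{"offense_date": date(2023, 11, 1),  # pre-filing
                            "charged_by_prosecutor": True,      # conduct
                            "carries_confinement_or_supervision": True}]},
]
print(round(safety_rate(defendants), 1))  # 66.7
```

Note that the third defendant counts as safe: the arrest is for conduct predating the case filing, matching the exclusion above.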
Concurrence rate is the ratio of defendants whose supervision level or detention status corresponds to their assessed risk of pretrial misconduct. The conditions of supervision recommended and imposed do not have to match exactly; however, the overall supervision level should be comparable. For example, a recommendation for release on personal recognizance with no conditions would not match a release with a weekly reporting requirement. This measure counts only defendants eligible by statute for pretrial release, and excludes defendants detained on statutory holds, probation or parole warrants or holds, and detainers from other jurisdictions.

Recommended data: the number of release and detention recommendations and the subsequent release and detention outcomes.
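Because only the overall supervision level has to correspond, one way to sketch the computation is to map recommendations and outcomes onto a common set of levels and compare those. The level labels and field names below are illustrative assumptions:

```python
def concurrence_rate(decisions):
    """Percent of statutorily eligible defendants (with no outside holds)
    whose imposed supervision level matches the recommended level.
    Level labels such as "ROR" or "weekly_reporting" are illustrative."""
    eligible = [d for d in decisions
                if d["eligible_by_statute"] and not d["other_hold"]]
    if not eligible:
        return 0.0
    matched = [d for d in eligible
               if d["imposed_level"] == d["recommended_level"]]
    return 100.0 * len(matched) / len(eligible)

decisions = [
    {"eligible_by_statute": True, "other_hold": False,
     "recommended_level": "ROR", "imposed_level": "ROR"},
    # ROR recommended but a reporting condition imposed: no match.
    {"eligible_by_statute": True, "other_hold": False,
     "recommended_level": "ROR", "imposed_level": "weekly_reporting"},
    # Not statutorily eligible: excluded from the denominator.
    {"eligible_by_statute": False, "other_hold": False,
     "recommended_level": "detain", "imposed_level": "detain"},
    # Held for another jurisdiction: excluded from the denominator.
    {"eligible_by_statute": True, "other_hold": True,
     "recommended_level": "ROR", "imposed_level": "detain"},
]
print(concurrence_rate(decisions))  # 50.0
```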
Success rate measures the percentage of released defendants who 1) are not revoked for technical violations, 2) appear for all scheduled court appearances, and 3) remain arrest-free. The measure excludes defendants who are detained following a guilty verdict and those revoked due to non-pretrial-related holds.

Recommended data: the total number of defendants released to the program, and the subset of this population that experience no condition violations, failures to appear or rearrests. Depending on the pretrial program’s information system, revocations may show as subsequent financial release or detention orders.
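A sketch combining the three success conditions and the two exclusions (all field names assumed for illustration):

```python
def success_rate(defendants):
    """Percent of released defendants with no technical-violation
    revocation, no missed appearance and no rearrest. Defendants detained
    after a guilty verdict or revoked on non-pretrial holds are excluded
    from the denominator. Field names are illustrative."""
    pool = [d for d in defendants
            if not d["post_verdict_detention"] and not d["non_pretrial_hold"]]
    if not pool:
        return 0.0
    succeeded = [d for d in pool
                 if not (d["revoked_technical"] or d["failed_to_appear"]
                         or d["rearrested"])]
    return 100.0 * len(succeeded) / len(pool)

base = {"post_verdict_detention": False, "non_pretrial_hold": False,
        "revoked_technical": False, "failed_to_appear": False,
        "rearrested": False}
defendants = [
    dict(base),                               # fully successful
    dict(base, revoked_technical=True),       # technical revocation
    dict(base, post_verdict_detention=True),  # excluded from denominator
]
print(success_rate(defendants))  # 50.0
```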
Detainee length of stay presents the average length of jail stay for pretrial detainees who are eligible by statute for pretrial release. This is a significant outcome measure for the estimated 27 percent of pretrial programs located within corrections departments and charged with helping to control jail populations, and a performance measure for other pretrial programs. Release is defined here as the defendant’s full discharge from jail custody.

Recommended data: admission and release dates for all pretrial-related jail detentions.
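Given admission and release dates, the measure is a plain average over statutorily eligible detainees. A minimal sketch, with assumed field names:

```python
from datetime import date

def average_length_of_stay(detentions):
    """Average pretrial jail stay, in days, for detainees eligible by
    statute for pretrial release. Field names are illustrative."""
    eligible = [d for d in detentions if d["eligible_by_statute"]]
    if not eligible:
        return 0.0
    total_days = sum((d["release_date"] - d["admission_date"]).days
                     for d in eligible)
    return total_days / len(eligible)

stays = [
    {"eligible_by_statute": True,  # 3-day stay
     "admission_date": date(2024, 1, 1), "release_date": date(2024, 1, 4)},
    {"eligible_by_statute": True,  # 7-day stay
     "admission_date": date(2024, 2, 1), "release_date": date(2024, 2, 8)},
    {"eligible_by_statute": False,  # statutory hold: excluded
     "admission_date": date(2024, 3, 1), "release_date": date(2024, 3, 30)},
]
print(average_length_of_stay(stays))  # 5.0
```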
o Universal screening: the percentage of defendants eligible for release by statute whom the program assesses for release eligibility.
o Recommendation rate: the percentage of time the program follows its risk assessment criteria when recommending release or detention.
o Response to defendant conduct: the frequency of policy-approved responses to compliance and noncompliance with court-ordered release conditions.
o Pretrial intervention rate: the pretrial agency’s effectiveness at resolving outstanding bench warrants, arrest warrants and capiases.
Outcomes
o Appearance rate: the percentage of supervised defendants who make all scheduled court appearances.
o Safety rate: the percentage of defendants not charged with a new offense during the pretrial stage.
o Concurrence rate: the ratio of defendants whose supervision level or detention status corresponds with their assessed risk of pretrial misconduct.
o Success rate: the percentage of released defendants who 1) are not revoked for technical violations due to condition violations, 2) appear for all scheduled court appearances, and 3) are not charged with a new offense during pretrial supervision.
o Pretrial detainee length of stay: the average length of jail stay for pretrial detainees who are eligible by statute for pretrial release.

Risk assessment
o Screening: the percentage of defendants eligible for release by statute whom the program assesses for release eligibility.
o Recommendation: the percentage of time the program follows its risk assessment criteria when recommending release or detention.

Risk management
o Response: the frequency of policy-approved responses to compliance and noncompliance with release conditions.
o Intervention: the pretrial agency’s effectiveness at resolving outstanding bench warrants, arrest warrants and capiases.

External factors/assumptions: community, legal, defendant, system.
IT STARTS WITH THE MISSION: Pretrial Services assists the Court in making reasonable and appropriate bail decisions, consistent with State law. Pretrial Services is committed to maximizing court appearance by released defendants, promoting a safer community, and advancing the fair administration of justice.
IT STARTS WITH THE MISSION: “To promote pretrial justice and enhance public safety” Pretrial Services Agency for the District of Columbia
Mission
→ Outcomes I, II, III
→ Goals: strategic areas 1, 2, 3
→ Objectives: performance measures 1, 2, 3
Inputs (I–V) → Outputs (I–V) → Intermediate outcomes (I–IV) → Outcomes: appearance rate, safety rate, concurrence rate, success rate, length of stay
o Tie measures to mission, goals and objectives. Use performance measurement to track progress and direction toward strategic objectives.
o Use results for mission-driven items. Data should be the foundation for new initiatives, budgets, strategic planning and priorities.
o Leaders lead! In high-performing organizations, management is active in creating measures, articulating the mission, vision and goals, and disseminating expectations and results. However, management must see value in measurement if it is to commit.
o Create a measurement framework and advertise it at all levels. Everyone must know how measures relate to their work. Accountability is key, as is knowing that what you do is worthwhile.
o Create measurement systems that are positive, not punitive. Successful performance frameworks are not “gotcha” systems but learning environments that help the organization identify what works and what doesn’t, continue with and improve on what works, and fix or replace what doesn’t. That said…
o Tie compensation, rewards and recognition to performance measurements. High-functioning agencies link performance evaluations and rewards to measures of success; rewards link directly to performance. This sends a clear message to the organization about what’s important.
o Share results with staff, customers and stakeholders. Performance measure results should be openly and widely shared.
Judges mostly supported implementation of the measures but opposed, or expressed skepticism about, publicly reporting them, citing concerns that the numbers would be misconstrued and produce unfair comparisons. They worried the numbers would not accurately reflect operations because of external matters affecting cases. “The only individuals interested in the results would be a lawyer planning to run against an incumbent, a young journalist seeking a headline or a disgruntled litigant seeking to file a JTC (Judicial Tenure Commission) complaint against a judge,” the 12-judge Macomb bench said in a document submitted to state officials, although it mentions only “certain” judges oppose the reporting proposal. The bench’s concerns were similar to those raised in memos submitted by the Michigan Judges Association, Michigan Probate Judges Association, Michigan District Judges Association and other jurists statewide.
o Identify outcomes and indicators. Measures imposed by outsiders may not meet the desired criteria. However…
o Seek input when identifying program outcomes. Staff, customers and partners can point out important outcomes.
o Data collection and analysis pose technical challenges. Hiring a technical expert will often save time, offer reassurance and improve results.
o Trial-run the measurement system. The trial run must last long enough to encompass all key data collection points and must involve at least a representative group of program participants. Expect the trial run to identify problems; that’s the point.
o Developing an outcome measurement system takes time. Hatry, van Houten, Plantz, and Greenway (1996) identified eight steps grouped in three developmental phases: initial preparation (getting ready to begin, choosing outcomes to measure, specifying indicators for the outcomes and preparing to collect data), a trial run (trying out data collection procedures and data analysis and reporting methods), and implementation (adjusting the outcome measurement system and using the findings).
o Monitor and improve measurement systems. Programs change and programs learn; the system must keep up!