How to Assess the Effectiveness of Your Language Access Program
Robin Ghertner May 23, 2017 National Center for State Courts
Why assess your program?
Why do we assess any program?
- Attract or maintain funding
- Assess fidelity to the program model
- Improve outcomes
- Identify gaps in services
- Increase efficiency
- Stakeholder involvement
- Compliance
Why Assess Language Access?
- Compliance with federal, state, and local mandates
- Attract funding or protect against cuts
- Ensure services are delivered effectively
Program maturity
Is this a good model of program maturity…
A ladder: New program → Developing program → Mature program, with assessment moving from performance measures to process evaluation to impact evaluation.
…or is this? Creating → Developing → Becoming established → Adapting
Role of Assessment
Creating → Developing → Becoming established → Adapting
Language Access Logic Model
- Inputs: funds, volunteers, staff, contractors, clients, attorneys
- Activities: translation, interpretation
- Outputs: documents translated, clients served, cases interpreted, languages served, courts served, days served
- Immediate outcomes: clients understand communication; clients and court are able to communicate successfully
- Long-term outcomes: fair and equitable treatment
Language Access as Part of Larger Logic Model
The court's larger logic model follows the same chain: Inputs → Activities → Outputs → Immediate Outcomes → Long-Term Outcomes. The language access logic model, with the components listed above, is one strand nested within it.
Components of Assessment
- Performance measurement
- Program evaluation
  - Process evaluation
  - Impact evaluation
Performance Measurement vs Program Evaluation
Performance measurement:
- Ongoing monitoring and reporting
- Typically focuses on outputs
- Compares measures against a standard
- Addresses "how are we doing?"
- Not designed to address causality

Program evaluation:
- "One time"
- Assesses how well the program is working
- Typically focuses on processes or outcomes
- Answers specific questions
- Addresses "does this work?" or "how do we make it better?"
- Typically more rigorous, and more expensive
Role of Assessment
Across the cycle (Creating → Developing → Becoming established → Adapting):
- Performance measurement: starts while creating and runs throughout
- Process evaluation: as the program develops and becomes established
- Impact evaluation: once the program is established and adapting
OK! Let's Do This! Erica is the new language access coordinator for a county court. She wants to set herself up for success and understand how her program is working. Let's walk her through one approach.
Performance measurement
How to Measure your Program
1. Select appropriate measures
2. Collect data on your measures
3. Set standards
4. Use measures to inform decisions
Step 1. Select Measures What do you want to measure?
- Inputs: easiest to measure, but less useful
- Outputs: direct measures, but may not mean the goal is reached
- Outcomes: get at goals, but hardest to measure
- Depth: how much service is provided
- Breadth: the range of ways service is provided
Measures can have depth, breadth, or both:

Inputs
- Depth: number of staff, contractors, and volunteers engaged; number of staff hours used; amount of financial expenditures; pay differentials for bilingual employees
- Breadth: languages spoken by staff, contractors, and volunteers; use of translation/interpretation technologies; number of relationships with stakeholders

Outputs
- Depth: number of clients served (relative to?); number of hours interpreted; number of documents translated
- Breadth: number of languages translated/interpreted; different ways services are provided (e.g., in-person interpretation versus telephonic or video-remote interpretation versus translation)

Outcomes
- Outcomes for LEP clients; number of client complaints received; LEP client satisfaction; equitable treatment
Erica's Measures
- Dollars spent per case
- Proportion of cases needing interpretation that receive it
- Proportion of cases needing interpretation, in each language
- Client satisfaction with services
- Number of complaints received per number of LEP cases
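Erica's measures boil down to simple counts and ratios over case records. As a minimal sketch (the record fields, sample data, and calculations below are illustrative assumptions, not from any actual case-management system):

```python
# Hypothetical case records; every field name and value here is invented.
cases = [
    {"language": "Spanish",  "needed_interp": True, "got_interp": True,
     "cost": 180, "satisfied": True,  "complaint": False},
    {"language": "Spanish",  "needed_interp": True, "got_interp": False,
     "cost": 0,   "satisfied": False, "complaint": True},
    {"language": "Mandarin", "needed_interp": True, "got_interp": True,
     "cost": 240, "satisfied": True,  "complaint": False},
]

needing = [c for c in cases if c["needed_interp"]]
served = [c for c in needing if c["got_interp"]]

# Dollars spent per interpreted case
dollars_per_case = sum(c["cost"] for c in served) / len(served)
# Proportion of cases needing interpretation that receive it
prop_interpreted = len(served) / len(needing)
# Client satisfaction and complaint rates among LEP cases
satisfaction_rate = sum(c["satisfied"] for c in needing) / len(needing)
complaint_rate = sum(c["complaint"] for c in needing) / len(needing)

# Proportion of cases needing interpretation, by language
by_language = {}
for c in needing:
    by_language[c["language"]] = by_language.get(c["language"], 0) + 1
prop_by_language = {lang: n / len(needing) for lang, n in by_language.items()}
```

The point is not the code but the discipline: each measure has an unambiguous numerator and denominator, so it can be recomputed the same way every reporting period.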
Step 2. Collect Data
How do you measure?
- Existing program records
- New tracking tools
- Surveys/questionnaires
How often do you measure?
- Balance frequency with burden
Erica’s Data Collection
Measure | Collection | Frequency
Dollars spent per case | Budget data | Weekly
Number of cases interpreted | New system to log cases |
Number of cases in each language | New system to log cases |
Client satisfaction | Survey questionnaire | At end of each case
Complaints received | Complaint hotline and receptionist tracking form |
Step 3. Define Standards
What are your benchmarks?
- Inadequate, Below Expectations, Poor
- Adequate, Meets Expectations, Good
- Proficient, Exceeds Expectations, Excellent
To set them:
- Use historical data to understand measures
- Consult with stakeholders
- Validate the standards!
Erica's Standards
- Spends 2 months collecting regular data.
- Looks at trends in the data to understand current performance.
- Shares trend data with her manager, other county courts, and a local advocacy organization.
- Sets initial levels for "Meets Expectations" and "Exceeds Expectations."
- Monitors the measures for a few months to validate the standards.
Erica's Standards

Measure | Collection Method | Frequency | Standard
Dollars spent per case | Budget data | Weekly | $200 meets; $150 exceeds
Proportion of cases interpreted | New system to log cases | | 75% meets; 90% exceeds
Proportion of cases in each language | New system to log cases | |
Client satisfaction | Survey questionnaire | At end of each case | 80% satisfied meets; 80% very satisfied exceeds
Complaints received | Complaint hotline and receptionist tracking form | | 25% meets; 10% exceeds
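Note that for some measures higher is better (proportion interpreted) while for others lower is better (dollars per case, complaint rate). A small helper can rate any measure against its two thresholds; this is a sketch under the assumption that the direction can be inferred from whether the "exceeds" threshold sits above or below the "meets" threshold:

```python
def rate_measure(value, meets, exceeds):
    """Rate a measured value against 'meets' and 'exceeds' thresholds.

    If exceeds >= meets, higher is better (e.g., 75% meets / 90% exceeds).
    If exceeds < meets, lower is better (e.g., $200 meets / $150 exceeds).
    """
    if exceeds >= meets:  # higher is better
        if value >= exceeds:
            return "Exceeds Expectations"
        if value >= meets:
            return "Meets Expectations"
    else:  # lower is better
        if value <= exceeds:
            return "Exceeds Expectations"
        if value <= meets:
            return "Meets Expectations"
    return "Below Expectations"
```

For example, $185 per case against the $200/$150 standard rates "Meets Expectations," and a 92% interpretation rate against 75%/90% rates "Exceeds Expectations."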
Step 4. Use measures to inform decisions
An iterative cycle:
- Analyze data: trends over time; comparison across sites; comparison to benchmark
- Interpret results: why? what does it mean? do the measures reflect performance accurately?
- Make program adjustments: how can we improve?
- Make measure adjustments
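The analysis step can be sketched concretely. The site names, quarterly rates, and benchmark below are made-up numbers purely for illustration:

```python
# Quarterly interpretation rates by site (hypothetical numbers).
rates = {
    "Site A": [0.70, 0.72, 0.74],
    "Site B": [0.82, 0.81, 0.83],
    "Site C": [0.55, 0.75, 0.93],
}
benchmark = 0.75  # the "meets expectations" standard

summary = {}
for site, history in rates.items():
    summary[site] = {
        "latest": history[-1],
        "trend": history[-1] - history[0],           # change over the period
        "meets_benchmark": history[-1] >= benchmark,
    }

for site, s in summary.items():
    print(f"{site}: latest {s['latest']:.0%}, trend {s['trend']:+.0%}, "
          f"{'meets' if s['meets_benchmark'] else 'below'} benchmark")
```

All three comparisons the slide names appear here: trend over time (the change within each site), comparison across sites (the per-site summary), and comparison to benchmark (the meets/below flag).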
Erica uses her measures
Be careful with your interpretation!
Did Site C really improve that much?
- Maybe measurement improved
- Maybe they saw they were being assessed on the measure and put all their emphasis there
- Maybe the county applied more resources there
- Maybe volunteers chipped in
- Maybe ESL classes started
- Maybe the LEP population changed
- Or maybe it actually did improve! What did they do differently?
Program evaluation
Types of Program Evaluations
Process evaluation:
- Is the program operating as intended?
- Identifying improvements or efficiencies
- Stakeholder involvement

Impact evaluation:
- Effect of the program on outcomes (in reference to a counterfactual)
Process Evaluation
Often in medias res, while the program is operating.
Objectives:
- Does our program run how we expect it to?
- What are stakeholder perspectives, and are they involved as they should be?
- Are there improvements we can make?
Methods (qualitative and quantitative):
- Focus groups, interviews
- Surveys
- Program data
Impact Evaluation
Often done once the program is established and you need to assess effectiveness.
Objectives:
- Did our program meet its outcome goals?
- Relative to some baseline, how did our program impact recipients?
Methods (quantitative):
- Surveys and program data
- Statistical analysis
Impact Evaluation
Degrees of rigor:
- Randomized controlled trials
- Quasi-experimental designs
- Regression modeling
Often done by outside evaluators, who are independent and objective.
Example Impact Evaluation Designs
- Pre/post
- Pre/post with a control or comparison group
- Interrupted time series
- Regression discontinuity design
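The pre/post-with-comparison-group design can be illustrated with a difference-in-differences calculation. The satisfaction rates below are made-up numbers; a real evaluation would use case-level data and regression modeling with controls:

```python
# Hypothetical mean client-satisfaction rates before and after a program change.
program = {"pre": 0.62, "post": 0.80}     # courts that adopted the new service
comparison = {"pre": 0.60, "post": 0.64}  # similar courts that did not

# Change in each group over the same period.
program_change = program["post"] - program["pre"]
comparison_change = comparison["post"] - comparison["pre"]

# The comparison group's change stands in for what would have happened anyway
# (the counterfactual); the difference of the two differences is the estimate.
impact_estimate = program_change - comparison_change

print(f"Estimated program impact: {impact_estimate:+.0%} points of satisfaction")
```

The comparison group is what distinguishes this from a plain pre/post design: without it, the whole 18-point gain would be attributed to the program, even though satisfaction was also drifting up where nothing changed.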
Are you ready to evaluate?
- What is your program maturity? → Beyond creating
- Do you have the internal capacity or funds? → Yes
- Do you know how you will use the results? → Yes
- Are you committed? → Yes
What type of evaluation should you do?
- WHAT do you want to know?
- WHO needs to know it?
- HOW many resources do you have?
- WHEN do you need to have answers?
- Do you need EXTERNAL VALIDATION and INDEPENDENCE?
Where to begin?
1. Identify your evaluation questions
2. Identify your broad approach (process vs. impact)
3. Identify an evaluator
4. Get stakeholder buy-in
5. Plan for how you will disseminate and use the results
Questions?
Robin Ghertner
robinghertner@gmail.com
917-613-8442