HMIS as a Tool to Measure Performance of Programs
September 18-19, 2006 – Denver, Colorado
Sponsored by the U.S. Department of Housing and Urban Development
Barbara Ritter, Michigan
Darlene Joseph Mathews, Washington, DC
Tom Albanese, Columbus, OH

Overview
Learning Objectives
Performance Measurement Models
–Michigan
–Washington, DC
–Columbus, OH

Workshop Learning Objectives
Participants will understand which data elements within an HMIS correlate to performance measurement, how to set up measurement systems, and how to monitor the effectiveness of those systems.
Participants will understand how to operate an HMIS to measure client outcomes and program performance and to monitor system-wide effectiveness.

Measurement: An Ongoing Collaboration
Barbara Ritter, Michigan Statewide HMIS Project Director, MI Coalition Against Homelessness

Michigan Statewide HMIS
Funding Organizations
Program Leadership
Staff who collect information and enter data

Selecting What to Measure…
Mix of measures:
–Process Indicators: e.g., time to referral or placement in housing
–Short-term or Intermediate Objectives, which measure critical processes: e.g., improved income at discharge; linked with needed supportive services
–Outcomes, which measure sustained change: e.g., stable housing; stable employment

Coordinating the Process of Measurement
How do we know what success looks like?
–Client characteristics / who is served
–When measurement occurs
–Establishing baselines
–Benchmarking / comparing performance
–Improving performance
–The environment / contextual variables
–Evaluating outcomes

Defining the Population: Who Is Served
Developing shared screening questions. Data Standards plus:
–Have they had a lease in their own name?
–Were they homeless as children?
–What is their housing history?
–What is their education history?
–What is their employment history?
–What is their health history?
–Are there disabilities of long duration?

When Measurement Occurs
Intermediate Outcomes
–Measured during care or at discharge
–Reflect stable and optimized program processes
–Customer satisfaction
Follow-up Outcomes
–Measured for some period of time after discharge
–Were changes achieved in care sustained?
–Did we impact other long-term primary outcomes?

Baselines & Targets
Critical for evaluating change and optimizing performance
–Realistic, honest targets
–Stabilized processes: percentiles are consistent across time
–Systematic and routine measurement: before, during, and after program change

Benchmarking
Working with “like” programs to:
–Identify realistic targets
–Identify performance that is off the norm
–Share ideas to improve
–Address measurement problems
–Build transparency

The Context of Services
Lack of affordable housing
Vacancy rates
Unemployment rates
Lack of access to supporting services
–Long waiting lists
–No insurance or pay source
Barriers to success or community assets

What Will the Measurement Project Do?
1. Support the development of initial measures (indicators, objectives, and outcomes) and targets through meetings with funders, program leadership, and staff.
2. Provide a menu of measures for program leadership to select from.

What Will the Measurement Project Do? (cont.)
3. Provide reports on all selected measures that can be run monthly at the click of a button. The reports will aggregate data viewable through the HMIS at the program, agency, CoC, and statewide levels.
4. Convene routine benchmarking meetings to discuss measurement issues as well as performance ideas.

What Will the Measurement Project Do? (cont.)
5. Maintain, with local support, a database of contextual variables.
6. Share and problem-solve measurement issues; support the evolution of measures.
7. Provide data for statewide analysis as defined in contracts.

Measurement: An Example from Our Capital
Darlene Mathews, The Community Partnership for the Prevention of Homelessness (TCP), Washington, DC

The Community Partnership
Since 1999, the Community Partnership has used various databases and software to manage data
One of the first “Tier 1” agencies to pilot HMIS
Today our HMIS includes over 200 programs, more than 300 users, and over 5,000 clients entered

The Washington Cumulative Monthly Report
Purpose: to create a standard reporting tool that was easy to use and addressed multiple performance indicators
Our first foray into performance measurement through HMIS
Used from 2001 to 2004
A comprehensive custom report created by Bowman Internet Systems
Captured basic demographic information, education, employment, and disabilities

Method
TCP ran the report and sent it to providers monthly
TCP set general standards for each provider based on their scope of work
Programs were required to review the data and either sign off on its accuracy relative to scope-of-work goals or make corrections
Supplemented qualitative monthly reporting deliverables

Benefits of Using the Washington Cumulative Monthly Report
The report was accessible to everyone: providers could use the report, run it independently, and monitor their own programs
Providers got into the habit of using HMIS as a data collection tool
Informed other reports that we were required to complete

Drawbacks of Using the Washington Cumulative Monthly Report
Costliness: over time, as ServicePoint was updated, we had to pay extra to update this report
Reporting weaknesses: it could aggregate program information but could not break down results by program
Changing needs: our needs became more complex over time

SuperNOFA Measurement Method
Purpose: to create a comprehensive performance measurement method to inform the funding and ranking process
Introduced in 2005
Compares “like to like” programs
Sets benchmarks for program outcomes

Method
We organized our renewal programs into “like to like” groupings for comparison
We set benchmarks for outcomes slightly above the HUD standards outlined in the NOFA
We exported all raw data into Excel to perform calculations and analysis
Each program was given a quality score based on three measures:
1. Occupancy
2. Change in Income
3a. Destinations at Exit, or
3b. Stability in the Program (for permanent housing programs)

Method (cont.)
Occupancy Score: based on client entry and exit information
–Calculation: number of occupied bed nights / total number of bed nights in the period
Change in Income: based on cash and non-cash benefits tracked through HMIS
–Calculation: change in the number of income sources for a client after entry / total clients served
Destination: based on the Exit Sheet
–Calculation: positive destinations / total destinations
Stability: based on the Entry and Exit Sheets
–Calculation: number of clients with a length of stay > 6 months / number of clients in the program
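
The four calculations above are simple ratios over entry/exit records. A minimal sketch in Python, assuming a hypothetical flat export of client stays (TCP did this work in Excel); the Stay fields and function names are illustrative, not the actual HMIS export schema:

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class Stay:
    entry: date
    exit: Optional[date]                  # None = still enrolled
    income_sources_entry: int             # cash and non-cash sources combined
    income_sources_latest: int
    positive_destination: Optional[bool]  # None until the client exits

def occupancy_score(stays, period_start, period_end, beds):
    """Occupied bed nights / total bed nights in the period."""
    total_bed_nights = beds * (period_end - period_start).days
    occupied = sum(
        max((min(s.exit or period_end, period_end)
             - max(s.entry, period_start)).days, 0)
        for s in stays
    )
    return occupied / total_bed_nights

def income_score(stays):
    """Change in the number of income sources after entry / total clients served."""
    gained = sum(s.income_sources_latest - s.income_sources_entry for s in stays)
    return gained / len(stays)

def destination_score(stays):
    """Positive destinations / total destinations (exited clients only)."""
    exited = [s for s in stays if s.positive_destination is not None]
    return sum(s.positive_destination for s in exited) / len(exited)

def stability_score(stays, as_of):
    """Clients with a length of stay > 6 months / clients in the program."""
    long_stays = sum(((s.exit or as_of) - s.entry).days > 182 for s in stays)
    return long_stays / len(stays)
```

For example, occupancy_score(stays, date(2006, 1, 1), date(2006, 7, 1), beds=50) would score a 50-bed program over the first half of 2006.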

Method (cont.)
Each program was given a score for each of the indicators
“Like to like” programs were compared and ranked
This process was also used to inform the funding of our District programs, and it was used again this year for the NOFA ranking process

Benefits of the NOFA Measurement System
A fair and balanced way of comparing “like to like” programs
We were able to change benchmarks and thresholds for various levels of the Continuum
It can give you a good idea of how an individual program is performing, as well as an aggregate snapshot of multiple programs

Drawbacks of the NOFA Measurement System
A very time-intensive task
Still a need for the more frequent quality control that a monthly or quarterly report can provide

Where We Are Today
We took a step back and analyzed the needs of our entire system
We applied for an HMIS grant
We developed an HMIS work plan to organize our short- and long-term reporting goals
We developed an HMIS team to make the work plan a reality
–Individuals were assigned tasks and responsibilities
–The team included consultants, the HMIS Coordinator, policy analysts, and federal and District contract managers

Where We Are Today (cont.)
Organization: we took time to formulate and structure the types of queries and reports that we need to develop for ongoing reporting and measurement
Timelines: we identified the manner in which we want our data presented, the time frame for producing it, and the steps for developing it
Technical assistance: we hired a consultant who conducts monthly HMIS user trainings, with topics ranging from basic, intermediate, and advanced use to reporting capabilities and better data-entry practices

One Size Does Not Fit All
Custom design: we are creating and redesigning assessments for special areas of our Continuum, such as Central Family Intake and outreach programs, so that HMIS better meets their needs

Using Reporting Tools to Develop New Reporting Methods
TCP will develop new custom reports, using the advanced reporting tool, to conduct performance measurement on a quarterly basis
To reduce the time-intensive aspect of performance measurement, we plan to use the scheduling function to run a report on:
–HUD-required data elements
–Length of stay, occupancy, and income
–The report could be run in the aggregate and broken out by program
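
The scheduling itself lives inside the advanced reporting tool, but the "aggregate plus per-program breakout" shape of the planned report is easy to sketch. A minimal illustration in Python, assuming stays arrive as dictionaries with a "program" key and measures are supplied as callables; all names here are hypothetical:

```python
from collections import defaultdict

def quarterly_report(stays, measures):
    """measures: mapping of measure name -> callable(list of stays) -> value."""
    by_program = defaultdict(list)
    for s in stays:
        by_program[s["program"]].append(s)
    # One row for all programs combined, then one row per program.
    report = {"ALL PROGRAMS": {name: fn(stays) for name, fn in measures.items()}}
    for program, rows in sorted(by_program.items()):
        report[program] = {name: fn(rows) for name, fn in measures.items()}
    return report

# Example usage with two illustrative measures:
# quarterly_report(stays, {
#     "clients served": len,
#     "avg length of stay": lambda rows: sum(r["days"] for r in rows) / len(rows),
# })
```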

Measurement: Program Evaluation that Supports System Analysis
Tom Albanese, Community Shelter Board, Columbus, Ohio

Community Shelter Board
The Community Shelter Board (CSB) was created in 1986 to respond to growing homelessness in Franklin County.
A non-profit intermediary:
–Funder: shelter, supportive housing, and related services
–Planning & evaluation: Continuum of Care, system, funder collaborative, HMIS
–Coordination of services

CSB HMIS
Since 1990, CSB has collected data on persons accessing shelter: characteristics and utilization
Current:
–16 agencies, 47 programs
–Coverage: 95% of all shelter beds; all supportive housing developed in the last 6 years
–33 required data elements; varies by program type
2006-07: HMIS expansion to include all HUD-funded homeless programs
2007-08: HMIS upgrade

HMIS Data Quality Assurance
Program data is required to meet QA standards:
–Timeliness: all required data elements entered by the 4th working day of the month
–Completeness: 95% of all clients reported for each required data element (<5% not reported/null)
–Accuracy: congruent with program type, population served, capacity, etc.; matches the agency client record (e.g., entry/exit dates match)
–Consistency: consistent with past program performance/outcomes
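
The completeness standard is mechanical enough to check automatically. A minimal sketch in Python, assuming client records arrive as dictionaries keyed by data-element name; the 0.95 threshold mirrors the standard above, but the element names and record layout are hypothetical:

```python
REQUIRED_ELEMENTS = ["date_of_birth", "gender", "prior_residence", "exit_destination"]
COMPLETENESS_THRESHOLD = 0.95  # <5% not reported/null allowed per element

def completeness(clients):
    """Share of clients with a non-null value, per required data element."""
    n = len(clients)
    return {elem: sum(1 for c in clients if c.get(elem) not in (None, "")) / n
            for elem in REQUIRED_ELEMENTS}

def failing_elements(clients):
    """Required data elements that miss the 95% completeness standard."""
    return [elem for elem, share in completeness(clients).items()
            if share < COMPLETENESS_THRESHOLD]
```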

Performance Measures
Benchmarks set by the CSB Board of Trustees
Included in annual agency contracts
–Establish an annual Program Outcomes Plan (POP) for each program
HMIS data reporting and evaluation methodologies:
–Daily shelter utilization monitoring
–Quarterly performance monitoring
–Annual performance evaluation
–Strategic planning
–Community reports

Outcome Measure Methodology: Example
Successful Housing Outcome: Tier I Adult Emergency Shelter
Source: Homeless Census Report (1)
Defined: the number of distinct household exits with a “Permanent” or “Transitional” housing exit, excluding exits to family or friends. Refer to the Housing Outcomes Appendix for a list of destinations and their correlation to housing and shelter outcomes.
Calculated: the number of households served that exited with a successful housing outcome (based on the last exit) / the total number of distinct households served that exited the program.
(1) The Homeless Census Report is a standard CSB report produced using Crystal Reports.
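
The definition above pins down both the numerator and the "last exit" rule. A minimal sketch in Python, assuming a hypothetical flat list of household exits; the destination labels are illustrative stand-ins for the Housing Outcomes Appendix categories:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class HouseholdExit:
    household_id: str
    exit_date: date
    destination: str  # e.g. "Permanent", "Transitional", "Family/Friends", "Shelter"

def successful_housing_rate(exits):
    # Keep only the last exit per distinct household.
    last = {}
    for e in exits:
        kept = last.get(e.household_id)
        if kept is None or e.exit_date > kept.exit_date:
            last[e.household_id] = e
    # Count "Permanent" or "Transitional" exits; exits to family or friends
    # are a separate category here, so they are excluded automatically.
    successes = sum(1 for e in last.values()
                    if e.destination in ("Permanent", "Transitional"))
    return successes / len(last)
```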

Performance Measures by Program Type
Program types: Prevention, Outreach, Emergency Shelter, Resource Specialists, Direct Housing, Permanent Supportive Housing
Measures:
–Number Served
–Successful Housing Outcomes
–Average Length of Stay
–Recidivism
–Movement
–Successful Income Outcomes (Tier II only)
–Direct Client Assistance
–Utilization
–Occupancy (Tier II only)
–Housing Stability
–Housing Retention

Program Outcomes Plan: Emergency Shelter Example
Measure: Semi-Annual Goal
–Households Served (#): 600
–Average Length of Stay per Household: 30 days
–Successful Housing Outcomes (#): 75
–Successful Housing Outcomes (%): 15%
–Recidivism (%): <10%
–Access to resources to avoid shelter admission and stabilize housing: pass certification
–Basic needs met in a secure, decent environment: pass certification
–Ongoing engagement with the neighborhood: pass certification
–Efficient use of a pool of community resources: CSB costs per household consistent with CSB budget

Quarterly Indicator Reports
Key performance measures: system & program
Issued to the CSB Board of Trustees & Continuum of Care Steering Committee
Posted to www.csb.org

Annual Program Evaluation
Completed based on the first half of the FY, for use in the following FY's funding determination
Program outcomes compared to planned outcomes
Data mostly derived from HMIS
Programs are scored as:
–High: no more than one outcome not achieved
–Medium: half or more achieved
–Low: less than half achieved
Long-standing, unresolved issues can also lower a rating
Issued to the CSB Board of Trustees and funders
Posted to www.csb.org
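
The three scoring tiers reduce to two thresholds on the count of achieved outcomes. A minimal sketch, assuming outcomes are reported as booleans and reading "High" as "no more than one outcome missed" (the ordering of the tiers supports this reading); the qualitative-issues adjustment noted above remains a human judgment:

```python
def program_rating(outcomes_achieved):
    """outcomes_achieved: one boolean per planned outcome in the program's POP."""
    met, total = sum(outcomes_achieved), len(outcomes_achieved)
    if total - met <= 1:      # High: no more than one outcome not achieved
        return "High"
    if met >= total / 2:      # Medium: half or more achieved
        return "Medium"
    return "Low"              # Low: less than half achieved
```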

Strategic Planning
Rebuilding Lives Initiative
–1997 community “charge” to assess homeless services for single men impacted by downtown economic development
Data sources:
–Analysis of CSB MIS data
–Comprehensive Community Needs Assessment
–Analysis of best practices
–Review of national model programs
Rebuilding Lives Updated Strategy

Rebuilding Lives Typology Study
CSB undertook an analysis of shelter stays that mirrored the methodology developed by Dennis Culhane, University of Pennsylvania:
–Used CSB MIS data for 7,944 men using the shelter system from 1994 to 1996
–Developed 3 “clusters” that grouped men based on cumulative length of shelter stay and number of shelter stays
–Analyzed shelter usage for each cluster
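
The slide names the two clustering features but not the algorithm. An illustrative sketch, assuming per-client totals of cumulative shelter days and stay counts are already computed, and using k-means (the method behind Culhane's published shelter typologies) as a plausible stand-in:

```python
import numpy as np
from sklearn.cluster import KMeans

def cluster_shelter_users(cumulative_days, stay_counts, k=3, seed=0):
    """Group clients into k clusters by (cumulative shelter days, number of stays)."""
    X = np.column_stack([cumulative_days, stay_counts]).astype(float)
    X = (X - X.mean(axis=0)) / X.std(axis=0)  # standardize so days don't dominate
    return KMeans(n_clusters=k, n_init=10, random_state=seed).fit_predict(X)
```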

Community Reports
CSB Annual Report
Rebuilding Lives updates
2005 Community Report on Homelessness
–Emergency shelter system (1995-2005)
–Point-in-time count (unsheltered/sheltered)

Impact of HMIS Data & Performance Measurement
Communicates a more accurate description of the population (vs. individual, anecdotal needs)
Creates a higher level of understanding of the community problem (builds community support)
Informs resource allocation (annual funding)
Informs program development/improvement (CQI)
Informs policy options (more shelter vs. housing)

Presenters
Tom Albanese, L.S.W.
Director of Programs & Planning
Community Shelter Board
115 West Main Street, LL
Columbus, Ohio 43215
(P) 614.221.9195
(F) 614.221.9199
talbanese@csb.org

Darlene Joseph Mathews
Policy Analyst
The Community Partnership
801 Pennsylvania Avenue, SE, Suite 360
Washington, DC 20003
(P) 202-543-5298 x115
(F) 202-543-5653
dmathews@community-partnership.org

Barbara Ritter
Project Director
Michigan Statewide HMIS
PO Box 13037
Lansing, MI 48933
(P) (517) 485-6536 ext. 12
(F) (517) 485-6682
britter@mihomeless.org