Robert McKown, CIRS, Director of Evaluation and Accountability; Sherri Vainavicz, CIRS, Manager, UW's 2-1-1. Heart of West Michigan United Way, Grand Rapids, Michigan.
Understand and utilize quality assurance measures for managing and strengthening I&R services.
What does your I&R service provide to your community? What is the "impact"? How do you know this?
"…the systematic collection of information about the activities, characteristics, and outcomes of programs to make judgments about the program, improve effectiveness, and/or inform decisions about future programming" (Source: Salvatore Alaimo, PhD – Grand Valley State University)
Evaluation is:
- A means for organizational learning
- Time and effort well spent ensuring:
  - The effectiveness of programs
  - The organization's ability to adapt to a changing environment
(Source: Salvatore Alaimo, PhD – Grand Valley State University)
Evaluation is not:
- An episodic event, but an ongoing development process
- Something you do only to satisfy a funder
- Something you do only to promote your work
- A test or a punishment
(Source: Salvatore Alaimo, PhD – Grand Valley State University)
Accountability to the public and funding entities
- Outcomes – benefits or changes for program participants
- Outputs – direct products (summations, volume) of program activities
- Activities – what the program does
- Inputs – all of the resources necessary to deliver the program
(Source: Salvatore Alaimo, PhD – Grand Valley State University)
Counseling, mentoring, feeding, sheltering, building, entertaining, educating – all activities in service of the MISSION. (Source: Salvatore Alaimo, PhD – Grand Valley State University)
1. Follow-up
2. Agency feedback
3. Secret shopper
4. Database quality checks
5. Call accounting data – abandonment, average time to answer, average time on call, call trends (scheduling)
6. Monitoring calls
Quality assurance tools can work together. No single tool measures or demonstrates all of the components of the I&R program; a gap or question identified by one tool may be filled or answered, affirmed or contradicted, by another tool in your quality assurance toolkit.
The I&R service has a process for examining its viability as an organization, the effectiveness of its services, its appropriate involvement in the community and its overall impact on the people it serves. (Standard 29 – Quality Indicator 1)
…method for tracking call volume, average speed of answer, abandoned calls, average call handling time and incoming call patterns (Standard 29 – Quality Indicator 2)
…creates internal reports to assess operational effectiveness (Standard 29 – Quality Indicator 3)
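The Quality Indicator 2 metrics above lend themselves to a quick computation. Below is a minimal sketch, assuming hypothetical CallRecord fields exported from a phone system log; none of these names come from the presentation or from any specific phone system's API.

```python
from collections import Counter
from dataclasses import dataclass

@dataclass
class CallRecord:
    # Hypothetical fields pulled from a phone-system log.
    hour: int                 # hour of day the call arrived (0-23)
    seconds_to_answer: float  # wait before a specialist answered
    handle_seconds: float     # talk time; 0 if never answered
    abandoned: bool           # caller hung up before being answered

def call_accounting_summary(calls: list[CallRecord]) -> dict:
    """Volume, abandonment rate, average speed of answer, average
    handle time, and incoming call patterns by hour."""
    answered = [c for c in calls if not c.abandoned]
    if not answered:
        raise ValueError("need at least one answered call")
    return {
        "call_volume": len(calls),
        "abandonment_rate": sum(c.abandoned for c in calls) / len(calls),
        "avg_speed_of_answer": sum(c.seconds_to_answer for c in answered) / len(answered),
        "avg_handle_time": sum(c.handle_seconds for c in answered) / len(answered),
        # Incoming call pattern by hour, useful for scheduling decisions.
        "calls_by_hour": Counter(c.hour for c in calls),
    }
```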
…conducts an annual evaluation of I&R activities (including the resource database and website) that involves inquirers, service providers… (Standard 29 – Quality Indicator 4)
The I&R conducts regular customer satisfaction surveys (Standard 29 – Quality Indicator 5)
The I&R service involves inquirers, service providers and others… in the evaluation process; and modifies the program in response to evaluation… (Standard 29 – Quality Indicator 6)
Follow-up: telephone calls to I&R inquirers to gather information about their 2-1-1 experience. Follow-up allows for evaluating program effectiveness, and the results are used to make better strategic decisions about service delivery.
- Operations – strengths of service delivery; areas for service delivery improvement
- Outcomes – benefits to callers
- Caller satisfaction – subjective level of satisfaction with service
Benchmarks:
- Operations – 80% will report being provided with appropriate referrals
- Outcomes – 70% will report they received requested services from the referral agency
- Caller satisfaction – 92% will report that they are satisfied or very satisfied with the I&R services they received
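As an illustration of how these benchmarks could be checked, here is a minimal sketch over follow-up survey responses. The dictionary field names and the yes/no coding are assumptions for illustration, not the presenters' actual instrument.

```python
def benchmark_report(responses: list[dict]) -> None:
    """Compare follow-up responses against the three benchmark rates."""
    # The three targets from the slide; field names are hypothetical.
    benchmarks = [
        ("appropriate_referral", 0.80, "provided appropriate referrals"),
        ("received_services",    0.70, "received requested services"),
        ("satisfied",            0.92, "satisfied or very satisfied"),
    ]
    if not responses:
        return
    n = len(responses)
    for field, target, label in benchmarks:
        rate = sum(1 for r in responses if r.get(field)) / n
        status = "met" if rate >= target else "not met"
        print(f"{label}: {rate:.0%} (target {target:.0%}) {status}")
```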
- Demographic data – describes the overall characteristics of the callers followed up on: age, gender, city, nature of request, etc.
- Quantitative data – frequencies, percentages, means/averages
- Qualitative data – responses to open-ended questions
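A minimal sketch of those three kinds of summaries over the same follow-up records; the field names (gender, city, age, comments, and so on) are illustrative assumptions drawn from the categories the slide lists.

```python
from collections import Counter
from statistics import mean

def describe_followups(records: list[dict]) -> None:
    n = len(records)
    # Demographic data: frequency and percentage for each category.
    for field in ("gender", "city", "nature_of_request"):
        print(field)
        for value, count in Counter(r[field] for r in records).most_common():
            print(f"  {value}: {count} ({count / n:.0%})")
    # Quantitative data: means/averages for numeric fields such as age.
    print(f"mean age: {mean(r['age'] for r in records):.1f}")
    # Qualitative data: open-ended responses kept for manual review.
    comments = [r["comments"] for r in records if r.get("comments")]
    print(f"{len(comments)} open-ended responses to review")
```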
- Identification of community service gaps
- Identification of incorrect or outdated agency/database information
- Identification of reasons callers are not receiving services
- Identification of I&R program strengths and potential staff training needs
Agency feedback: a questionnaire mailed to a sample of community agencies to learn about their perceptions of, and experience with, the I&R program.
- Accuracy of referrals
- The perception and experience of the I&R program from the agencies' perspective
Process:
1. A survey link is mailed to 20% of the local agencies in the database
2. Agencies are asked to track the referral source for new clients for one month and to identify those referred by the I&R program (previous surveys)
3. Agencies complete the survey
4. The results are analyzed
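Step 1's 20% sample could be drawn in many ways; here is a minimal sketch assuming a simple random sample. The function, the seed, and the sampling method are illustrative, not the presenters' documented procedure.

```python
import random

def draw_agency_sample(agency_ids: list[str], fraction: float = 0.20,
                       seed: int = 2013) -> list[str]:
    # Simple random sample of agencies for the survey mailing.
    # Only the 20% figure comes from the presentation.
    rng = random.Random(seed)  # fixed seed makes the draw reproducible
    k = max(1, round(len(agency_ids) * fraction))
    return rng.sample(agency_ids, k)
```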
Silent call monitoring: observation of I&R calls to determine and measure how well established call standards and elements are met.
Measures the quality of the I&R communication, whether essential elements were completed, familiarity with the phone system and database, and the general performance of the I&R specialist.
- Identifies best practices and strengths
- Identifies gaps in knowledge about community resources and other areas for staff development
Process:
1. Callers are provided a message that their conversation may be monitored
2. The I&R manager logs on to listen to calls on an I&R specialist's phone extension, or listens to recorded calls
3. The I&R manager listens and records which call elements were completed during the call
4. The I&R manager shares the observations with the I&R specialist
5. The I&R manager and team look for trends to identify strengths or gaps
Benchmark: an average silent monitoring score of 80% of the possible total (88 out of a possible 110 points), with 1% of calls monitored.
Result: an average silent monitoring score of 91 (83% of the possible total score).
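To make the scoring arithmetic concrete, here is a sketch of a 110-point silent monitoring rubric. The element names and point weights are invented placeholders; only the 110-point total and the 80% (88-point) benchmark come from the slides.

```python
# Hypothetical rubric: call element -> points (sums to 110, as on the slide).
CALL_ELEMENTS = {
    "greeting": 10,
    "assessment_of_need": 25,
    "appropriate_referrals": 30,
    "offer_of_follow_up": 15,
    "closing": 10,
    "system_and_database_use": 20,
}
TOTAL_POSSIBLE = sum(CALL_ELEMENTS.values())  # 110

def score_call(completed_elements: set[str]) -> int:
    # Sum the points for elements the monitor marked as completed.
    return sum(pts for elem, pts in CALL_ELEMENTS.items()
               if elem in completed_elements)

def average_report(scores: list[int]) -> None:
    avg = sum(scores) / len(scores)
    # An average of 91 points is 91/110, about 83% of the possible total,
    # versus the 80% benchmark (88 of 110 points).
    print(f"average {avg:.0f}/{TOTAL_POSSIBLE} = {avg / TOTAL_POSSIBLE:.0%}")
```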
Describe a change in policy or procedure in your program that was based on evaluation. What was measured and what was the change?
Examples from our program:
- Agency presentations and site visits
- Schedules adjusted to ensure the right number of staff at the right time
- Increased silent monitoring to gain a more objective measure, in response to agency survey input that referrals were not as accurate as desired
- Added temporary staff to update the resource database
- Hired someone with bilingual skills when filling a vacant position
- Found additional resources for staff
Dashboard – identify:
- Strengths
- Gaps
- Next steps
- Solutions
Finding an evaluator:
- American Evaluation Association (AEA) evaluator search – http://www.eval.org/find_an_evaluator/evaluator_search.asp
- Local affiliates of AEA
- Michigan Association for Evaluation
- Local colleges and universities
(Source: Salvatore Alaimo, PhD – Grand Valley State University)
41
5/23/2013 41 Robert McKown Sr. Director of Evaluation & Accountability (616) 752-8639 rmckown@hwmuw.org Sherri Vainavicz 2-1-1 Program Manager (616) 752-86341 svainavicz@hwmuw.org For More Information Contact Thank you!