Presentation transcript:

Robert McKown, CIRS, Director of Evaluation and Accountability, and Sherri Vainavicz, CIRS, Manager, Heart of West Michigan United Way’s 2-1-1, Grand Rapids, Michigan

Understand and utilize quality assurance measures for managing and strengthening I&R services.


 What does your I&R service provide your community?  What is the “impact”?  How do you know this?

“…the systematic collection of information about the activities, characteristics, and outcomes of programs to make judgments about the program, improve effectiveness, and/or inform decisions about future programming” Source: Salvatore Alaimo, PhD – Grand Valley State University

 A means for organizational learning  Time and effort well spent ensuring: o The effectiveness of programs o The organization’s ability to adapt to a changing environment Source: Salvatore Alaimo, PhD – Grand Valley State University

Evaluation is not:  An episodic event, but an ongoing development process  Something you do only to satisfy a funder  Something you do only to promote your work  A test or a punishment Source: Salvatore Alaimo, PhD – Grand Valley State University

Accountability to the public and funding entities

 Outcomes – benefits or changes for program participants  Outputs – direct products (summations/volume) of program activities  Activities – what the program does  Inputs – all of the resources necessary to deliver the program Source: Salvatore Alaimo, PhD – Grand Valley State University

Counseling, mentoring, feeding, sheltering, building, entertaining, educating → MISSION Source: Salvatore Alaimo, PhD – Grand Valley State University

 Follow-up  Agency feedback  Secret shopper  Database quality checks  Call accounting data (abandonment, average time to answer, average time on call, call trends/scheduling)  Monitoring calls

Quality assurance tools can work together. No one quality assurance tool measures or demonstrates all of the components of the I&R program. A gap or question identified by one tool may be filled, answered, affirmed, or contradicted by another tool in your quality assurance tool kit.


 The I&R service has a process for examining its viability as an organization, the effectiveness of its services, its appropriate involvement in the community and its overall impact on the people it serves. (Standard 29 – Quality Indicator 1)

 …method for tracking call volume, average speed of answer, abandoned calls, average call handling time and incoming call patterns (Standard 29 – Quality Indicator 2)  …creates internal reports to assess operational effectiveness (Standard 29 – Quality Indicator 3)
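To make these call accounting measures concrete, here is a minimal sketch of how call volume, abandonment rate, average speed of answer and average call handling time could be computed from raw call records. The field names and sample records are assumptions for illustration, not the export format of any particular phone system.

```python
from statistics import mean

# Illustrative call records; field names are assumptions for this sketch,
# not the export format of a real phone system.
calls = [
    {"answered": True,  "seconds_to_answer": 12,   "talk_seconds": 310},
    {"answered": True,  "seconds_to_answer": 45,   "talk_seconds": 540},
    {"answered": False, "seconds_to_answer": None, "talk_seconds": 0},
    {"answered": True,  "seconds_to_answer": 8,    "talk_seconds": 420},
]

call_volume = len(calls)
answered = [c for c in calls if c["answered"]]

abandonment_rate = (call_volume - len(answered)) / call_volume
average_speed_of_answer = mean(c["seconds_to_answer"] for c in answered)
average_handle_time = mean(c["talk_seconds"] for c in answered)

print(f"Call volume:             {call_volume}")
print(f"Abandonment rate:        {abandonment_rate:.1%}")
print(f"Average speed of answer: {average_speed_of_answer:.0f} sec")
print(f"Average call handling:   {average_handle_time:.0f} sec")
```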

 …conducts an annual evaluation of I&R activities (including the resource database and website) that involves inquirers, service providers… (Standard 29 – Quality Indicator 4)  The I&R service conducts regular customer satisfaction surveys (Standard 29 – Quality Indicator 5)

 The I&R service involves inquirers, service providers and others… in the evaluation process; and modifies the program in response to evaluation… (Standard 29 – Quality Indicator 6)


 Telephone calls to I&R inquirers to gather information about their experience  Allows for evaluating program effectiveness  Results are used to make better strategic decisions about service delivery


Operations – strengths of service delivery and areas for improvement; Outcomes – benefits to callers; Caller Satisfaction – subjective level of satisfaction with service

Operations – 80% will report being provided with appropriate referrals; Outcomes – 70% will report they received requested services from the referral agency; Caller Satisfaction – 92% will report that they are satisfied or very satisfied with the I&R services they received
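One way to picture how follow-up results could be checked against these targets is the short sketch below. The target percentages come from the slide above; the observed values and variable names are hypothetical.

```python
# Targets from the slide above; the observed values are hypothetical examples.
targets = {
    "Operations (appropriate referrals)":                0.80,
    "Outcomes (received requested services)":            0.70,
    "Caller Satisfaction (satisfied or very satisfied)": 0.92,
}
observed = {
    "Operations (appropriate referrals)":                0.84,
    "Outcomes (received requested services)":            0.66,
    "Caller Satisfaction (satisfied or very satisfied)": 0.93,
}

for measure, target in targets.items():
    actual = observed[measure]
    status = "target met" if actual >= target else "target NOT met"
    print(f"{measure}: {actual:.0%} vs {target:.0%} -> {status}")
```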

Demographic Data – describes the overall characteristics of callers followed up on (age, gender, city, nature of request, etc.); Quantitative Data – frequencies, percentages, means/averages; Qualitative Data – responses to open-ended questions
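The quantitative side of this analysis (frequencies, percentages and means) can be sketched as follows; the follow-up records and field names are invented for illustration.

```python
from collections import Counter
from statistics import mean

# Hypothetical follow-up records; the fields mirror the demographic data
# described above, but the values are invented.
responses = [
    {"age": 34, "gender": "F", "city": "Grand Rapids", "request": "Housing"},
    {"age": 61, "gender": "M", "city": "Wyoming",      "request": "Utilities"},
    {"age": 47, "gender": "F", "city": "Grand Rapids", "request": "Housing"},
    {"age": 29, "gender": "F", "city": "Kentwood",     "request": "Food"},
]

# Frequencies and percentages for a categorical field (nature of request).
request_counts = Counter(r["request"] for r in responses)
for request, count in request_counts.items():
    print(f"{request}: n={count} ({count / len(responses):.0%})")

# Mean / average for a numeric field (age).
print(f"Mean age of callers followed up on: {mean(r['age'] for r in responses):.1f}")
```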

 Identification of community service gaps  Identification of incorrect or outdated agency/database information  Identification of reasons callers are not receiving services  Identification of I&R program strengths and potential staff training needs


 Questionnaire mailed to a sample of community agencies to learn about their perception of and experience with the I&R program

 Accuracy of referrals  Agencies’ perception of and experience with the I&R program

 Survey link mailed to 20% of the local agencies in the database  Agencies asked to track the referral source for new clients for one month and to identify those referred by the I&R program (previous surveys)  Agencies complete the survey  Analyze the results
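As an illustration of the first step, here is a minimal sketch of drawing a 20% sample of agencies from the resource database. The agency list and the choice of a simple random sample are assumptions; the slide does not say how the 20% is selected.

```python
import random

# Hypothetical agency list pulled from the resource database.
agencies = [f"Agency {n}" for n in range(1, 101)]

# Draw a 20% sample (simple random sampling is an assumption of this sketch).
sample_size = round(len(agencies) * 0.20)
surveyed_agencies = random.sample(agencies, sample_size)

print(f"Sending the survey link to {sample_size} of {len(agencies)} agencies")
```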


 Observation of I&R calls to determine and measure how well established call standards and elements are met

 Quality of the I&R communication, whether essential elements were completed, familiarity with the phone system and database, and general performance of the I&R specialist  Identifies best practices and strengths  Identifies gaps in knowledge about community resources and other areas for staff development

 Callers are provided a message that their conversation may be monitored  I&R manager logs on to listen to calls on an I&R specialist’s phone extension, or listens to recorded calls  I&R manager listens and records which call elements were completed during the call  I&R manager shares the observations with the I&R specialist  I&R manager and team look for trends to identify strengths or gaps

 Average score on silent monitoring of 80% of the possible total score (88 out of a possible 110 points)  1% of calls monitored

 Average score on silent monitoring: 91 (83% of the possible total score)
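The arithmetic behind these monitoring figures is simple enough to sketch; the individual call scores below are hypothetical, while the 110-point scale and the 80% goal come from the preceding slides.

```python
from statistics import mean

MAX_POINTS = 110      # total possible score on the monitoring form
GOAL = 0.80           # goal: average of at least 80% (88 of 110 points)

# Hypothetical scores from monitored calls (roughly 1% of call volume).
monitored_scores = [95, 88, 92, 89, 91]

average_score = mean(monitored_scores)
share_of_possible = average_score / MAX_POINTS

print(f"Average score: {average_score:.0f} ({share_of_possible:.0%} of possible)")
print("Goal met" if share_of_possible >= GOAL else "Goal not met")
```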


Describe a change in policy or procedure in your program that was based on evaluation. What was measured and what was the change?

 Agency presentations and site visits  Schedules adjusted to ensure the right number of staff at the right time  Increased silent monitoring to gain a more objective measure, in response to agency survey input that referrals are not as accurate as desired  Added temporary resource database staff to update the resource database  Hired someone with bilingual skills when filling a vacant position  Found additional resources for staff

 Dashboard o Identify: strengths, gaps, next steps, solutions

 American Evaluation Association (AEA) evaluator search – _an_evaluator/evaluator_search.asp  Local affiliates of AEA  Michigan Association for Evaluation  Local colleges and universities Source: Salvatore Alaimo, PhD – Grand Valley State University

For More Information Contact: Robert McKown, Sr. Director of Evaluation & Accountability, (616); Sherri Vainavicz, Program Manager, (616). Thank you!