Establishing Ongoing Evaluation Activity as a Management Tool for CIT Program Operation
Kimberlee C. Murphy, Ph.D.
Resource Development Institute

Learning Objectives
You will be able to:
1. Identify areas needing assessment.
2. Identify data indicators needed to assess program results.
3. Assess overall effectiveness/outcomes of your CIT program.
4. Understand where evaluation fits into program operation and improvement.

Evaluation Versus Research

Characteristic | Evaluation | Research
Used to … | determine effectiveness or worth | advance knowledge in the field
Inquiry based on … | policy/program interests of stakeholders | intellectual curiosity
Information for … | program monitoring, quality improvement efforts | broad knowledge/theory
Conducted within … | changing setting, actors, priorities, timelines, etc. | controlled setting … or attempt at "controlled"

How Evaluation Fits Into Program Operation
[Cycle diagram: Program Operation → Program Evaluation → Program Revision → back to Program Operation]

Why Spend Time Evaluating
1. Funders often require it.
2. It gives you an objective view of what you're doing that works well.
3. It gives you evidence as you work to get "buy-in" and support for your program.
4. It helps you identify which areas need to be improved, for quality improvement efforts.
5. With data showing effectiveness, you increase agency sustainability by making the program more competitive for grants and other funding.

Two Questions to Guide the Evaluation
1. What do you want to tell an audience about the outcomes of your work?
   - Goals achieved
   - Other results
2. What do you need to know to monitor program operation?
   - Activities carried out: types, how many, how often
   - Resources used, including players involved
   - People served: description, how many

Logic Model

Inputs: Resources, Materials, Equipment, Staff, Stakeholders
Outputs: Trainings, Products, Services, People trained, People served
Outcomes:
- Initial (Learning): Knowledge, Skills, Attitudes, Opinions
- Intermediate (Action): Behaviors, Practices, Social actions
- Long-term (Conditions): Social, Economic, Civic, Environmental

Telling an Audience About the Outcomes
As a result of our CIT program …
1. Police officers are more knowledgeable about mental illness and how to work with individuals with mental illness in crisis.
2. Citizens with mental illness get diverted from the criminal justice system into needed community services.
3. Citizens and law enforcement officers are safer in the field.

Logic Model: Outcomes for Our CIT Program

Outcomes:
- Initial (Learning): knowledge of mental illness; knowledge of community resources; skill in resolving without restraint
- Intermediate (Action): diverting from criminal justice; connection to services
- Long-term (Conditions): safety in field

Monitoring Program Operation
To monitor our CIT program, we need to know …
1. What types of trainings are being offered, and how many?
2. What resources are we using to train (e.g., materials, staff), and what resources do we need that we do not currently have?
3. What types of people are being trained by our program, and how many?

Logic Model: Our CIT Program

Inputs: Trainings for police officers; Training materials; Staff to coordinate; Teachers
Outputs: Trainings conducted; Officers trained; Calls getting CIT officer response
Outcomes:
- Initial (Learning): knowledge of mental illness; knowledge of community resources; skill in resolving without restraint
- Intermediate (Action): diverting from criminal justice; connection to services
- Long-term (Conditions): safety in field

Characteristics of Good Data Indicators
1. Clear definition of what is being measured and how it is coded.
2. Measured the same way across time and across coders.
3. Can show change over time.
4. Does not reach beyond the scope of the program.
5. Data are readily available.

Logic Model: Data Indicators

Inputs:
- Trainings: types, how often, types of trainees
- Materials: types, quantity
- Staff: types, # staff
Outputs:
- Trainings conducted: types, # trainings
- Officers trained: types & # trained, # PDs with trained officers, % officers trained per PD
- Calls getting CIT officer response: # calls w/ 1+ CIT officers responding
Outcomes:
- Knowledge of mental illness (Learning): test score, rating
- Knowledge of community resources (Learning): # trainees w/ resource list
- Skill in resolving w/o restraint (Learning): rating scale of skill level
- Diverting from criminal justice (Action): types & # arrests
- Connection to services (Action): # people starting MH service(s)
- Safety in field (Conditions): # injuries; perceived safety reported by officers
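An indicator table like the one above can be kept in a simple data structure. A minimal sketch (the dict layout and component names are illustrative, not part of the original materials): each logic-model component is stored with its indicators, so every indicator has one clear definition that can be applied the same way over time, and components without any indicator can be flagged.

```python
# Illustrative structure: logic-model components mapped to their data indicators.
logic_model = {
    "inputs": {
        "trainings": ["types", "how often", "types of trainees"],
        "materials": ["types", "quantity"],
        "staff": ["types", "# staff"],
    },
    "outputs": {
        "trainings conducted": ["types", "# trainings"],
        "officers trained": ["types & # trained", "% officers trained per PD"],
        "calls getting CIT officer response": ["# calls w/ 1+ CIT officers responding"],
    },
    "outcomes": {
        "knowledge of mental illness": ["test score", "rating"],
        "diverting from criminal justice": ["types & # arrests"],
        "safety in field": ["# injuries", "perceived safety reported by officers"],
    },
}

# Quick completeness check: flag any component that has no indicator yet.
missing = [name
           for section in logic_model.values()
           for name, indicators in section.items()
           if not indicators]
print("Components missing indicators:", missing)
```

A check like this is useful before data collection begins: every box in the logic model should already have at least one agreed-upon indicator.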

Common Data Sources and Tips
Surveys
- Good when you need data from multiple people.
- The shorter, the better.
- Use an 8th-grade reading level for the general population.
- Avoid open-ended questions.
Interviews
- Good for collecting detailed information, asking follow-up questions, and/or gathering sensitive data.

Common Data Sources and Tips – cont'd.
Focus Groups
- Good for getting preliminary/exploratory information.
- The smaller the group, the better the participation.
Documents
- E.g., incident reports, agency records, intake forms.
- Easy to incorporate data collection into existing forms.
- Data coding can be time intensive, depending on the document.
Online Data
- Good for county/state-level data.

Looking at Outcomes
Pre-Post Comparisons
- Look for change in an outcome from before to after program participation.
Treatment vs. Comparison Group
- Look at differences between those in the program (or new treatment group) versus those not in the program (or new treatment group).
- People in both groups should be similar, except for the group to which they belong.

After Evaluation Results
1. Celebrate your strengths! Get the word out to your audience(s).
2. Identify outcomes showing limited or no improvement.
3. Determine possible reason(s) for "less than ideal" outcomes. Note: the Outputs section of the logic model can shed light here.
4. Make needed changes to your program operation.
5. Continue evaluating.

Example of Training Evaluation: Instructor Feedback
Class: Overview of Mental Health Disorders
Instructor: Person A

% of participants (scale: Not at all / A little / Somewhat / Very / Extremely):
- How clear was the information provided by the instructor? 11.6 / 48.8 / 39.5
- How well did the instructor tailor the material for the needs of the audience? 9.3 / 44.2 / 37.2
- How useful was the information for your job? 18.6 / 46.5 / 34.9

How could the topic/class have been improved? Direct quotes:
- "I would like to have actual documentation (handouts) made for each student. This includes all speakers or most speakers. Just highlights that could be carried in car. Info & resources."
- "Have more specific objectives – and stay within those objectives."
- "Less theoretical science, more relevant application."
- "More signs and symptoms of each disease or disorder."
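Percent distributions like the ones in this feedback table can be computed directly from raw survey responses. A minimal sketch with made-up responses (the response list and scale labels are illustrative):

```python
from collections import Counter

SCALE = ["Not at all", "A little", "Somewhat", "Very", "Extremely"]

# Hypothetical responses to one feedback question (illustration only).
responses = ["Somewhat", "Very", "Very", "Extremely", "Very",
             "Somewhat", "Extremely", "Very", "Extremely", "Very"]

counts = Counter(responses)
total = len(responses)

# Percent of participants choosing each rating, reported in scale order.
percents = {rating: round(100 * counts[rating] / total, 1) for rating in SCALE}
print(percents)
```

Reporting every category in scale order, including zeros, avoids the ambiguity that arises when only the nonzero cells of a table survive.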

Questions, Consultations, Etc.: Kimberlee C. Murphy, Ph.D. Resource Development Institute 222 W. Gregory Blvd., G-2, Kansas City, MO 64114 Mailing address: P.O. Box 10163, Kansas City, MO 64171 murphyk@rdikc.org 816-221-5000, x3