
1 Establishing Ongoing Evaluation Activity as a Management Tool for CIT Program Operation
Kimberlee C. Murphy, Ph.D.
Resource Development Institute

2 Learning Objectives
You will be able to:
1. Identify areas needing assessment.
2. Identify data indicators needed to assess program results.
3. Assess overall effectiveness/outcomes of your CIT program.
4. Understand where evaluation fits into program operation and improvement.

3 Evaluation Versus Research
Used to …
  Evaluation: determine effectiveness or worth
  Research: advance knowledge in the field
Inquiry based on …
  Evaluation: policy/program interests of stakeholders
  Research: intellectual curiosity
Information for …
  Evaluation: program monitoring, quality improvement efforts
  Research: broad knowledge/theory
Conducted within …
  Evaluation: changing setting, actors, priorities, timelines, etc.
  Research: controlled setting (or an attempt at a "controlled" setting)

4 How Evaluation Fits Into Program Operation
[Diagram: Program Evaluation and Program Revision as stages in program operation]

5 Why Spend Time Evaluating?
1. Funders often require it.
2. You get an objective view of what you're doing that works well, which gives you evidence as you work to get "buy-in" and support for your program.
3. It helps you identify what areas need to be improved for quality improvement efforts.
4. With data showing effectiveness, you increase agency sustainability by making the program more competitive for grant and other funding.

6 Two Questions to Guide the Evaluation
What do you want to tell an audience about the outcomes of your work?
- Goals achieved
- Other results
What do you need to know to monitor program operation?
- Activities carried out: types, how many, how often
- Resources used, including the players involved
- People served: description of whom, how many

7 Logic Model
Inputs: Resources, Materials, Equipment, Staff, Stakeholders
Outputs: Trainings, Products, Services, People trained, People served
Outcomes
- Initial (Learning): Knowledge, Skills, Attitudes, Opinions
- Intermediate (Action): Behaviors, Practices, Social actions
- Long-term (Conditions): Social, Economic, Civic, Environmental
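In code terms, a logic model is just a set of labeled lists. The sketch below is not part of the original slides; it is one hypothetical way to record the generic model above as a small Python data structure so its elements can be reused in later reporting.

```python
# A minimal sketch (illustrative only) of a logic model as plain data.
from dataclasses import dataclass


@dataclass
class LogicModel:
    inputs: list[str]                 # resources, materials, equipment, staff, stakeholders
    outputs: list[str]                # trainings, products, services, people trained/served
    outcomes_initial: list[str]       # learning: knowledge, skills, attitudes, opinions
    outcomes_intermediate: list[str]  # action: behaviors, practices, social actions
    outcomes_long_term: list[str]     # conditions: social, economic, civic, environmental


generic_model = LogicModel(
    inputs=["Resources", "Materials", "Equipment", "Staff", "Stakeholders"],
    outputs=["Trainings", "Products", "Services", "People trained", "People served"],
    outcomes_initial=["Knowledge", "Skills", "Attitudes", "Opinions"],
    outcomes_intermediate=["Behaviors", "Practices", "Social actions"],
    outcomes_long_term=["Social", "Economic", "Civic", "Environmental"],
)
print(generic_model.outcomes_initial)
```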

8 Telling an Audience about the Outcomes
As a result of our CIT program …
1. Police officers are more knowledgeable about mental illness and how to work with individuals with mental illness in crisis.
2. Citizens with mental illness get diverted from the criminal justice system into needed community services.
3. Citizens and law enforcement officers are safer in the field.

9 Logic Model
Outcomes
- Initial (Learning): knowledge of mental illness, knowledge of community resources, skill in resolving without restraint
- Intermediate (Action): diverting from criminal justice, connection to services
- Long-term (Conditions): safety in the field

10 Monitoring Program Operation
To monitor our CIT program, we need to know …
1. What types of trainings are being offered, and how many trainings are being offered?
2. What resources are we using to train (e.g., materials, staff), and what resources do we need that we do not currently have?
3. What types of people are being trained by our program, and how many are being trained?

11 Logic Model
Inputs: trainings for police officers, training materials, staff to coordinate, teachers
Outputs: trainings conducted, officers trained, calls getting a CIT officer response
Outcomes
- Initial (Learning): knowledge of mental illness, knowledge of community resources, skill in resolving without restraint
- Intermediate (Action): diverting from criminal justice, connection to services
- Long-term (Conditions): safety in the field

12 Characteristics of Good Data Indicators
1. Clear definition of what is being measured and how it is coded.
2. Measured the same way across time and across coders.
3. Can show change over time.
4. Does not reach beyond the scope of the program.
5. Data are readily available.
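As an illustration of points 1 and 2, the hypothetical Python sketch below keeps an indicator's definition and coding rule in one place so it is applied the same way across time and coders. Every field name and category label here is an assumption for illustration, not something from the presentation.

```python
# A hypothetical "clearly defined" indicator: the coding rule lives in one place.
DISPOSITION_CODES = {
    "transported to mental health services": "diverted",
    "referred to community provider": "diverted",
    "arrested": "arrest",
    "no action": "other",
}


def code_disposition(raw_disposition: str) -> str:
    """Map a free-text call disposition to a fixed category."""
    return DISPOSITION_CODES.get(raw_disposition.strip().lower(), "other")


# The same rule applied by any coder, in any year, yields the same category.
assert code_disposition("Arrested") == "arrest"
assert code_disposition("Referred to community provider") == "diverted"
```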

13 Logic Model
Inputs
- Trainings: types, how often, types of trainees
- Materials: types, quantity
- Staff: types, # staff
Outputs
- Trainings conducted: types, # trainings
- Officers trained: types & # trained, # PDs with trained officers, % officers trained per PD
- Calls getting CIT officer response: # calls w/ 1+ CIT officers responding
Outcomes
- Initial (Learning)
  - Knowledge of mental illness: test score, rating
  - Knowledge of community resources: # trainees w/ resource list
  - Skill in resolving w/o restraint: rating scale of skill level
- Intermediate (Action)
  - Diverting from criminal justice: types & # arrests
  - Connection to services: # people starting MH service(s)
- Long-term (Conditions)
  - Safety in field: # injuries; perceived safety reported by officers
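To make the indicator arithmetic concrete, here is a hedged sketch that computes two of the output indicators above ("% officers trained per PD" and "# calls w/ 1+ CIT officers responding") from made-up records. The record layouts, department names, and numbers are assumptions, not data from the presentation.

```python
# Hypothetical roster: department -> (officers trained, total officers)
rosters = {
    "PD North": (24, 120),  # 20% trained
    "PD South": (9, 60),    # 15% trained
}

# Hypothetical call log: each call lists the responding officers' CIT status.
calls = [
    {"call_id": 1, "responders_cit_trained": [True, False]},
    {"call_id": 2, "responders_cit_trained": [False]},
    {"call_id": 3, "responders_cit_trained": [True]},
]

# Indicator: % of officers trained, per police department.
pct_trained_per_pd = {
    pd: 100 * trained / total for pd, (trained, total) in rosters.items()
}

# Indicator: number of calls with at least one CIT-trained officer responding.
calls_with_cit_response = sum(
    1 for call in calls if any(call["responders_cit_trained"])
)

print(pct_trained_per_pd)       # {'PD North': 20.0, 'PD South': 15.0}
print(calls_with_cit_response)  # 2
```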

14

15 Common Data Sources and Tips
Surveys
- Good when needing data from multiple people.
- The shorter the better.
- Aim for an 8th-grade reading level for the general population.
- Avoid open-ended questions.
Interviews
- Good for getting detailed information, asking follow-up questions, and/or collecting sensitive data.

16 Common Data Sources and Tips – cont'd.
Focus Groups
- Good for getting preliminary/exploratory information.
- The smaller the group, the better the participation.
Documents
- E.g., incident reports, agency records, intake forms.
- Easy to incorporate data collection into existing forms.
- Data coding can be time intensive, depending on the document.
Online Data
- Good for county/state-level data.

17 Looking at Outcomes
Pre-Post Comparisons
- Look for change in an outcome from before to after program participation.
Treatment vs. Comparison Group
- Look at differences between those in the program or new treatment group versus those not in the program or new treatment group.
- People in both groups should be similar, except for the group to which they belong.
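The arithmetic behind both comparisons is simple; the sketch below works through it with invented knowledge-test scores. It shows only the mean change and mean difference; a real evaluation would also weigh sample size and statistical significance.

```python
from statistics import mean

# Pre-post comparison: the same officers measured before and after CIT training.
pre_scores = [62, 70, 55, 68]
post_scores = [81, 85, 74, 79]
mean_change = mean(post - pre for pre, post in zip(pre_scores, post_scores))
print(f"Average pre-to-post change: {mean_change:.1f} points")

# Treatment vs. comparison group: trained officers vs. similar untrained officers.
trained_group = [81, 85, 74, 79]
comparison_group = [63, 71, 58, 66]
group_difference = mean(trained_group) - mean(comparison_group)
print(f"Trained vs. comparison difference: {group_difference:.1f} points")
```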

18 After Evaluation Results
1. Celebrate your strengths! Get the word out to your audience(s).
2. Identify outcomes showing limited or no improvement.
3. Determine possible reason(s) for "less than ideal" outcomes. Note: the Outputs section of the logic model can shed light here.
4. Make needed changes to your program operation.
5. Continue evaluating.

19 Example of Training Evaluation - Instructor Feedback
Class: Overview of Mental Health Disorders
Instructor: Person A
% Participants (rating scale: Not at all, A little, Somewhat, Very, Extremely)
- How clear was the information provided by the instructor? 11.6, 48.8, 39.5
- How well did the instructor tailor the material for the needs of the audience? 9.3, 44.2, 37.2
- How useful was the information for your job? 18.6, 46.5, 34.9
How could the topic/class have been improved? Direct quotes:
- "I would like to have actual documentation (handouts) made for each student. This includes all speakers or most speakers. Just highlights that could be carried in car. Info & resources."
- "Have more specific objectives – and stay within those objectives."
- "Less theoretical science, more relevant application."
- "More signs and symptoms of each disease or disorder."
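The percentages in a table like this come from tallying individual responses. The sketch below shows one way to do that tally; the responses are invented, and only the five scale labels are taken from the slide.

```python
from collections import Counter

SCALE = ["Not at all", "A little", "Somewhat", "Very", "Extremely"]

# Hypothetical responses to one feedback question.
responses = ["Somewhat", "Very", "Very", "Extremely", "Very", "Somewhat", "Extremely"]

counts = Counter(responses)
percentages = {label: 100 * counts.get(label, 0) / len(responses) for label in SCALE}

for label in SCALE:
    print(f"{label}: {percentages[label]:.1f}%")
```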

20

21 Questions, Consultations, Etc.:
Kimberlee C. Murphy, Ph.D.
Resource Development Institute
222 W. Gregory Blvd., G-2, Kansas City, MO
Mailing address: P.O. Box 10163, Kansas City, MO 64171

