Presentation transcript:

8 Steps of Measuring Program Outcomes

STEP 1: GET READY
Task 1: Assemble and orient an outcome measurement workgroup. (Workgroup: ________)
Task 2: Decide which programs to start with. (Program: ________)
Task 3: Develop a timeline.
Task 4: Distribute your game plan to key players. (Communication plan: ________)
Logic model columns: Inputs → Activities → Outputs.

STEP 2: CHOOSE THE OUTCOMES YOU WANT TO MEASURE

Common action verbs used for outcome statements:
Initial and intermediate outcomes: Increased, Improved, Reduction, Gain, Create, Admitted, Receive, See, Develop, Attract, Begin, Expand, Enhance, Client Satisfaction, Enroll, Participate, Recruit, Identify, Classify, Attend, Report, Meet, Follow, Sign Up, Utilize.
Long-term outcomes: Are Met, Satisfied, Maintain, Improve, Increased, Achieve, Sustain, Client Satisfaction, Continued, Established, Graduate, Moved to Next Level, Avoid, Retain, Reduce, Positive Response, Implement.

Common Framework of Outcomes (Urban Institute, "Building a Common Outcomes Framework to Measure Nonprofit Performance")

Knowledge / Learning / Attitude
a) Skills (knowledge, learning). Common indicators: % increase in scores after attending; % who believe skills were increased after attending; % increase in knowledge (before/after program).
b) Attitude. Common indicators: % improvement reported by a parent, teacher, co-worker, or other; % improvement as reported by the participant.
c) Readiness (qualification). Common indicators: % feeling well-prepared for a particular task/undertaking; % meeting minimum qualifications for the next level/undertaking.

Behavior
a) Incidence of undesirable behavior. Common indicators: incidence rate; relapse/recidivism rate; % reduction in reported behavior frequency.
b) Incidence of desirable activity. Common indicators: success rate; % who achieve their goal; rate of improvement.
c) Maintenance of new behavior. Common indicators: number of weeks/months/years continued; % change over time; % moving to the next level/condition/status; % who do not reenter the program/system.

Condition / Status
a) Participant social status. Common indicators: % with improved relationships; % who graduate; % who move to the next level/condition/status; % who maintain their current level/condition/status; % who avoid an undesirable course of action/behavior.
b) Participant economic condition. Common indicators: % who establish a career/employment; % who retain employment; % with increased earnings; % who move to long-term housing; % who maintain safe and permanent housing; % enrolled in education programs.
c) Participant health condition. Common indicators: % with reduced incidence of a health problem; % with an immediate positive response; % reporting a positive response 90 days later.
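Many of these indicators are simple percentages computed over participant records. Below is a minimal sketch in Python, assuming paired pre/post test scores and a hypothetical qualification threshold of 70; the data are invented for illustration.

```python
# Minimal sketch of two common indicator calculations from the framework
# above; scores and the threshold are hypothetical.

def pct_increase_in_scores(pre: list[float], post: list[float]) -> float:
    """% of participants whose score increased after attending."""
    improved = sum(1 for before, after in zip(pre, post) if after > before)
    return 100.0 * improved / len(pre)

def pct_meeting_qualification(scores: list[float], minimum: float) -> float:
    """% meeting a minimum qualification for the next level/undertaking."""
    return 100.0 * sum(1 for s in scores if s >= minimum) / len(scores)

# Hypothetical before/after knowledge-test scores for six participants.
pre = [52, 61, 70, 45, 80, 66]
post = [68, 59, 85, 60, 88, 71]
print(f"{pct_increase_in_scores(pre, post):.0f}% increased their scores")  # 83%
print(f"{pct_meeting_qualification(post, 70):.0f}% met the minimum")       # 50%
```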

STEP 2 (continued): CHOOSE THE OUTCOMES YOU WANT TO MEASURE

A complete outcome statement names four things: who/what (the target subject), the change desired (an action verb), in what (the expected results), and by when.

Initial outcome: Imagine your client in the program, or the day after leaving it. What new knowledge, attitudes, and skills are seen?
Intermediate outcome: Imagine your client 3-9 months after leaving the program. What improved attitudes or behaviors are seen?
Long-term outcome: Imagine your client 6+ months after leaving the program. How has their condition or status improved? This is the ideal, ultimate goal.
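The four-part statement maps naturally onto a small data structure. A minimal sketch, with field names mirroring the slide and a wholly hypothetical example statement:

```python
# Sketch of the four-part outcome statement as a data structure; field
# names mirror the slide, and the example values are hypothetical.
from dataclasses import dataclass

@dataclass
class OutcomeStatement:
    target: str    # who/what (the target subject)
    change: str    # change desired (action verb)
    result: str    # in what (expected results)
    by_when: str   # by when
    horizon: str   # "initial", "intermediate", or "long-term"

    def __str__(self) -> str:
        return (f"{self.target} will {self.change} {self.result} "
                f"by {self.by_when} ({self.horizon})")

stmt = OutcomeStatement(
    target="program graduates",
    change="increase",
    result="their financial-literacy scores",
    by_when="program exit",
    horizon="initial",
)
print(stmt)
```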

EXAMPLE

STEP 3: SPECIFY INDICATORS FOR YOUR OUTCOMES
Refer to the Common Outcome and Performance Indicator Packet.

STEP 3 (continued): SPECIFY INDICATORS FOR YOUR OUTCOMES
Task 2: Decide what factors could influence participant outcomes (each reappears as a field in the record sketch below):
- Demographics (age, gender, education, income level, disability, single parent, ...)
- Level of difficulty (very difficult to help, moderate difficulty, minor difficulty)
- Level of involvement (high, moderate, or low participation)
- Organizational unit (if there is more than one service delivery facility)
- Service delivery (group session vs. one-on-one; live vs. taped)

STEP 4: PREPARE TO COLLECT DATA ON YOUR INDICATORS
Task 1: Identify data sources for your indicators.
Task 2: Design your data collection methods.
Task 3: Pretest your data collection instruments and procedures.
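To make Steps 3 and 4 concrete, here is a minimal sketch of a participant record that carries the breakout factors alongside pre/post indicator data. All field names and category values are illustrative assumptions, not a prescribed schema.

```python
# Sketch of a participant record capturing the breakout factors above so
# outcomes can later be analyzed by subgroup; every field is an assumption.
from dataclasses import dataclass
from typing import Optional

@dataclass
class ParticipantRecord:
    participant_id: str
    age: int
    gender: str
    education: str
    income_level: str
    difficulty: str    # "very difficult" | "moderate" | "minor"
    involvement: str   # "high" | "moderate" | "low"
    site: str          # organizational unit / facility
    delivery: str      # "group" or "one-on-one"; "live" or "taped"
    pre_score: Optional[float] = None   # indicator measured at entry
    post_score: Optional[float] = None  # indicator measured at exit
```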

STEP 5: TRY OUT YOUR OUTCOME MEASUREMENT
Task 1: Describe your trial strategy.
Task 2: Identify your data collectors and decide how you will train them on the instrument. (How often do you survey: pre and post? How long between surveys? How do you approach participants? Which demographics must be represented in the sample?)
Task 3: Track and collect outcome data: create a data collection spreadsheet (refer to the sample form on the next slide and the CSV sketch that follows it).

Sample Data Collection Form
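The sample form itself appears only as an image on the original slide. As a stand-in, here is a sketch that writes a data collection spreadsheet as a CSV file; the column names are assumptions modeled on the factors and indicators discussed earlier, and the single row is a hypothetical trial-run entry.

```python
# Sketch of a data collection spreadsheet written as a CSV so any
# spreadsheet program can open it; columns and the sample row are assumed.
import csv

COLUMNS = [
    "participant_id", "collector", "date_collected",
    "age", "gender", "income_level", "difficulty", "involvement",
    "site", "delivery", "pre_score", "post_score", "notes",
]

with open("outcome_data_collection.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=COLUMNS)
    writer.writeheader()
    writer.writerow({  # one hypothetical trial-run row
        "participant_id": "P001", "collector": "staff-a",
        "date_collected": "2024-03-01", "age": 34, "gender": "F",
        "income_level": "low", "difficulty": "moderate",
        "involvement": "high", "site": "main", "delivery": "group",
        "pre_score": 52, "post_score": 68, "notes": "",
    })
```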

STEP 6: ANALYZE AND REPORT YOUR FINDINGS
Task 1: Enter the data and check for errors.
Task 2: Tabulate the data.
Task 3: Analyze the data broken out by key characteristics (see the sketch after the tips below).
Task 4: Provide explanatory information related to your findings.
Task 5: Present your data in a clear and understandable form.

Tips for formatting your reports:
- Consider the needs of your audience: what information are they looking for?
- Keep it simple, and include a summary of major points.
- Don't crowd too much onto a page.
- Define unfamiliar terms and each outcome indicator.
- Highlight points of interest with bold type, circles, or arrows, and use color to draw attention to key findings.
- Label charts and tables clearly: titles, rows, columns, axes.
- Identify the source and date of the data, and note its limitations.
- Provide context (history or comparisons).
- Add variety to the presentation with bar or pie charts.
- Internal reports should be much more detailed than external ones.
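A minimal sketch of Tasks 1-3, assuming the CSV layout from the earlier spreadsheet sketch and that pandas is available; the error checks and the breakout variable are illustrative choices, not a prescribed analysis.

```python
# Sketch: load the collected data, check for errors, tabulate the headline
# indicator, and break it out by a key characteristic. Assumes the CSV
# written by the earlier sketch, filled with many rows of real data.
import pandas as pd

df = pd.read_csv("outcome_data_collection.csv")

# Task 1: basic error checks - duplicate IDs and out-of-range scores.
assert not df["participant_id"].duplicated().any(), "duplicate participant IDs"
bad = df[(df["post_score"] < 0) | (df["post_score"] > 100)]
if not bad.empty:
    print("Rows with out-of-range scores:\n", bad)

# Task 2: tabulate the headline indicator (share who improved).
df["improved"] = df["post_score"] > df["pre_score"]
print(f"Overall: {df['improved'].mean():.0%} of participants improved")

# Task 3: break the result out by a key characteristic.
print(df.groupby("involvement")["improved"].mean().map("{:.0%}".format))
```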

STEP 7: IMPROVE YOUR SYSTEM
Task 1: Review your trial-run experience, make necessary adjustments, and start full-scale implementation.
Workgroup review questions (answer yes/no; recorded in the sketch below):
- Did you get all of the data you needed?
- Did you measure what you intended to measure?
- Does what you measured still seem to represent important outcomes for which your program should be held accountable?
Task 2: Monitor and review your system periodically. Aspects to review:
- Data collection instruments
- Training of data collectors
- Data collection procedures
- Data entry procedures
- Time and cost of collecting and analyzing the data
- Procedures used during the trial run
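The review questions can also be kept as data so the workgroup's answers live alongside the trial-run results. The questions come from the slide; the recording format is an assumption.

```python
# Tiny sketch: pair each Task 1 review question with a yes/no answer.
REVIEW_QUESTIONS = [
    "Did you get all of the data you needed?",
    "Did you measure what you intended to measure?",
    "Does what you measured still represent important outcomes "
    "for which your program should be held accountable?",
]

def record_review(answers: list[bool]) -> dict[str, bool]:
    """Map each review question to the workgroup's yes/no answer."""
    return dict(zip(REVIEW_QUESTIONS, answers))

print(record_review([True, True, False]))
```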

STEP 8: USE YOUR FINDINGS
- Detect needed improvements
- Motivate staff, volunteers, and clients
- Use the findings in program planning
- Report to the board
- Report to funders
- Report to the community
Make your data pay off.