Looking at your program data

Presentation transcript:

Looking at your program data
Leading for Impact

What you're trying to learn
Existing program data can show how well a program is being implemented and whether it is producing your desired outcomes.

What you're trying to learn | Sample metrics
- How many clients is my program reaching? | # of clients enrolled
- Do some of our clients face barriers to success, or higher risks of not succeeding? | # of clients demonstrating all target risk factors
- How many clients are receiving the full, intended extent of my program? | # of clients completing the program
- What is common about clients who do not complete the program? | demographics/traits/behavior of clients not meeting targets
- Are clients seeing a difference with my program? | # of clients who achieved the intended outcome
- How many clients find the program useful and engaging? | average client tenure in months (or minutes, years, etc.)
- Do certain programs lead to better outcomes? | # of services or program aspects each client uses
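A minimal sketch of how a few of these metrics could be computed from a flat table of client records. The column names (client_id, enrolled_on, exited_on, completed, achieved_outcome) are hypothetical placeholders, not fields from any particular system; substitute whatever your own enrollment and exit forms actually capture.

```python
import pandas as pd

# Hypothetical client records; in practice this would be loaded from your
# enrollment/exit forms, e.g. pd.read_csv("clients.csv").
clients = pd.DataFrame({
    "client_id": [1, 2, 3, 4, 5],
    "enrolled_on": pd.to_datetime(
        ["2023-01-10", "2023-02-01", "2023-02-15", "2023-03-05", "2023-04-20"]),
    "exited_on": pd.to_datetime(
        ["2023-06-10", "2023-03-01", None, "2023-09-05", None]),  # None = still enrolled
    "completed": [True, False, True, True, False],
    "achieved_outcome": [True, False, True, False, False],
})

# How many clients is my program reaching?
n_enrolled = len(clients)

# How many clients are receiving the full, intended extent of my program?
n_completed = clients["completed"].sum()

# Are clients seeing a difference with my program?
n_outcome = clients["achieved_outcome"].sum()

# How many clients find the program useful and engaging?
# (average tenure; open enrollments are measured up to today)
tenure = clients["exited_on"].fillna(pd.Timestamp.today()) - clients["enrolled_on"]
avg_tenure_months = (tenure.dt.days / 30.44).mean()

print(f"Enrolled: {n_enrolled}")
print(f"Completed the program: {n_completed} ({n_completed / n_enrolled:.0%})")
print(f"Achieved intended outcome: {n_outcome} ({n_outcome / n_enrolled:.0%})")
print(f"Average tenure: {avg_tenure_months:.1f} months")
```

The remaining rows follow the same pattern: group by demographic columns to profile clients who do not complete, or count the services each client uses to compare program aspects.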

Your organization might already have access to more useful program data than you think…

Enrollment or exit forms. Did you collect any information about clients when they first joined, or when they decided to leave, your program? These forms might contain useful demographic and outcome-related information on your clients.

Attendance records. Do your staff keep track of how many clients show up at sessions, or how many sessions each client attends? These can be useful in assessing how effectively you are reaching your clients.

Program guidelines and staff forms. Are there certain guidelines you provide to staff members who implement your program? Do staff record progress or notes about program sessions? These can give you a better idea of how your programs are actually delivered.

Meeting notes and board updates. Do you have notes or summaries of meetings or sessions with information related to your programs and clients? You might find useful anecdotal information that your organization gathered informally in the past.

External baseline/outcome data. How can you tell that your clients have seen a change due to your program? Is there easily accessible data on the baseline and intended outcomes of your typical client? You can compare your program results to external data to gauge your relative and absolute performance (see the sketch below).

…and if not, you can always conduct interviews and surveys to collect relevant information about your programs and clients.
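As an illustration of the external-baseline point above, a small sketch comparing a program's outcome rate to an external benchmark. The 42% baseline and the column names are made-up placeholders; a real benchmark would come from published county, state, or national data on your typical client population.

```python
import pandas as pd

# Hypothetical exit-form data: one row per client who left the program.
exits = pd.DataFrame({
    "client_id": [1, 2, 3, 4, 5, 6],
    "achieved_outcome": [True, False, True, True, False, True],
})

# Placeholder external baseline: the outcome rate you would expect without
# the program (e.g., from published county or national statistics).
EXTERNAL_BASELINE = 0.42

program_rate = exits["achieved_outcome"].mean()
print(f"Program outcome rate: {program_rate:.0%}")                       # absolute performance
print(f"External baseline:    {EXTERNAL_BASELINE:.0%}")
print(f"Difference:           {program_rate - EXTERNAL_BASELINE:+.0%}")  # relative performance
```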