Community program evaluation school

Presentation transcript:

Community Program Evaluation School Final Review: Ensuring Use and Sharing Lessons Learned. May 19, 2016.

REVIEW TOPICS: Evaluation Types/Models, Logic Models, Sample Size/Sampling, Reporting, Others??
We're going to touch on some of the issues you raised last week, when some said they wanted more information about sampling, sample sizes, and the quality of data. Then we'll look a little at the basic steps involved in analyzing quantitative and qualitative data, doing some practice along the way. We'll also review the difference between quantitative and qualitative data, with examples.

Evaluation Activities by Program Stage
Program Design & Initiation: Needs Assessment, Logic Model/Theory of Change, Formative Evaluation
Mid-course Program Corrections: Process Evaluation, Implementation Evaluation
Steady State Program Implementation: Outcome Evaluation
(Formative vs. Summative; Simple & Complicated Programs)

Quantitative surveys
Since you usually cannot interview the whole target group, you have to create a sample (a smaller group drawn from the whole). Ideally your sample is a random sample. In statistics, a simple random sample is a group of individuals (a sample) chosen from a larger population; each individual is chosen randomly and entirely by chance.
Asking all your friends in RP is NOT a random sample.
Asking everyone who comes to the TD centre is NOT a random sample.
Asking everyone who goes to the pool is NOT a random sample.
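As a rough illustration of what "chosen entirely by chance" means in practice (not part of the original slides), here is a minimal Python sketch of drawing a simple random sample; the household list and the sample size of 50 are invented placeholders standing in for a real sampling frame.

```python
import random

# Hypothetical sampling frame: every household in the target area.
# In practice this list would come from a housing roster or similar source.
households = [f"Household {i}" for i in range(1, 501)]

# Draw a simple random sample of 50 households: each household has the
# same chance of being chosen, and the choice is made entirely by chance.
sample = random.sample(households, k=50)
print(sample[:5])
```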

Quantitative surveys
Some ideas for (easy) random samples. For a random sample of RP residents:
Random digit dialing.
Randomly choose households: randomly pick blocks, then on those blocks randomly pick houses.
In one study, we randomly chose a first house, then sampled every 4 houses after that. On each block we interviewed at most 10 houses, or stopped when we reached the end of the block.
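A small Python sketch of that block-based approach, added here purely for illustration: the block and house counts are hypothetical, and the step of 4 and cap of 10 follow the study described above.

```python
import random

# Hypothetical area map: each block holds a list of house addresses.
blocks = {f"Block {b}": [f"Block {b}, House {h}" for h in range(1, 41)]
          for b in range(1, 11)}

def sample_block(houses, step=4, max_interviews=10):
    # Randomly choose a first house, then take every `step`-th house after it,
    # stopping after `max_interviews` houses or at the end of the block.
    start = random.randrange(len(houses))
    return houses[start::step][:max_interviews]

# Randomly pick a few blocks, then sample houses within each one.
for block in random.sample(list(blocks), k=3):
    print(block, sample_block(blocks[block]))
```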

Quantitative surveys
How many people do I need to approach?
You can do a formal sample size calculation, or you can informally estimate that you need at least 10 persons per variable in your data set.
Example: Have swimming skills of RP residents increased? Swim skills (yes/no or degree of skill: 4-5 variables), background such as prior swimming lessons or experience (3-4 variables), learned to swim at the RP pool (3-4 variables), age (1), gender (1), confidence about the pool (1). That is about 15 variables, so you need 150 persons minimum.
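To make the arithmetic explicit, here is a tiny Python sketch of that rule of thumb (added for illustration; the variable counts simply restate the swimming example above).

```python
# Rule of thumb: roughly 10 respondents per variable in the data set.
variables = {
    "swim skills (yes/no or degree of skill)": 5,
    "swimming background (prior lessons/experience)": 4,
    "learned to swim at the RP pool": 3,
    "age": 1,
    "gender": 1,
    "confidence about the pool": 1,
}

total_variables = sum(variables.values())   # 15 variables
minimum_sample = 10 * total_variables       # at least 150 respondents
print(total_variables, minimum_sample)
```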

Evaluation Logic Model
Inputs → Outputs (Activities, Participation) → Outcomes/Impacts (Short, Medium, Long)
Evaluation Questions – Planning; Evaluation Questions – Process; Evaluation Questions – Outcomes
For example… Indicators

Review where we are; note that this step is called "Justify Conclusions." As central as data analysis is to evaluation, evaluators know that the evidence gathered for an evaluation does not necessarily speak for itself. Conclusions become justified when the analyzed findings ("the evidence") are interpreted through the prism of the values and standards that stakeholders bring, and judged accordingly. Justification of conclusions is fundamental to utilization-focused evaluation: when agencies, communities, and other stakeholders agree that the conclusions are justified, they will be more inclined to use the evaluation results for program improvement.

Ensure Use of Evaluation Findings
Don't wait until the end! Build it into every stage. Be collaborative and participatory. Dissemination: consider your audience. Full disclosure and impartial reporting.
Don't wait until the end: involve the stakeholders during the planning stages to think through how potential findings will influence decision-making and planning. Build it into every stage of the evaluation process: during every stage, think about who will benefit from the evaluation findings and how best to communicate them. For example, you may want to conduct regular meetings with stakeholders to share findings in real time, or, if that's not feasible, send regular notices/newsletters about what's been learned. This keeps everybody engaged and focused, and most apt to use the findings. The more collaborative and participatory you are in the evaluation process, the more apt the findings are to be used. When you're figuring out how to disseminate the findings, you need to consider your audience: match the timing, style, tone, and format of your findings to the audience. For example…

PRACTICE: Match Stakeholders to Format
Evaluation Course Funders (University of Toronto, Learning Centre)
Course Participants
Regent Park Community

Your turn
Pool Staff (e.g. lifeguards, swimming instructors)
Pool Users (e.g. families, seniors)
Regent Park Community

Evaluation Report – Sample Outline
Executive Summary
Background and Purpose
Evaluation Methods
Results
Discussion and Recommendations

Executive Summary: a 1-page summary with the most important findings (for the audience) and lessons learned. Not technical or jargony; be brief, clear, and impartial.
Background and Purpose: program background, evaluation rationale, stakeholder identification and engagement, program description, key evaluation questions/focus.
Evaluation Methods: design, sampling procedures, measures/performance indicators, data collection and processing procedures, analysis, limitations.
Results.
Discussion and Recommendations.