Evaluation 101 …a focus on Programs
Rebecca McQuaid | www.rmcqconsult.com | rebecca.mcquaid@gmail.com

About me… Having an “evaluation brain” means:
- Being curious!
- Asking questions
- Paying attention
- Reflecting
- Challenging our beliefs/assumptions

About you…
- What do you want to take away?
- What do you bring to the table? Experience with eval…
Exercise: Interests & Challenges

Eval 101: Backgrounder/Refresher?
This presentation WILL address: Program Evaluation (why, what and when… and some of the how). It skims the surface.
This presentation will NOT address: Developmental Evaluation; Reporting (e.g., how to “tell the story”); Capacity (time, money, staff).

Why? EVALUATE … & INCREASE IMPACT!
*Source: Evaluation for Organizational Learning: Basic Concepts and Practical Tools (Learning for Action Group)

At its heart, evaluation answers:
1) What is working?
2) What is not?
3) What difference is being made?
It helps determine:
- How a program is working
- How successful a program is in achieving the desired outcomes
- The impact a program has on the target group
- Level of satisfaction with the program
- Program strengths and weaknesses
- Cost effectiveness and efficiency of operation

Program Evaluation summarizes:
- Goals & Objectives (the “Why”)
- Activities (the “What”)
- Anticipated Outcomes
- Actual Outcomes
- Conclusions (for learning!)

It can tell us… Is a program working?
- Achieving outcomes
- Level of satisfaction
- Strengths & weaknesses
- Cost effectiveness
- Efficiency
- Impact on intended audience

Types of Eval: Formative
When? Begins during development and continues throughout the program.
How? Needs Assessments, Process Evaluation.

Needs Assessment
- Takes place before the program
- Is there a need? What is it?
- First step in planning/design
- One-time analysis

Process Evaluation
- Takes place during the program
- Assesses how program outcome(s) are achieved
- Collects data to improve the program for efficiency & results

Helps determine:
- Are processes used to achieve goals and objectives effective?
- Was the program carried out as planned?
- What worked? Improvements?
- Participant satisfaction with the program

Types of Eval: Summative
When? Done at the end of the program. Focuses on success & effectiveness of the program in reaching the stated objectives.
How? Outcomes-Based Evaluation.

Outcomes-Based Eval
- Were (pre-determined) outcomes met?
- Takes place after implementation
Two important points…

Helps determine:
- Impact on intended audience/population?
- Success in achieving outcomes?
- Is there evidence to support continuation? Expansion?

PSA: Timing matters! “It’s much more effective to build evaluation in up front during the program planning process – this ensures that you gather the right data at the right time”

Types of Eval: Which one?
Formative: Needs Assessments, Process Evaluation
Summative: Outcomes-Based Evaluation
Consider: type of program, purpose of eval, resource availability (people, $$), program timeframe

Ready to Evaluate? Create a Program Profile:
- Goals and Objectives
- Inputs and resources
- Program activities
- Target audience/populations
Tools: Program Profile, Logic Model

Logic Model
- Visual of how a program (theoretically) works
- Shows the cause-and-effect relationship between activities and outcomes
- Reveals any problems or gaps in the program… is it evaluable?

Logic Model
- For: individual programs
- Specific detail: Activities, Outcomes
- An implementation tool

Theory of Change
- For: organizations, initiatives
- High-level overview: Strategies & Outcomes
- A guiding/communication tool

Common logic model mistakes:
- Activities are not action-oriented or specific enough to really describe what is being done.
- Relationships between activities and outcomes are unclear.
- Outcomes are not specific or measurable.
- Outputs and outcomes are confused.**
** Outputs: concrete/quantifiable results of project activities, e.g., number of people that attended a workshop, reports, training tools developed, number of promotional activities carried out, etc. Outcomes: actual impacts/changes as a result of the project.
*Source: Evaluation for Organizational Learning: Basic Concepts and Practical Tools (Learning for Action Group)

Overarching goal: To decrease the rate of unemployment among young adults (aged 18-24) in the City of Kingston.
- E.g., Create posters for college & university campuses advertising career counselling program services
- E.g., Two part-time staff with dedicated hours to young adult employment counselling
- E.g., Reduce avg. amount of time young adults spend in career counselling programs by 4 months
- E.g., Increase 1-on-1 career advising sessions with young adult clients to at least twice a week
- Decreased rate of unemployment among young adults in Kingston
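Purely as an illustrative sketch of my own (not part of the workshop materials): the Kingston example can be laid out as logic model columns in a few lines of Python, which makes gaps in the chain easy to spot. Which item belongs in which column, and the output rows, are my assumptions.

    # Illustrative sketch only: the Kingston example as logic model columns.
    # Column assignments for each "E.g." item are my own reading.
    logic_model = {
        "goal": "Decrease the rate of unemployment among young adults "
                "(aged 18-24) in the City of Kingston",
        "inputs": [
            "Two part-time staff with dedicated hours to young adult "
            "employment counselling",
        ],
        "activities": [
            "Create posters for college & university campuses advertising "
            "career counselling program services",
            "Hold 1-on-1 career advising sessions with young adult clients",
        ],
        "outputs": [  # concrete/quantifiable results of activities
            "Number of posters distributed",
            "Advising sessions held (target: at least twice a week)",
        ],
        "outcomes": [  # actual impacts/changes from the program
            "Reduce avg. time young adults spend in career counselling "
            "programs by 4 months",
            "Decreased rate of unemployment among young adults in Kingston",
        ],
    }

    # Evaluability check: every column should have content, and the chain
    # should read left to right (inputs -> activities -> outputs -> outcomes).
    for column in ("goal", "inputs", "activities", "outputs", "outcomes"):
        assert logic_model[column], f"Logic model gap: '{column}' is empty"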

(Short term) Outcomes come from OBJECTIVES
- Changes that occur as a direct result of the program activities/strategies over the short term
- Developed from the program objectives
E.g., for this workshop:
- Appreciate the role of evaluation in organizational learning
- Understand different evaluation approaches, what they are for & when they are used
- Use a Logic Model template to begin to organize a “picture” of your program
Note: Objectives would be phrased: “To appreciate the role of evaluation in organizational learning…”

(Long term) Outcome comes from the GOAL
- Broader-level changes that the program is working toward and hopes to achieve/contribute to over the long term
- Developed from the program goals
E.g., Develop capacity for participants in “Evaluation 101” workshops to plan theory-based program evaluations

Evaluation Framework outlines:
- What will be monitored
- How success will be measured
- Who/where data will be collected from
- How data will be collected
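As an illustration only (the rows, field names, and the 80% satisfaction target below are hypothetical, not from the slides), a couple of framework rows for the Kingston example might look like:

    # Hypothetical sketch of evaluation framework rows. Fields mirror the
    # slide: what is monitored, how success is measured, who/where the data
    # comes from, and how it is collected.
    framework = [
        {
            "monitor": "Uptake of 1-on-1 career advising sessions",
            "measure": "Sessions held at least twice a week per client",
            "source": "Program staff session logs",
            "method": "Tracking/monitoring",
        },
        {
            "monitor": "Participant satisfaction with the program",
            "measure": "e.g., 80% of clients rate the program 4/5 or higher",
            "source": "Young adult clients",
            "method": "Surveys",
        },
    ]

    for row in framework:
        print(f"Monitor: {row['monitor']}")
        print(f"  Measure: {row['measure']}")
        print(f"  Source:  {row['source']} (via {row['method']})")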

Data Collection Methods depend on the type of data that needs to be collected and who data is being collected from.
Common ones: document review, focus groups, interviews, surveys, observation, tracking/monitoring

Eval Framework (cont’d)
- Data collection tools are derived from the evaluation framework
- Develop a data collection plan: Where will you get information? When will it be collected? From whom? How many people? Etc.
- Consistency = reliability
- Analyze data & interpret results
- Share results

Evaluation 101 …a focus on Programs
Rebecca McQuaid | www.rmcqconsult.com | rebecca.mcquaid@gmail.com