Program Evaluation Essentials-- Part 2
Brad Rose, PhD
Program Development & Funding
Data Collection & Outcome Measurement
Feedback & Continuous Improvement
Impact & Assessment Reporting
Summary of Part I: Approaching an Evaluation
General questions to consider when conducting a program evaluation:
Why is the evaluation being conducted?
What is the "it" that is being evaluated?
What are the intended outcomes of the program? What is the program meant to achieve?
What are the key questions to be answered?
Who is the evaluation for?
Summary of Part I: Approaching an Evaluation (cont.)
Which information will stakeholders find useful?
How will evaluation findings be used?
What will be the “product”/deliverable of the evaluation?
What are the potential sources of information/data?
What is the optimal design for the evaluation?
Which methods will yield the most valid, accurate, and persuasive evaluation findings and conclusions?
Part A. Describe your program. Write a one- or two-paragraph description. Be sure to consider what will and will not be evaluated. Identify key stakeholders for the evaluation (stakeholder analysis).
Describe What is Being Evaluated
Who does what to whom, and when do they do it? (the four Ws)
Who (which people and/or which positions) is carrying out the program?
Who (or what) are they working to change? Describe who benefits and what the benefits are.
What resources are involved? (not just money, but knowledge, cooperation of others, etc.)
What specifically is supposed to change or be different as a result of the program doing what it does? (Usually multiple outcomes.)
Describe Your Program’s Primary Outcomes/Changes
Consider what will change or be different as a result of the program:
Attitudes
Knowledge
Behavior
Feelings
Competencies/Skills
Stakeholder Analysis
Columns: Name of Person or Organization | Importance (0-3) | Influence (0-3) | Notes/Comments | Total Score
Key: 0 = unknown, 1 = little, 2 = some, 3 = very
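As an illustration (not part of the original worksheet), the total score in a stakeholder-analysis grid like the one above is commonly the sum of the importance and influence ratings. A minimal Python sketch, using hypothetical stakeholder names and ratings:

```python
# Hypothetical stakeholder-analysis scoring sketch.
# Ratings follow the slide's key: 0 = unknown, 1 = little, 2 = some, 3 = very.
stakeholders = [
    {"name": "Program Director", "importance": 3, "influence": 3},
    {"name": "Funder", "importance": 3, "influence": 2},
    {"name": "Community Partner", "importance": 2, "influence": 1},
]

for s in stakeholders:
    # One common convention: total = importance + influence. This is an
    # assumption; some worksheets multiply the two ratings instead, to
    # weight high-importance/high-influence stakeholders more heavily.
    s["total"] = s["importance"] + s["influence"]

# Rank stakeholders by total score, highest first.
ranked = sorted(stakeholders, key=lambda s: s["total"], reverse=True)
for s in ranked:
    print(f'{s["name"]}: total = {s["total"]}')
```

Either convention (sum or product) yields the same ranking here; what matters for the evaluation is that the scoring rule is applied consistently across all stakeholders.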
Part B. Complete a logic model of your program.
Identify 2-3 evaluation questions.
Logic Model Questions
INPUTS: Which resources does the program invest or use? For example: staff, volunteers, time, money, materials, research, background knowledge, equipment, curricula, etc.
OUTPUTS: Which activities, events, actions, etc. does the program employ or implement? What “happens” as the program does what it does? What would an observer see?
OUTCOMES (i.e., changes and results): What are the short-term (relatively immediate, within one year) changes the program makes happen? What specifically will be different because of the implementation of the program? What are the medium-term changes? (Note that the kinds of things that typically change include awareness, attitudes, knowledge, skills, practices, policies, etc.)
IMPACTS: What are the longer-term changes the program/initiative/organization hopes to achieve or influence?
Logic Model Questions (cont.)
CONTEXT: What is/are the need(s) for the program? What specific issues and needs does the program/initiative/organization address?
ASSUMPTIONS: What underlying assumptions does the program hold about how and why it does what it does? What is it about the program (i.e., which features, characteristics, processes, mechanisms, activities, etc.) that makes the desired changes or differences likely to happen (for individuals, for communities, and for other stakeholders)?
Example of Evaluation Questions
What effect(s) did the program have on its participants and stakeholders (e.g., changes in knowledge, attitudes, behavior, skills, and practices)?
Did the program’s activities, actions, and services (i.e., its outputs) deliver high-quality services and resources to stakeholders?
Did the program’s activities, actions, and services raise awareness and provide new and useful knowledge to participants?
Part C. Discuss potential evaluation designs.
Identify sources and kinds of data to be gathered.
Evaluation Design: Quantitative
Non-experimental design: pre- and post-test, a “single group interrupted time series”
  Observation → Treatment → Observation
Experimental design: compare outcomes between “treatment” and “control” groups (random assignment)
  Observation → Treatment → Observation
  Observation → No Treatment → Observation
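A minimal sketch of the non-experimental (single-group pre/post) design above, using hypothetical participant scores; a real evaluation would substitute collected data and, typically, a significance test:

```python
# Illustrative pre/post-test comparison for a single-group design
# (Observation -> Treatment -> Observation). Scores are hypothetical.
pre_scores = [62, 70, 58, 75, 66]
post_scores = [71, 78, 64, 80, 73]

# Per-participant change from pre-test to post-test (same individuals,
# same order in both lists).
changes = [post - pre for pre, post in zip(pre_scores, post_scores)]
mean_change = sum(changes) / len(changes)
print(f"Mean pre-to-post change: {mean_change:.1f} points")
```

Note that without a control group, a pre/post difference alone cannot rule out rival explanations (maturation, external events), which is why the experimental design above compares against a no-treatment group.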
Evaluation Design: Qualitative
Interviews/focus groups with participants (parents and community members)
Observations of program activities
Document analysis
Case studies
Narratives/stories
Part D. Prepare for data collection and analysis (Consider who, what, when). Discuss evaluation reports. Discuss use of evaluation findings and distribution of report.
Preparing for the Evaluation
Choose evaluation methods and data sources
Identify staff and other evaluation implementers
Consider issues of confidentiality and anonymity
Develop questionnaires, interview protocols, focus group protocols, and observation protocols
Choose or develop tests or other learning assessment tools
Choose a sample of the population
Data Collection
Communicate with study participants, teachers, and parents about the evaluation and data collection
Conduct interviews, focus groups, surveys, and observations
Collect student and teacher artifacts and other data/evidence (e.g., test scores)
Data Analysis
Analyze quantitative data
Analyze qualitative data
Write user-friendly reports and presentations
Explain what the data mean and/or indicate
Data Reporting
Distribute preliminary evaluation report (get feedback)
Provide brief summaries to key stakeholders
Write final evaluation report
Disseminate report and findings to key stakeholders
Remember: Typical Methods and Tools
Interviews (individual and focus group)
Surveys and questionnaires
Observations
Review of existing data/records
Collection and statistical analysis of existing or newly gathered quantitative data
Remember: What are the Potential Sources of Information?
Interviews (telephone and/or in-person), focus groups
Program documents (records, logs, monitoring notes, ledgers, etc.)
Surveys, quantitative/statistical data (existing or to-be-collected)
Observations of program implementation
Testimony of experts
Key question: What are the most accessible and cost-effective sources of information about program participants?
Remember: What will be the “Product” or “Deliverable” of the Evaluation?
How are evaluation findings to be presented and disseminated?
Written report
Briefing paper
Public presentation
White paper/journal article/web resource
Remember: Which Information will Stakeholders Find Most Useful/Valuable?
Statistical evidence of a program’s impact (graphs, tables, charts), i.e., quantitative accounts
Stories, narratives, and illustrations (narratives about individuals’, groups’, communities’, and/or organizations’ changes, challenges, and successes), i.e., qualitative accounts
BOTH quantitative and qualitative evidence of outcomes, changes, and challenges, i.e., mixed methods
“What is not started today is never finished tomorrow.” -Johann Wolfgang von Goethe
“Do not wait until the conditions are perfect to begin. Beginning makes the conditions perfect.” -Alan Cohen
Questions?
info@bradroseconsulting.com
617-512-4709
Online resources: