Session 3: DATA ANALYSIS
How Do You Know When Your Programs Really Work? Evaluation Essentials for Program Managers
Anita M. Baker, Ed.D., Evaluation Services
Hartford Foundation for Public Giving, Nonprofit Support Program: BEC
Bruner Foundation
These materials are for the benefit of any 501(c)(3) organization. They MAY be used in whole or in part provided that credit is given to the Bruner Foundation. They may NOT be sold or redistributed in whole or in part for a profit. Copyright © by the Bruner Foundation 2012.
* Please see supplementary materials for a sample agenda, activities, and handouts.
Bruner Foundation, Rochester, New York
How to Use the Bruner Foundation Evaluation Essentials for Program Managers PowerPoint Slides

The Evaluation Essentials for Program Managers slides were developed as part of a Bruner Foundation special project by evaluation trainer Anita Baker (Evaluation Services) and jointly sponsored by the Hartford Foundation for Public Giving. They were tested initially with a single organization in Rochester, NY (Lifespan) as part of the Evaluation Support Project. The materials were then revised and re-tested with three nonprofit organizations as part of the Anchoring Evaluation project.

The slides, intended for use in organizations that have already participated in comprehensive evaluation training, include key basic information about evaluation planning, data collection, and analysis in three separate presentations. Organization officials or evaluation professionals working with nonprofit organization managers are encouraged to review the slides, modify their order, and add or remove content according to training needs. (Please note that this final session includes general information about analysis planning, analysis of both quantitative and qualitative data, and presentation of findings. Specific strategies related to data collection, i.e., analysis of survey data or interview data, and information about development of tables and graphs are included in the supplementary PowerPoint presentation.)

Additional Materials: To supplement these slides there are sample agendas, supporting materials for activities, and other handouts. There are "placeholder" slides, showing a target with an arrow in the bullseye, that signify places where activities can be undertaken; be sure to move or eliminate these depending on the planned agenda. Other, more detailed versions of the Evaluation Essentials materials are also available in Participatory Evaluation Essentials: An Updated Guide for Nonprofit Organizations and Their Evaluation Partners and the accompanying 6-session slide presentation. These materials are available on the Bruner Foundation and Evaluation Services websites free of charge.

Whether you are an organization leader or an evaluation professional working to assist nonprofit organization staff, we hope that the materials provided here will support your efforts. When you have finished using the Evaluation Essentials for Program Managers series, have trainees take our survey.

Bruner Foundation, Rochester, New York
What is Evaluation Anyway?
Program Evaluation: Thoughtful, systematic collection and analysis of information about activities, characteristics, and outcomes of programs, for use by specific people, to reduce uncertainties and inform decisions.
Participatory Evaluation: Trained evaluation personnel and practice-based decision-makers coming together to learn about, design, conduct, and use results of program evaluation.
How are evaluation data collected?
Interviews, surveys, observations, and record reviews. All have limitations and benefits, and all require preparation on the front end:
Instrument development and testing
Administration plan development
Analysis plan development
Evaluation Data Collection Options
Qualitative Data
Interviews: Conducting guided conversations with key people knowledgeable about a subject
Focus Groups: Facilitating a discussion about a particular issue/question among people who share common characteristics
Observations: Documenting visible manifestations of behavior or characteristics of settings

Quantitative Data
Surveys: Administering a structured series of questions with discrete choices
Record Review: Collecting and organizing data about a program or event and its participants from outside sources
External Record Review: Utilizing quantitative data that can be obtained from existing sources

We will discuss surveys, observations, focus groups, interviews, and record reviews in depth.
Surveys: Series of items with pre-determined response choices
Can be completed by an administrator or by respondents. Can be conducted "paper/pencil," by phone, via internet (e-survey), or using alternative strategies. Instruments are called surveys, "evaluations," or questionnaires.
USE SURVEYS TO:
Study attitudes and perceptions
Collect self-reported assessment of changes in response to program
Collect program assessments
Collect some behavioral reports
Test knowledge
Determine changes over time
[Slide graphic: PRE / POST / GRAND CLAIMS]
Interviews: A one-sided conversation with questions mostly pre-determined, but open-ended; the respondent answers in their own terms.
Can be conducted in person or by phone, one-on-one or in groups. Instruments are called protocols, schedules, or guides.
USE INTERVIEWS TO:
Study attitudes and perceptions
Collect self-reported assessment of changes in response to program
Collect program assessments
Document program implementation
Determine changes over time
Observations: Conducted to view and hear actual program activities, so users of reports will know what events occur and how.
Can be focused on programs overall, on participants, or on pre-selected features. Instruments are called protocols, guides, or checklists.
USE OBSERVATIONS TO:
Document program implementation
Witness levels of skill/ability, program practices, behaviors
Determine changes over time
Record Reviews: Accessing existing internal information, or information collected for other purposes.
Can be focused on your own records, the records of other organizations, or adding questions to existing documents. Instruments are called protocols.
USE RECORD REVIEWS TO:
Collect some behavioral reports
Conduct tests, collect test results
Verify self-reported data
Determine changes over time
What happens after data are collected?
Data are analyzed, results are summarized. Findings must be converted into a format that can be shared with others. Action steps should be developed from findings. “Now that we know _____ we will do _____.”
Important Data-Related Terms
Data can exist in a variety of forms:
Records: numbers or text on pieces of paper
Digital/computer: bits and bytes stored electronically
Memory: perceptions, observations, or facts stored in a person's mind

Other important distinctions and terms:
Qualitative v. Quantitative
Primary v. Secondary Data
Variables (Items)
Unit of Analysis
Duplicated v. Unduplicated
Unit Record (Client-level) v. Aggregated

Analysis may reach logical conclusions rather than definitive answers.
Plan your Analysis in Advance!
What procedures will be conducted with each set of data, and who will do them?
How will data be coded and recoded?
How will data be disaggregated (i.e., "broken out," for example by participant characteristics or time)?
How will missing data be handled?
What analytical strategies or calculations will be performed (e.g., frequencies, cross-tabs)?
How will comparisons be made?
Whether/which statistical testing is needed?

Notes: Analytical procedures include, for example, frequencies (the percent that experienced a certain factor or result) and comparisons among groups. Participant characteristics may include age, gender, and race/ethnicity; we will cover ways to break down your data in more detail later in this presentation. For example, Danielle may have a target for how many or what percentage of the moms will agree to breastfeed upon delivery; when analyzing the number of moms breastfeeding after delivery, you will want to compare the result to that target.
There is no single process!
Analysis Plan Specifics: You Must Decide . . .
What procedures will be conducted with each set of data and who will do them.
How data will be grouped or partitioned.
What types of codes will be applied to the data.
How comparisons will be made:
  Data to other project data (within group)
  Data to expectations
  Data to data from other sources (across groups)
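One way to make these decisions concrete is to write the plan down as a small structured spec before any analysis starts. The sketch below is a hypothetical illustration only; the file name, variable names, codes, groupings, and comparisons are made up and would be replaced by your program's own.

```python
# A hypothetical analysis plan written as a plain Python dict.
# Every name and value below is illustrative, not prescriptive.
analysis_plan = {
    "dataset": "participant_records.csv",          # hypothetical file
    "analyst": "program manager",
    "recodes": {
        "breastfeeding": {"Yes": 1, "No": 0, "": None},   # blank treated as missing
    },
    "disaggregate_by": ["gender", "age_group", "site"],
    "missing_data": "exclude from valid percentages; report N for each item",
    "calculations": ["frequencies", "valid percentages", "cross-tabs"],
    "comparisons": [
        "results vs. program target",
        "within group: intake vs. exit",
        "across groups: site A vs. site B",
    ],
    "statistical_tests": "chi-square only if generalizing beyond participants served",
}
```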
Analyzing (Quantitative) Data: A Few Important Terms*
Case: individual record (e.g., 1 participant, 1 day, 1 activity)
Demographics: descriptive characteristics (e.g., gender)
Disaggregate: to separate or group information (e.g., to look at data for males separately from females); conducting cross-tabs is one strategy for disaggregating data
Duplicated v. Unduplicated: e.g., counting the number of individuals at events is a duplicated count; counting the number of events for each individual is unduplicated
Partition (v.): another term that means disaggregate
Unit of Analysis: the major entity of the analysis, i.e., the what or the whom being studied (e.g., participants, groups, activities)
Unit Record (i.e., client level) v. Aggregate (i.e., group level)
Variable: something that changes (e.g., number of hours of attendance)
*common usage
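A small sketch of the duplicated/unduplicated distinction, using pandas with a made-up attendance table; the participant IDs and events are hypothetical.

```python
# Duplicated vs. unduplicated counts: the same people attend several events.
import pandas as pd

attendance = pd.DataFrame({
    "participant_id": [101, 101, 102, 103, 103, 103],   # made-up IDs
    "event":          ["A", "B", "A", "A", "B", "C"],
})

duplicated_count = len(attendance)                           # 6 attendances (unit of analysis: attendance)
unduplicated_count = attendance["participant_id"].nunique()  # 3 individuals (unit of analysis: participant)

# Unit-record (client-level) view: one row per participant
events_per_person = attendance.groupby("participant_id").size()

print(duplicated_count, unduplicated_count)
print(events_per_person)
```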
Quantitative Data Analysis: Basic Steps
1. Organize and arrange data (number cases as needed).
2. Scan data visually.
3. Code data per analysis plan.
4. Enter and verify data.
5. Determine basic descriptive statistics.
6. Recode data as needed (including missing data).
7. Develop created variables.
8. Re-calculate basic descriptive statistics.
9. Conduct other analyses per plan.
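The minimal sketch below walks these steps in pandas. The data frame, column names, codes, and the 20-hour "high intensity" cutoff are all hypothetical stand-ins; in practice you would load your own file (e.g., with pd.read_csv).

```python
import pandas as pd

# 1-2. Organize and scan the data (small in-memory stand-in for a real file)
df = pd.DataFrame({
    "case_id":        [1, 2, 3, 4, 5],
    "q2_helps_care":  ["Yes", "No", "Yes", "Yes", None],
    "hours_attended": [12, 3, None, 25, 8],
})
print(df.head())

# 3-5. Code data per the analysis plan, then get basic descriptives
df["helps_care"] = df["q2_helps_care"].map({"Yes": 1, "No": 0})
print(df.describe(include="all"))

# 6. Recode as needed, including missing data (here: treat missing hours as 0)
df["hours_attended"] = df["hours_attended"].fillna(0)

# 7. Develop created variables (hypothetical intensity cutoff)
df["high_intensity"] = (df["hours_attended"] >= 20).astype(int)

# 8-9. Re-calculate descriptives and run other planned analyses
print(df[["hours_attended", "high_intensity"]].describe())
print(pd.crosstab(df["high_intensity"], df["helps_care"]))
```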
Quantitative Data Analysis Strategies
Important Things to Look at or Summarize
Frequencies: how often a response or status occurs
Total and Valid Percentages: frequency/total * 100
Measures of Central Tendency: mean, median, (mode)
Distribution: minimum, maximum, groups (*iles)
Cross-Tabulations: relationship between two or more variables (also called contingency analyses; can include significance tests such as chi-square analyses)

Useful 2nd-Level Procedures
Means testing (ANOVA, t-tests)
Correlations
Regression analyses
Analyzing Quantitative Data
Important Things to Look at or Summarize

What to Do: Calculate frequencies.
What That Means: Count how many there are of something; count how often something (e.g., a response) occurs.
Example Questions You Could Answer: How many participants were in each group? What were the demographics of participants? How many answered "Yes" to Question 2?

What to Do: Calculate total and/or valid percentages.
What That Means: Frequency/total * 100.
Example Questions You Could Answer: What proportion of participants met intensity targets? What proportion of all those who answered Question 2 said "Yes"?

Notes: Frequencies: of the 30 grandparents taking the survey, 20 said that the support they receive at the center helps them care for their grandchildren. Percentages: that means 67% (20/30 * 100) say the center helps them care for their grandchildren. Ratios: 10 of the 20 women receiving prenatal counseling through the maternity care program chose to breastfeed after delivery. Central tendency: we might say that the mean length of stay in a shelter is 11 days, yet the mode is 7 days (the most common length of stay) and the median is 9 days; out of 30 cases, 15 are below 9 days and 15 are above. Cross-tabs: for example, female participants were more likely to complete the program than male participants. If you are trying to say whether a relationship holds up in the larger population or is just due to chance, you need to move to inferential statistics and run a chi-square test to see whether the relationship is significant; but if you are simply describing the data you have (for example, data on the whole program population), you can use a cross-tab without a significance test.
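A short sketch of frequencies and percentages in pandas, using the grandparent survey numbers from the notes (30 surveys, 20 "Yes"). The second series, with two blank responses, is a hypothetical addition just to show how a valid percentage differs from a total percentage.

```python
import pandas as pd

# Frequencies and percentages (20 of 30 grandparents said "Yes")
responses = pd.Series(["Yes"] * 20 + ["No"] * 10)
print(responses.value_counts())                             # Yes: 20, No: 10
print((responses.value_counts(normalize=True) * 100).round(1))  # Yes: 66.7%

# Valid percentages: hypothetical case where 2 respondents skipped the item.
# value_counts drops missing values by default, so the denominator is the 28 who answered.
with_missing = pd.Series(["Yes"] * 20 + ["No"] * 8 + [None] * 2)
print((with_missing.value_counts(normalize=True) * 100).round(1))
```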
What to Do: Determine central tendencies.
What That Means: Calculate the average (mean), or identify the median (middle value) or mode (most common value). Average = sum of values / total number of values (e.g., total # of hours / total # of people with hours).
Example Questions You Could Answer: What is the average number of hours participants attend? What is the most common number of days attended in a week? (mode)
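A minimal sketch of mean, median, and mode in pandas; the lengths of stay below are made-up values, not the shelter figures from the notes.

```python
import pandas as pd

length_of_stay = pd.Series([7, 7, 7, 9, 9, 10, 11, 12, 14, 21])  # hypothetical days

print(length_of_stay.mean())    # average
print(length_of_stay.median())  # middle value
print(length_of_stay.mode())    # most common value(s); mode() can return more than one
```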
What to Do: Determine distributions.
What That Means: Determine the minimum value, the maximum value, and/or how the data are grouped (e.g., high, medium, or low values, quartiles, percentiles, etc.).
Example Questions You Could Answer: What was the least amount of attendance for the group? What was the most? How many participants fall into low, medium, and high intensity groups?

What to Do: Cross-tabulations (pivot tables are cross-tabs).
What That Means: Relationship between two or more variables (also called contingency analyses; can include significance tests such as chi-square analyses).
Example Questions You Could Answer: Are there relationships between participant characteristics and outcome changes?
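The sketch below shows distributions and a cross-tab in pandas, with an optional chi-square test via scipy. The data frame, column names, and intensity cut points are hypothetical; as the notes say, the significance test is only needed when you want to generalize beyond the data in hand.

```python
import pandas as pd
from scipy.stats import chi2_contingency

df = pd.DataFrame({
    "hours":     [2, 5, 8, 12, 20, 25, 30, 3, 16, 22],                    # made-up data
    "gender":    ["F", "M", "F", "F", "M", "F", "M", "M", "F", "M"],
    "completed": ["No", "No", "Yes", "Yes", "Yes", "Yes", "Yes", "No", "No", "Yes"],
})

# Distribution: minimum, maximum, and hypothetical low/medium/high groupings
print(df["hours"].min(), df["hours"].max())
df["intensity"] = pd.cut(df["hours"], bins=[0, 10, 20, 31], labels=["low", "medium", "high"])
print(df["intensity"].value_counts())

# Cross-tab: is completion related to gender?
table = pd.crosstab(df["gender"], df["completed"])
print(table)

# Optional significance test (chi-square on the contingency table)
chi2, p, dof, expected = chi2_contingency(table)
print(round(p, 3))
```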
Coding and Data Entry
Create codebook(s) as needed (identify codes and affix them to instrument copies).
Create an electronic database when possible (use Excel, SPSS, SAS).
Identify/create unique identifiers for cases and affix or enter as needed.
Enter or extract data as needed (do not recode as data are entered).
Make (electronic or paper) copies of your data.

Notes: For a survey codebook, for example, mark the codes on a master copy of the survey and then write codes onto the completed surveys according to the responses. Create unique IDs if preserving anonymity. Copies of data are for editing, cutting and pasting, etc.; store your master copy of the data away.
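A minimal sketch of a codebook applied in software: survey response text is mapped to numeric codes and each case gets a unique ID, while an untouched copy of the raw data is kept. The question names, response options, and codes are hypothetical.

```python
import pandas as pd

# Hypothetical codebook: response text -> numeric code, one mapping per item
codebook = {
    "q1_satisfaction": {"Very satisfied": 4, "Satisfied": 3, "Dissatisfied": 2, "Very dissatisfied": 1},
    "q2_helps_care":   {"Yes": 1, "No": 0},
}

raw = pd.DataFrame({
    "q1_satisfaction": ["Satisfied", "Very satisfied", "Dissatisfied"],
    "q2_helps_care":   ["Yes", "Yes", "No"],
})

coded = raw.copy()                                       # keep the raw master copy untouched
coded.insert(0, "case_id", range(1, len(coded) + 1))     # unique identifier per case
for column, mapping in codebook.items():
    coded[column] = coded[column].map(mapping)           # apply the codebook

print(coded)
```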
Analysis of Qualitative Data
Analytical strategies are similar for qualitative and quantitative data.
Consider how you plan to use findings: who is the audience? What format works best?
Plan your analysis in advance:
How do the data fit within the overall evaluation plan and other data?
How will findings fit in the overall report plan?
How will you code, display, and draw conclusions about the data?
How will you validate/verify and adjust your findings?
Be careful interpreting data!
Steps to Take When Analyzing Qualitative Data
Segment or partition data (i.e., divide it into meaningful analytical units)
Reduce data
Code data
Compare data
Organize, summarize, and display data
Draw conclusions, verify/validate results
Revise summaries and displays accordingly
The process is iterative.
Coding Qualitative Data
A priori or deductive codes: predetermined categories based on accepted theory or program knowledge
Inductive codes: based on raw data (not predetermined)
Hierarchical codes: larger categories with subcategories in each
You can combine inductive and deductive codes within a hierarchical coding scheme.
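One way to write such a scheme down is as a nested structure, with parent categories and subcategories labeled by how they originated. The sketch below is purely illustrative; the categories and labels are hypothetical.

```python
# A hypothetical hierarchical coding scheme mixing a priori and inductive codes.
coding_scheme = {
    "program_experience": {                 # a priori parent category
        "staff_support": "a priori",
        "peer_support": "a priori",
        "scheduling_barriers": "inductive",  # emerged from the raw interview data
    },
    "reported_changes": {                   # a priori parent category
        "knowledge": "a priori",
        "confidence": "inductive",
    },
}
```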
Coding Strategies and Reminders
Keep a master list of codes.
Distinguish a priori and inductive codes.
Re-apply codes to all segments.
Use multiple codes, but keep coding schemes as simple as possible.
Test out sample entries to identify potential problems before finalizing code selections.
Check for inter/intra-coder reliability (consistency).
Coding is not exact (expect differences).
Co-occurring codes (more than one applies).
Face-sheet codes (descriptors).
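A simple way to check inter-coder consistency is percent agreement across segments coded independently by two people. The sketch below uses made-up codes and segments; more formal statistics (such as Cohen's kappa) exist, but simple agreement is often a reasonable first check.

```python
# Percent agreement between two coders on the same five segments (hypothetical codes).
coder_1 = ["staff_support", "peer_support", "barriers", "knowledge", "barriers"]
coder_2 = ["staff_support", "barriers",     "barriers", "knowledge", "barriers"]

agreements = sum(a == b for a, b in zip(coder_1, coder_2))
percent_agreement = agreements / len(coder_1) * 100
print(f"{percent_agreement:.0f}% agreement")   # 80% in this made-up example
```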
Enumeration: A strategy for organizing, summarizing, and displaying qualitative data.
Quantify the frequency of codes* or types.
Use counts to define results (e.g., most responses were positive; all responses fell into 4 categories; the category most exemplified was __________).
* e.g., none, some, a lot, or as a percentage
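A minimal enumeration sketch: counting how often each code was applied across coded segments and expressing the counts as percentages. The coded segments and code names below are hypothetical.

```python
from collections import Counter

# Each inner list holds the codes applied to one interview segment (made-up data).
coded_segments = [
    ["staff_support", "positive"],
    ["peer_support", "positive"],
    ["scheduling_barriers"],
    ["staff_support", "positive"],
    ["staff_support", "scheduling_barriers"],
]

code_counts = Counter(code for segment in coded_segments for code in segment)
total_segments = len(coded_segments)

for code, count in code_counts.most_common():
    print(f"{code}: {count} of {total_segments} segments ({count / total_segments:.0%})")
```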
Negative Findings
Explain the results, what they mean, and why they occurred, if possible.
Clarify how negative the results are.
Don't blame them on bad evaluation.
Clarify the next course of action.
Clarify what did work and for whom.
Avoid a milquetoast approach.
Don't be reluctant to report (if possible).
Inaccurate Findings
Determine the cause.
Disseminate errata if necessary, or recall the report.
Communicate with stakeholders about why the results will not be usable.
Inconclusive Findings
Present in an unbiased fashion.
Indicate that conclusions cannot be drawn.
Develop a plan to correct evaluation or program problems if necessary.
Positive Findings
Explain the results, what they mean, and why they occurred, if possible.
Clarify how positive the results are, who the program worked for, and how.
Don't distrust positive results (but be careful to avoid biased designs).
Report positive results and celebrate accomplishments.
Clarify the next course of action.
Resist making assumptions about the next iteration.
Design careful follow-up.
Evaluation Reporting: Initial Steps
1. Clearly identify your audience. Staff? Funders? Board? Participants? Multiple audiences?
2. Determine what presentation strategies work best: PowerPoint, newsletter, fact sheet, oral presentation, visual displays, video, storytelling, press releases, or a report (full report, executive summary, stakeholder-specific report)?
Components of A Strong Program Evaluation Report
Introduction: Description of the subject program. Clear statement about the evaluation questions and the purpose of the evaluation.
Methods: Description of actual data collection methods.
Findings: Summary of key findings (including tables, graphs, vignettes, quotes, etc.).
Conclusions: Discussion or explanation of the meaning and importance of key findings. Suggested action steps. Next steps (for the program and the evaluation). Issues for further consideration (loose ends).
Think About Communication Strategies
Are there natural opportunities for sharing (preliminary) findings with stakeholders?
At a special convening
At regular or pre-planned meetings
During regular work interactions (e.g., clinical supervision, staff meetings, board meetings)
Via informal discussions
Additional Reporting Tips
Convert findings to shareable form(s).
Think about internal and external reporting.
Plan for multiple reports.
Before you start writing, be sure to develop an outline and pass it by some stakeholders.
If you're commissioning an evaluation report, ask to see a report outline in advance.
Review the evaluation reports of others carefully for the important components and their meaningfulness.
Before You Present Your Findings, Answer These Questions
Do your findings accurately reflect the data you collected?
How might your interpretation be inaccurate?
Are there any unintended consequences that might result from sharing these findings?
Are there any missing voices you overlooked?
Sharing Findings: ‘ate’ Steps
Deliberate: Spend time with people close to the evaluation work and confirm the findings. You must convince yourselves first.
Anticipate: Determine how you want to use the findings and what value might be derived from them for the program/process.
Investigate: Once you have findings, test them with key stakeholders. They will shed light on the perceived value of the findings.
Calibrate: Develop a result-sharing mechanism that can convey the message you want to convey to your chosen audience.
Illuminate: Remove any unnecessary details and highlight the key findings.
Substantiate: Take a step away from the work and come back to it later with fresh eyes. Ask yourself, "Do the findings still resonate?"
Annotate: Proofread the final draft. Misteaks can distract from results.
Communicate: Share the results!

Notes: What constitutes evidence? This may vary based on your audience. Returning to Rodney's CRE presentation: whose perspectives are accepted as credible evidence? Credible to whom? Once the report or presentation is nearly final, seek external feedback from someone not heavily involved in the program or initiative. You should also circle back to select staff and partners with whom you shared preliminary findings for additional input on the final product. You will want feedback both on your findings and on the way you are presenting them.