
1 How Do You Know When Your Programs Really Work?
Evaluation Essentials for Program Managers
Session 2: DATA COLLECTION
Anita M. Baker, Ed.D., Evaluation Services
Hartford Foundation for Public Giving, Nonprofit Support Program: BEC
Bruner Foundation

2 These materials are for the benefit of any 501(c)(3) organization. They MAY be used in whole or in part, provided that credit is given to the Bruner Foundation. They may NOT be sold or redistributed, in whole or in part, for a profit. Copyright © 2012 by the Bruner Foundation. *Please see the supplementary materials for a sample agenda, activities, and handouts. Bruner Foundation, Rochester, New York

3 How to Use the Bruner Foundation Evaluation Essentials for Program Managers PowerPoint Slides

The Evaluation Essentials for Program Managers slides were developed as part of a Bruner Foundation special project by evaluation trainer Anita Baker (Evaluation Services) and jointly sponsored by the Hartford Foundation for Public Giving. They were tested initially with a single organization in Rochester, NY (Lifespan) as part of the Evaluation Support Project 2010, then revised and re-tested with three nonprofit organizations as part of the Anchoring Evaluation project in 2011-12. The slides, intended for organizations that have already participated in comprehensive evaluation training, present key basic information about evaluation planning, data collection, and analysis in three separate presentations. Organization officials or evaluation professionals working with nonprofit organization managers are encouraged to review the slides, modify the order, and add or remove content according to training needs.

Additional Materials: To supplement these slides there are sample agendas, supporting materials for activities, and other handouts. There are "placeholder" slides, showing only a picture of a target with an arrow in the bullseye, that signify places where activities can be undertaken; be sure to move or eliminate these depending on the planned agenda. Other, more detailed versions of the Evaluation Essentials materials are also available in Participatory Evaluation Essentials: An Updated Guide for Nonprofit Organizations and Their Evaluation Partners and the accompanying six-session slide presentation. These materials are also available on the Bruner Foundation and Evaluation Services websites free of charge.

Whether you are an organization leader or an evaluation professional working to assist nonprofit organization staff, we hope that the materials provided here will support your efforts. When you have finished using the Evaluation Essentials for Program Managers series, have trainees take our survey: https://www.surveymonkey.com/s/EvalAnchoringSurvey

Bruner Foundation, Rochester, New York

4 How do you know when your programs really work? EVALUATION (Review)
Program Evaluation: Thoughtful, systematic collection and analysis of information about the activities, characteristics, and outcomes of programs, for use by specific people, to reduce uncertainties and inform decisions.

5 Logical Considerations (Review)
1. Think about the results you want.
2. Decide what strategies will help you achieve those results.
3. Think about what inputs you need to conduct the desired strategies.
4. Specify outcomes; identify indicators and targets. DECIDE, IN ADVANCE, HOW GOOD IS GOOD ENOUGH.
5. Document how services are delivered.
6. Evaluate actual results (outcomes).

6 Outcomes (Review)
Changes in behavior, skills, knowledge, attitudes, condition, or status.
- Must be: realistic and attainable, related to core business, within the program's sphere of influence
- Outcomes are time sensitive, can be accomplished in multiple ways, and are closely related to program design

7 Indicators (Review)
Specific, measurable characteristics or changes that represent achievement of an outcome.
- Indicators are directly related to the outcome and help define it.
- Indicators are specific, measurable, and observable: they can be seen, heard, or read.
- Most outcomes have more than one indicator; you must identify the set that signals achievement.

8 Targets (Review)
Specify the amount or level of outcome attainment that is expected, hoped for, or required. Targets can be set:
- relative to external standards (when available)
- relative to past performance or similar programs
- based on professional hunches
Targets should be set carefully, in advance, with stakeholder input.
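As a concrete illustration of the "how good is good enough" decision, the minimal sketch below compares an actual result against a target that was set in advance. The target level, participant count, and improvement count are hypothetical numbers, not figures from the original materials.

    # Minimal sketch (hypothetical numbers): compare an actual outcome result
    # against a target that was set in advance, with stakeholder input.
    target_pct = 75.0          # target: at least 75% of participants improve
    participants = 40
    improved = 31              # participants who met the indicator

    actual_pct = 100.0 * improved / participants
    print(f"Actual: {actual_pct:.0f}% vs. target: {target_pct:.0f}%")
    print("Target met" if actual_pct >= target_pct else "Target not met")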

9 Evaluation Question Criteria (Review)
- It is possible to obtain data to address the questions.
- There is more than one possible "answer" to the question.
- The information to address the questions is wanted and needed.
- It is known how resulting information will be used internally (and externally).
- The questions are aimed at changeable aspects of activity.

10 What do you need to do to conduct evaluation?
- Specify key questions
- Specify an approach (develop an evaluation design)
- Apply evaluation logic
- Collect and analyze data
- Summarize and share findings

11 How are evaluation data collected?
- Interviews
- Surveys
- Observations
- Record Reviews
All have limitations and benefits. All require preparation on the front end:
- Instrument development and testing
- Administration plan development
- Analysis plan development

12 How are evaluation data collected?
- Interviews
- Surveys
- Observations
- Record Reviews
All have limitations and benefits. All require preparation on the front end:
- Instrument development and testing
- Administration plan development
- Analysis plan development

13 Interviews:
- A one-sided conversation with questions that are mostly pre-determined, but open-ended; respondents answer in their own terms.
- Can be conducted in person or by phone, one-on-one or in groups.
- Instruments are called protocols, schedules, or guides.
USE INTERVIEWS TO:
- Study attitudes and perceptions
- Collect self-reported assessments of changes in response to the program
- Collect program assessments
- Document program implementation
- Determine changes over time

14 What is Evaluative Thinking?
A type of reflective practice that incorporates use of systematically collected data to inform organizational decisions and other actions.
- Asking questions of substance
- Determining data needed to address questions
- Gathering appropriate data in systematic ways
- Analyzing data and sharing results
- Developing strategies to act on findings

15 Supportive Evaluation Environments
Organizational culture and processes necessary to translate information into action:
- Processes to convert data to findings to action steps
- A culture where learning is rewarded
- Staff with time and resources to engage in evaluation
- Direct engagement of key decision-makers
- Manageable, straightforward evaluation
- Targeted and compelling methods of communication to share results

16 Interview Activity: Focused on Evaluative Thinking

17 How are evaluation data collected?
- Interviews
- Surveys
- Observations
- Record Reviews
All have limitations and benefits. All require preparation on the front end:
- Instrument development and testing
- Administration plan development
- Analysis plan development

18 Surveys:
- A series of items with pre-determined response choices
- Can be completed by an administrator or by respondents
- Can be conducted on paper ("paper/pencil"), by phone, via the internet (e-survey), or using alternative strategies
- Instruments are called surveys, "evaluations," or questionnaires
USE SURVEYS TO:
- Study attitudes and perceptions
- Collect self-reported assessments of changes in response to the program
- Collect program assessments
- Collect some behavioral reports
- Test knowledge
- Determine changes over time
[Slide graphic: PRE / POST; GRAND CLAIMS]

19 Survey Result Example: After School Program Feedback

Table 4a: Percent of Respondents Who Thought Participation in Theatre Classes and the Spring Production Helped* Them in the Following Ways

                                                 9th Grade (n=71)   10/11th Grade (n=97)
  Work collaboratively with others               90% (41%)          95% (58%)
  Try new things                                 85% (37%)          96% (58%)
  Listen actively                                84% (37%)          89% (55%)
  See a project through from beginning to end    79% (32%)          81% (39%)
  Learn to value others' viewpoints              71% (33%)          78% (29%)
  Become more confident in front of others       68% (35%)          82% (46%)
  Use an expanded vocabulary                     67% (21%)          72% (28%)
  With memorization                              63% (29%)          78% (40%)
  Express themselves with words                  63% (16%)          83% (35%)

* Includes the percent who indicated they were helped somewhat and a lot. Pink shading in the original slide indicates items where more than 75% of respondents indicated they were helped.
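A summary like the one in Table 4a can be tabulated directly from raw survey responses. The sketch below is a minimal illustration in Python with pandas; the data layout, item names, grade grouping, and response scale are assumptions for demonstration, not the actual instrument.

    # Minimal sketch (hypothetical data): percent of respondents who said an
    # item helped them "somewhat" or "a lot", split by grade group, in the
    # spirit of Table 4a.
    import pandas as pd

    responses = pd.DataFrame({
        "grade_group": ["9th", "9th", "10/11th", "10/11th"],
        "work_collaboratively": ["a lot", "somewhat", "a lot", "not at all"],
        "try_new_things": ["somewhat", "not at all", "a lot", "a lot"],
    })

    items = ["work_collaboratively", "try_new_things"]

    # True wherever a respondent reported being helped at least somewhat.
    helped = responses[items].isin(["somewhat", "a lot"])
    helped["grade_group"] = responses["grade_group"]

    # The mean of a boolean column is the proportion True; convert to percent.
    summary = helped.groupby("grade_group")[items].mean().mul(100).round(0)
    print(summary)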

20 Things to Think About Before Administering a Survey
- Target group: who, where, sampling?
- Respondent assistance; A/P consent
- Type of survey; frequency of administration
- Anonymity vs. confidentiality
- Specific fielding strategies; incentives?
- Time needed for response
- Tracking administration and response (see the sketch below)
- Data analysis plans
- Storing data and maintaining confidentiality
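For the tracking item above, a minimal sketch of what administration and response tracking might look like follows. The sites, counts, and the 60% follow-up threshold are hypothetical, not part of the original materials.

    # Minimal sketch (hypothetical sites and counts): log surveys distributed
    # and returned per site so low response rates surface early.
    from dataclasses import dataclass

    @dataclass
    class SurveyWave:
        site: str
        distributed: int
        returned: int

        @property
        def response_rate(self) -> float:
            return self.returned / self.distributed if self.distributed else 0.0

    waves = [
        SurveyWave("Site A", distributed=120, returned=84),
        SurveyWave("Site B", distributed=95, returned=52),
    ]

    for w in waves:
        flag = "  <-- follow up" if w.response_rate < 0.60 else ""
        print(f"{w.site}: {w.response_rate:.0%} response "
              f"({w.returned}/{w.distributed}){flag}")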

21 Survey Activity: Find Errors in Mock Survey

22 How are evaluation data collected?
- Interviews
- Surveys
- Observations
- Record Reviews
All have limitations and benefits. All require preparation on the front end:
- Instrument development and testing
- Administration plan development
- Analysis plan development

23 Observations:
- Observations are conducted to view and hear actual program activities.
- Users of reports will know what events occur and how.
- Can be focused on programs overall, on participants, or on pre-selected features.
- Instruments are called protocols, guides, or checklists.
USE OBSERVATIONS TO:
- Document program implementation
- Witness levels of skill/ability, program practices, and behaviors
- Determine changes over time

24

25 How are evaluation data collected?
- Interviews
- Surveys
- Observations
- Record Reviews
All have limitations and benefits. All require preparation on the front end:
- Instrument development and testing
- Administration plan development
- Analysis plan development

26 Record Reviews:
- Accessing existing internal information, or information collected for other purposes.
- Can be focused on your own records, the records of other organizations, or adding questions to existing documents.
- Instruments are called protocols.
USE RECORD REVIEWS TO:
- Collect some behavioral reports
- Conduct tests, collect test results
- Verify self-reported data
- Determine changes over time

27 Collecting Record Review Data
- Review existing data collection forms (suggest modifications or use of new forms if possible).
- Develop a code book, or at least a data element list keyed to the data collection forms (see the sketch below).
- Develop a "database" for record review data.
- Develop an analysis plan with mock tables for record review data.
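A code book and "database" like those described above can be very simple. The sketch below shows one hypothetical way to set them up in Python with pandas; the field names, form item numbers, and numeric codes are illustrative assumptions, not taken from an actual intake form.

    # Minimal sketch (hypothetical fields): a data element list / code book
    # keyed to an intake form, plus an empty "database" that receives record
    # review data as cases are abstracted.
    import pandas as pd

    code_book = {
        "participant_id":     "Unique ID assigned at intake (intake form, item 1)",
        "agency":             "Agency code: CD, REF, MHA, MS, or CENTRAL (item 2)",
        "age_at_intake":      "Age in years on the intake date (item 3)",
        "primary_disability": "1=Neurological, 2=Developmental/Cognitive, 3=Physical, "
                              "4=Chronic Disease/Illness, 5=Psychiatric, 6=Sensory, "
                              "7=Other (item 7)",
    }

    # One column per data element; one row is added per record reviewed.
    records = pd.DataFrame(columns=list(code_book.keys()))
    print(records)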

28 Record Review Analysis Example (mock table, before data are entered)

  AGENCY                           CD    REF    MHA    MS    CENTRAL    TOTAL
  Number of Participants
  AGE AT INTAKE (convert to %s)
    17 and Younger
    18-21
    22-34
    35-49
    50-64
    65 and Older
  PRIMARY DISABILITY (%s)
    Neurological
    Developmental/Cognitive
    Physical
    Chronic Disease/Illness
    Psychiatric
    Sensory
    Other

29 Record Review Example: Descriptive

  AGENCY                           CD     REF    MHA    MS     CENTRAL    TOTAL
  Number of Participants           32     45     33     43     157        310
  AGE AT INTAKE
    17 and Younger                 3%     4%     0      0      10%        7%
    18-21                          0      13%    0      0      47%        20%
    22-34                          13%    29%    19%    7%     18%        17%
    35-49                          39%    27%    34%    40%    28%        30%
    50-64                          36%    22%    38%    47%    19%        23%
    65 and Older                   10%    4%     9%     7%     0          4%
  PRIMARY DISABILITY
    Neurological                   22%    60%    3%     98%    0          27%
    Developmental/Cognitive        19%    31%    0      0      78%        43%
    Physical                       6%     0      0      0      --         2%
    Chronic Disease/Illness        3%     0      0      0      --         1%
    Psychiatric                    19%    4%     97%    0      11%        19%
    Sensory                        9%     2%     0      0      --         1%
    Other                          22%    2%     0      --     7%         6%
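The percentages in the descriptive table above are simply counts converted to column percents within each agency. A minimal sketch, using made-up records rather than the actual data, of how that conversion could be done in pandas:

    # Minimal sketch (made-up records): convert record review counts into the
    # percent of each agency's participants in each age band, plus a TOTAL
    # column across all agencies combined.
    import pandas as pd

    records = pd.DataFrame({
        "agency":   ["CD", "CD", "REF", "MHA", "CENTRAL", "CENTRAL"],
        "age_band": ["35-49", "50-64", "22-34", "35-49", "18-21", "35-49"],
    })

    # Cross-tabulate age band by agency; normalize within each agency column.
    age_by_agency = pd.crosstab(records["age_band"], records["agency"],
                                normalize="columns").mul(100).round(0)

    # TOTAL column: distribution across all participants regardless of agency.
    age_by_agency["TOTAL"] = (records["age_band"]
                              .value_counts(normalize=True)
                              .mul(100).round(0))
    print(age_by_agency)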

30 Record Review Example: Evaluative

31 Sources of Record Review Data

Available Administrative Data:
- Intake forms
- Attendance rosters
- Program logs (e.g., daily activity descriptions)
- Evaluation forms (e.g., customer satisfaction surveys, session assessments)
- Case files or case management data (these may include both internal data, such as progress toward internally established goals, and external data, such as reports about a participant's living arrangements, employment, or childbearing status)
- Exit or follow-up data
- Assessments (these may also include both internal data, such as culminating knowledge measurements at the end of a cycle, and external data, such as test scores, report card grades, scale scores on a behavioral scale, or medical or substance use test results)

Other Extant Data:
- Census data -- available on the internet, in libraries, or by demand from marketing firms
- Vital statistics -- also available on the internet, in libraries, and from local health departments
- Topical outcome data -- e.g., crime statistics, birth outcomes, juvenile arrest data
- KIDS COUNT child well-being indicators
- National survey data -- e.g., NELS, NLS, YRBS
- Community profile data
- UI (unemployment insurance) data

32 Record Review Activity: Identify data elements from extant data

33 What happens after data are collected?
1. Data are analyzed and results are summarized.
2. Findings must be converted into a format that can be shared with others.
3. Action steps should be developed from findings: "Now that we know _____, we will _____."

34 Increasing Rigor in Program Evaluation
- Mixed methodologies
- Multiple perspectives / sources of data
- Multiple points in time
Validity and Reliability
[Slide graphic: Reliable, not Valid | Valid, not Reliable | Neither Valid nor Reliable | Valid and Reliable]

35

