
How Do You Know When Your Programs Really Work? Evaluation Essentials for Program Managers Session 2: DATA COLLECTION Anita M. Baker, Ed.D. Evaluation Services Hartford Foundation for Public Giving, Nonprofit Support Program: BEC Bruner Foundation

These materials are for the benefit of any 501(c)(3) organization. They MAY be used in whole or in part provided that credit is given to the Bruner Foundation. They may NOT be sold or redistributed in whole or in part for a profit. Copyright © by the Bruner Foundation 2012.
* Please see supplementary materials for a sample agenda, activities, and handouts.
Bruner Foundation, Rochester, New York

How to Use the Bruner Foundation Evaluation Essentials for Program Managers PowerPoint Slides

The Evaluation Essentials for Program Managers slides were developed as part of a Bruner Foundation special project by evaluation trainer Anita Baker (Evaluation Services) and jointly sponsored by the Hartford Foundation for Public Giving. They were tested initially with a single organization in Rochester, NY (Lifespan) as part of the Evaluation Support Project. The materials were then revised and re-tested with three nonprofit organizations as part of the Anchoring Evaluation project. The slides, intended for use in organizations that have already participated in comprehensive evaluation training, present key basic information about evaluation planning, data collection, and analysis in three separate presentations. Organization officials or evaluation professionals working with nonprofit organization managers are encouraged to review the slides, modify the order, and add or remove content according to training needs.

Additional Materials: To supplement these slides there are sample agendas, supporting materials for activities, and other handouts. There are "placeholder" slides, showing only a picture of a target with an arrow in the bullseye, that signify places where activities can be undertaken; be sure to move or eliminate these depending on the planned agenda. Other, more detailed versions of the Evaluation Essentials materials are also available in Participatory Evaluation Essentials: An Updated Guide for Nonprofit Organizations and Their Evaluation Partners and the accompanying 6-session slide presentation. These materials are also available free of charge on the Bruner Foundation and Evaluation Services websites.

Whether you are an organization leader or an evaluation professional working to assist nonprofit organization staff, we hope that the materials provided here will support your efforts. When you have finished using the Evaluation Essentials for Program Managers series, have trainees take our survey.

Bruner Foundation, Rochester, New York

How do you know when your programs really work? ... EVALUATION (Review)

Program Evaluation: Thoughtful, systematic collection and analysis of information about the activities, characteristics, and outcomes of programs, for use by specific people, to reduce uncertainties and inform decisions.

Logical Considerations (Review)
1. Think about the results you want.
2. Decide what strategies will help you achieve those results.
3. Think about what inputs you need to conduct the desired strategies.
4. Specify outcomes, identify indicators and targets. (Decide in advance: how good is good enough?)
5. Document how services are delivered.
6. Evaluate actual results (outcomes).

Outcomes (Review)
Changes in behavior, skills, knowledge, attitudes, condition, or status.
• Must be realistic and attainable, related to core business, and within the program's sphere of influence.
• Outcomes are time-sensitive, can be accomplished in multiple ways, and are closely related to program design.

Indicators (Review)
Specific, measurable characteristics or changes that represent achievement of an outcome.
• Indicators are directly related to the outcome and help define it.
• Indicators are specific, measurable, and observable: they can be seen, heard, or read.
• Most outcomes have more than one indicator; you must identify the set that signals achievement.

Targets (Review)
Specify the amount or level of outcome attainment that is expected, hoped for, or required. Targets can be set...
• Relative to external standards (when available)
• Relative to past performance or similar programs
• Based on professional hunches
Targets should be set carefully, in advance, with stakeholder input.

Evaluation Question Criteria (Review)
• It is possible to obtain data to address the questions.
• There is more than one possible "answer" to the question.
• The information to address the questions is wanted and needed.
• It is known how the resulting information will be used internally (and externally).
• The questions are aimed at changeable aspects of activity.

What do you need to do to conduct evaluation?
• Specify key questions
• Specify an approach (develop an evaluation design)
• Apply evaluation logic
• Collect and analyze data
• Summarize and share findings

How are evaluation data collected?
• Interviews
• Surveys
• Observations
• Record Reviews
All methods have limitations and benefits, and all require preparation on the front end:
• Instrument development and testing
• Administration plan development
• Analysis plan development

Interviews:
• A one-sided conversation with questions mostly pre-determined, but open-ended.
• Respondents answer in their own terms.
• Can be conducted in person or by phone, one-on-one or in groups.
• Instruments are called protocols, schedules, or guides.
USE INTERVIEWS TO:
• Study attitudes and perceptions
• Collect self-reported assessments of changes in response to a program
• Collect program assessments
• Document program implementation
• Determine changes over time

What is Evaluative Thinking?
A type of reflective practice that incorporates the use of systematically collected data to inform organizational decisions and other actions:
• Asking questions of substance
• Determining the data needed to address questions
• Gathering appropriate data in systematic ways
• Analyzing data and sharing results
• Developing strategies to act on findings

Supportive Evaluation Environments
Organizational culture and processes necessary to translate information into action:
• Processes to convert data to findings to action steps
• A culture where learning is rewarded
• Staff with time and resources to engage in evaluation
• Direct engagement of key decision-makers
• Manageable, straightforward evaluation
• Targeted and compelling methods of communication to share results

Interview Activity: Focused on Evaluative Thinking

How are evaluation data collected?
• Interviews
• Surveys
• Observations
• Record Reviews
All methods have limitations and benefits, and all require preparation on the front end:
• Instrument development and testing
• Administration plan development
• Analysis plan development

Surveys:
• A series of items with pre-determined response choices.
• Can be completed by an administrator or by respondents.
• Can be conducted on paper ("paper/pencil"), by phone, on the internet (e-survey), or using alternative strategies.
• Instruments are called surveys, "evaluations," or questionnaires.
USE SURVEYS TO:
• Study attitudes and perceptions
• Collect self-reported assessments of changes in response to a program
• Collect program assessments
• Collect some behavioral reports
• Test knowledge
• Determine changes over time (PRE → POST; avoid grand claims)

Survey Result Example: After-School Program Feedback

Table 4a: Percent of Respondents Who Thought Participation in Theatre Classes and the Spring Production Helped* Them in the Following Ways

                                               9th Grade (n=71)   10th/11th Grade (n=97)
Work collaboratively with others               90% (41%)          95% (58%)
Try new things                                 85% (37%)          96% (58%)
Listen actively                                84% (37%)          89% (55%)
See a project through from beginning to end    79% (32%)          81% (39%)
Learn to value others' viewpoints              71% (33%)          78% (29%)
Become more confident in front of others       68% (35%)          82% (46%)
Use an expanded vocabulary                     67% (21%)          72% (28%)
With memorization                              63% (29%)          78% (40%)
Express themselves with words                  63% (16%)          83% (35%)

* Includes the percent who indicated they were helped somewhat and a lot. Pink shading (in the original slide) indicates items where more than 75% of respondents indicated they were helped.
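
Percentages like those in Table 4a come from a simple tabulation of coded responses. The sketch below, in Python, shows that tabulation on invented data; the items, the response categories ("a lot," "somewhat," "not at all"), and the reading of the parenthetical figure as the stricter "a lot only" cut are assumptions made for illustration, not details taken from the actual survey.

```python
# Minimal sketch of the tabulation behind a "percent helped" table.
# All response data, item names, and category labels are hypothetical.

RESPONSES = {
    # item -> coded answers from one respondent group
    "Work collaboratively with others": ["a lot", "somewhat", "not at all", "a lot", "somewhat"],
    "Try new things": ["somewhat", "somewhat", "a lot", "not at all", "a lot"],
}

def percent_helped(answers, levels=("somewhat", "a lot")):
    """Percent of respondents whose answer falls within the given levels."""
    return round(100 * sum(1 for a in answers if a in levels) / len(answers))

for item, answers in RESPONSES.items():
    overall = percent_helped(answers)                   # helped somewhat or a lot
    a_lot = percent_helped(answers, levels=("a lot",))  # helped "a lot" only (assumed meaning of the parentheses)
    print(f"{item}: {overall}% ({a_lot}%)")
```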

Things to Think About Before Administering a Survey
• Target group: who, where, sampling?
• Respondent assistance; A/P consent
• Type of survey; frequency of administration
• Anonymity vs. confidentiality
• Specific fielding strategies; incentives?
• Time needed for response
• Tracking administration and response (see the sketch below)
• Data analysis plans
• Storing data and maintaining confidentiality
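
For the tracking item flagged above, a small script or spreadsheet can serve as the administration tracker. This is a hypothetical sketch: the study IDs, dates, and field names are invented, and it assumes a confidential (not anonymous) survey, so completion status can be tied to an ID that is stored separately from the answers.

```python
# Hypothetical fielding tracker for a confidential (not anonymous) survey.
# Kept separate from the answer file so individual responses stay confidential.

from datetime import date

tracker = {
    "ID-001": {"sent": date(2024, 5, 1), "reminders": 1, "completed": True},
    "ID-002": {"sent": date(2024, 5, 1), "reminders": 2, "completed": False},
    "ID-003": {"sent": date(2024, 5, 2), "reminders": 0, "completed": True},
}

administered = len(tracker)
completed = sum(1 for record in tracker.values() if record["completed"])
print(f"Administered: {administered}  Completed: {completed}  "
      f"Response rate: {100 * completed / administered:.0f}%")
```

For an anonymous survey, the same tally would be kept by administration batch rather than by individual ID.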

Survey Activity: Find Errors in Mock Survey

How are evaluation data collected?
• Interviews
• Surveys
• Observations
• Record Reviews
All methods have limitations and benefits, and all require preparation on the front end:
• Instrument development and testing
• Administration plan development
• Analysis plan development

Observations:
• Observations are conducted to view and hear actual program activities.
• Users of reports will know what events occur and how.
• Can be focused on programs overall, on participants, or on pre-selected features.
• Instruments are called protocols, guides, or checklists.
USE OBSERVATIONS TO:
• Document program implementation
• Witness levels of skill/ability, program practices, and behaviors
• Determine changes over time

How are evaluation data collected?
• Interviews
• Surveys
• Observations
• Record Reviews
All methods have limitations and benefits, and all require preparation on the front end:
• Instrument development and testing
• Administration plan development
• Analysis plan development

Record Reviews:
• Accessing existing internal information, or information collected for other purposes.
• Can be focused on your own records, the records of other organizations, or questions added to existing documents.
• Instruments are called protocols.
USE RECORD REVIEWS TO:
• Collect some behavioral reports
• Conduct tests, collect test results
• Verify self-reported data
• Determine changes over time

Collecting Record Review Data
• Review existing data collection forms (suggest modifications or use of new forms if possible).
• Develop a code book, or at least a data element list, keyed to the data collection forms (see the sketch below).
• Develop a "database" for record review data.
• Develop an analysis plan with mock tables for record review data.
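
As a concrete illustration of the steps above, here is a minimal sketch of a data element list ("code book"), a simple record-review "database," and one planned analysis (an age-at-intake distribution like the one in the mock table that follows). Every field name, code, and value here is a hypothetical placeholder rather than an element from an actual intake form.

```python
# Minimal sketch: code book, record-review "database," and a planned tabulation.
# All field names, codes, and values are hypothetical placeholders.

from collections import Counter

CODE_BOOK = {
    "age_at_intake": "integer; age in years recorded on the intake form",
    "primary_disability": "coded category from the intake form (e.g., neurological, psychiatric)",
    "agency": "short code for the agency serving the participant",
}

records = [  # one dict per reviewed record
    {"age_at_intake": 19, "primary_disability": "psychiatric", "agency": "A"},
    {"age_at_intake": 42, "primary_disability": "neurological", "agency": "B"},
    {"age_at_intake": 55, "primary_disability": "developmental", "agency": "A"},
]

def age_group(age):
    """Collapse exact age into the reporting categories planned in the analysis tables."""
    if age <= 17: return "17 and younger"
    if age <= 21: return "18-21"
    if age <= 34: return "22-34"
    if age <= 49: return "35-49"
    if age <= 64: return "50-64"
    return "65 and older"

counts = Counter(age_group(r["age_at_intake"]) for r in records)
for group, n in sorted(counts.items()):
    print(f"{group}: {100 * n / len(records):.0f}% of {len(records)} participants")
```

Keying the code book to the actual data collection forms makes it easy to verify each element against its source as records are reviewed.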

Record Review Analysis Example (mock table)

                             AGENCY:  CD    REF    MHA    MS    CENTRAL    TOTAL
Number of Participants
AGE at INTAKE (convert to %s)
  17 and Younger
  18 – 21
  22 – 34
  35 – 49
  50 – 64
  65 and Older
PRIMARY DISABILITY (%s)
  Neurological
  Developmental/Cognitive
  Physical
  Chronic Disease/Illness
  Psychiatric
  Sensory
  Other

Record Review Example: Descriptive (percentages by agency)

                             CD     REF    MHA    MS     CENTRAL   TOTAL
Number of Participants
AGE at INTAKE
  17 and Younger             3%     4%     0      0      10%       7%
  18 – 21                    0      13%    0      0      47%       20%
  22 – 34                    13%    29%    19%    7%     18%       17%
  35 – 49                    39%    27%    34%    40%    28%       30%
  50 – 64                    36%    22%    38%    47%    19%       23%
  65 and Older               10%    4%     9%     7%     0         4%
PRIMARY DISABILITY
  Neurological               22%    60%    3%     98%    0         27%
  Developmental/Cognitive    19%    31%    0      0      78%       43%
  Physical                   6%     0      0      0      –         2%
  Chronic Disease/Illness    3%     0      0      0      –         1%
  Psychiatric                19%    4%     97%    0      11%       19%
  Sensory                    9%     2%     0      0      –         1%
  Other                      22%    2%     0      –      7%        6%

Record Review Example: Evaluative

Sources of Record Review Data

Available Administrative Data:
• Intake Forms
• Attendance Rosters
• Program Logs (e.g., daily activity descriptions)
• Evaluation Forms (e.g., customer satisfaction surveys, session assessments)
• Case Files or Case Management Data (these may include both internal data, such as progress toward internally established goals, and external data, such as reports about a participant's living arrangements, employment, or childbearing status)
• Exit or Follow-up Data
• Assessments (these may also include both internal data, such as culminating knowledge measurements at the end of a cycle, and external data, such as test scores, report card grades, scale scores on a behavioral scale, or medical or substance use test results)

Other Extant Data:
• Census Data -- available on the internet, in libraries, or on demand from marketing firms
• Vital Statistics -- also available on the internet, in libraries, and from local health departments
• Topical Outcome Data -- e.g., crime statistics, birth outcomes, juvenile arrest data
• KIDS COUNT child well-being indicators
• National survey data -- e.g., NELS, NLS, YRBS
• Community Profile Data
• UI (unemployment insurance) data

Record Review Activity: Identify data elements from extant data

What happens after data are collected?
1. Data are analyzed and results are summarized.
2. Findings must be converted into a format that can be shared with others.
3. Action steps should be developed from the findings: "Now that we know _____, we will _____."

Increasing Rigor in Program Evaluation
• Mixed methodologies
• Multiple perspectives/sources of data
• Multiple points in time

Validity and Reliability: Reliable, not Valid | Valid, not Reliable | Neither Valid nor Reliable | Valid and Reliable