Evaluating Your Volunteer Program
Catherine Kost and Heather McGreevy

Objectives
1. Increase knowledge of the steps to evaluating a volunteer program.
2. Increase ability to plan how to evaluate a volunteer program.
3. Develop next steps for improving the current evaluation of your volunteer program.

Evaluation: What's the Point?
 Lends credibility to your agency
 Helps with fundraising from individuals
 Useful for grant-writing purposes
 Helps retain volunteers
 Helps you better serve your mission

Four Steps to Evaluating a Volunteer Program
Step 1: Decide What to Track
Step 2: Collect Data
Step 3: Turn Data into Findings
Step 4: Use Findings

The Importance of Evaluation
Imagine you are managing a Neighborhood Watch volunteer program in which residents volunteer to patrol the neighborhood. Fast forward five years. You are in front of 1,500 people, including your boss, your boss's boss, and some of the most influential people in the country. You are reporting on your Neighborhood Watch volunteer program. Looking back over the past five years, what do you want to be able to say?

The Evaluation Grid - Inputs
 The resources used to meet your volunteer program goals, such as time, money, and materials. Inputs also include service providers, the program setting, community factors, collaborations, service technology, funding sources, and participants.

The Evaluation Grid - Objectives
 Objectives define what you are trying to achieve. They can be set for three areas: community/beneficiaries, organization, and volunteers.

The Evaluation Grid - Activities
 Process information: descriptions of what happens day to day to carry out the program. Activities are the methods used to accomplish program goals (e.g., classes, workshops, counseling sessions, group outings, assessments).

The Evaluation Grid - Outputs
 The results of your efforts: the units produced by a program. Units include dose (e.g., number of classes), duration, and number of participants, as tallied in the sketch below.
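To make the output units concrete, here is a minimal sketch of how dose, duration, and participant counts might be tallied from a simple attendance log. All data, field names, and the log structure are hypothetical:

```python
# Hypothetical attendance log: one record per participant per session.
log = [
    {"session": "Patrol training 1", "hours": 2.0, "participant": "Ana"},
    {"session": "Patrol training 1", "hours": 2.0, "participant": "Ben"},
    {"session": "Patrol training 2", "hours": 1.5, "participant": "Ana"},
]

sessions = {rec["session"]: rec["hours"] for rec in log}
dose = len(sessions)                                      # sessions delivered
duration = sum(sessions.values())                         # total program hours
participants = len({rec["participant"] for rec in log})   # unique people reached

print(f"Dose: {dose} sessions, Duration: {duration} hours, "
      f"Participants: {participants}")
```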

The Evaluation Grid - Outcomes
 Short-term, meaningful changes that may or may not hold up over time
 Immediate steps of progress toward a goal

The Evaluation Grid - Impacts
 Show what has changed for the better because of your program
 Meaningful long-term changes

Step 1: Decide What to Track
 What your program aims to achieve for the community
 What your program aims to achieve for your organization
 What your program aims to achieve for your volunteers

Activity: Program Evaluation
 Using the Volunteer Program Evaluation Plan worksheet, complete Step 1 for your program
 Select an output, outcome, or impact that you would like to measure but currently don't

Step 2: Collect Data
2.1 Select Indicators
2.2 Select Methods
2.3 Obtain Data

Step 2.1: Select an Indicator
An indicator is a specific item that will represent the level or degree to which a process measure, output, outcome, or impact occurred. For example, the number of volunteers who return for a second year could serve as an indicator of volunteer satisfaction.

Step 2.2: Select Methods
 Existing Records
 Focus Groups
 Interviews
 Observations
 Portfolio/Journal Assessment
 Tests
 Written Surveys

Existing Records
 Existing information collected by other agencies and institutions
 Provide both descriptive and evaluative information
 Used to track changes in quantifiable behaviors

Interviews
A series of questions, semi-structured or unstructured, conducted in person or over the phone.
 Use when you want in-depth information or when investigating a sensitive topic

Observations
First-hand observation of interactions and events. Establish a pre-determined protocol or observation guide to focus the information you gather.
 Use when self-reports or existing data are not accurate
 Use when professional judgment is helpful

Surveys
Instruments that contain questions about the issues to be evaluated.
Types of questions (represented in the sketch below):
 Single, direct questions (closed-ended)
 Series of questions about the same topic (scale)
 Open-ended questions
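As an illustration only, the three question types might be captured in a simple survey definition like this; the structure, wording, and options are all invented for the example:

```python
# A hypothetical survey definition mixing the three question types.
survey = [
    {"type": "closed",   # single, direct question
     "text": "Did you volunteer this month?",
     "options": ["Yes", "No"]},
    {"type": "scale",    # one of a series of questions on the same topic
     "text": "I feel prepared for my volunteer role.",
     "options": ["Strongly disagree", "Disagree", "Agree", "Strongly agree"]},
    {"type": "open",     # open-ended question, free-text answer
     "text": "What would improve your volunteer experience?"},
]

for q in survey:
    print(f"[{q['type']:>6}] {q['text']}")
```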

Surveys (continued)
How to conduct the survey
 By mail
 In person
 Over the phone or internet
 In a centralized activity as part of an event
Types of surveys
 Standardized
 Self-developed

Sampling Strategy & Sample Size
 If your population is under 100, survey the total population
 If over 100, survey a sample that is large and representative enough for the results to reflect the entire population
 Probability sampling: assign a number to each client and draw numbers at random, or select people at equal intervals, e.g., every third person (see the sketch below)
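Both sampling approaches described above are easy to script. Here is a minimal sketch, using a hypothetical client roster and an illustrative sample size, of a simple random draw and an every-third-person systematic sample:

```python
import random

# Hypothetical client roster; in practice this comes from your own records.
clients = [f"Client {i}" for i in range(1, 251)]

if len(clients) <= 100:
    sample = clients                       # under 100: survey everyone
else:
    # Probability sampling, option 1: a simple random draw.
    sample = random.sample(clients, k=75)  # sample size of 75 is illustrative

# Probability sampling, option 2: select people at equal intervals,
# e.g., every third person on the roster.
systematic_sample = clients[::3]

print(f"Random draw: {len(sample)} people; "
      f"every third person: {len(systematic_sample)} people")
```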

Step 3: Turn Data into Findings
 Collect Data
 Aggregate Data
 Analyze Data (see the sketch below)
   Quantitative
   Qualitative
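A minimal sketch of the aggregate-and-analyze step, using hypothetical survey responses: quantitative answers are summarized numerically, while open-ended answers are simply gathered so they can later be read and coded into themes.

```python
from collections import Counter
from statistics import mean

# Hypothetical responses: a 1-5 satisfaction rating plus an open-ended comment.
responses = [
    {"rating": 5, "comment": "Loved the training sessions."},
    {"rating": 4, "comment": "More weekend shifts, please."},
    {"rating": 2, "comment": "Scheduling was confusing."},
]

# Quantitative: aggregate the ratings, then analyze.
ratings = [r["rating"] for r in responses]
print(f"Average rating: {mean(ratings):.1f}")
print(f"Rating distribution: {Counter(ratings)}")

# Qualitative: collect the comments for manual review and theme coding.
for comment in (r["comment"] for r in responses):
    print("-", comment)
```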

Activity: Data Collection
 Using the Volunteer Program Evaluation Plan worksheet, complete Steps 2 and 3 for your program
 Write down your data collection method or methods and who will aggregate and analyze your data

Step 4: Use Findings
Report Findings
 Select the best format for audience & message
 Provide basic information on how data were collected
 Express only one idea per graph, if graphs are used (see the sketch below)
 Use both qualitative and quantitative data
 Do not over-interpret results
Apply Findings
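To illustrate the one-idea-per-graph guideline, here is a minimal matplotlib sketch with hypothetical numbers: a single bar chart whose title states its one finding and which shows nothing else.

```python
import matplotlib.pyplot as plt

# Hypothetical finding: total volunteer hours per year.
years = ["2021", "2022", "2023"]
hours = [1200, 1850, 2400]

fig, ax = plt.subplots()
ax.bar(years, hours)
ax.set_title("Volunteer hours grew each year")  # the graph's single idea
ax.set_ylabel("Volunteer hours")
plt.savefig("volunteer_hours.png")
```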

Thank You!
Thank you for coming. Please complete an evaluation of all of the workshops.