Evaluation for Grant Writers (and others) Jennifer Sweeney, MSLS, PhD

AGENDA
– What is evaluation?
– Logic models
– Developing evaluation questions
– Collecting data
– Analyzing data
– Writing the report

What is evaluation?
Measure something's worth or merit
Conducted for a purpose...
– provide accountability
– increase effectiveness
– build capacity
– reinforce program activities
– solve problems
– increase engagement and ownership

Ways of focusing evaluations
– Accountability
– Appreciative inquiry
– Capacity-building
– Cost-effectiveness
– Effectiveness
– Goals-based
– Intervention-oriented
– Needs assessment

Two (of many) types of evaluation
Formative
– occurs during project development, implementation, progress
– assess ongoing activities
– monitor and improve project
– aka progress reporting
Summative
– occurs after the fact
– assess success in reaching goals
– aka outcome or impact evaluation

Evaluation in program planning
What gets measured gets...
– done?
– or just measured?
Evaluation is good management:
– Setting goals
– Planning activities
– Measuring implementation
– Fixing problems

Program evaluation questions
– Was the project successful? What were its strengths and weaknesses?
– To what extent did the project or program meet the overall goals?
– Did the participants benefit from the project? In what ways?
– What components were the most effective?
– Were the results worth the cost?
– Is this project replicable?

Logic models
What is a logic model?
– Describes relationships among program elements
– Illustrates the project or program plan
Benefits
– document inputs, activities, outcomes
– clarify understanding
– facilitate planning

Elements of a logic model
– Resources (aka inputs): funding, personnel, facilities
– Activities: specific actions that make up the program
– Short-term outcomes: outputs
– Long-term outcomes: impact

Example of a logic model
– Resources: funding, personnel, facilities
– Activities: develop lesson, teach students, assess learning
– Short-term outputs: # students reached
– Long-term outcomes: growth in learning, change in behavior
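A logic model is essentially a structured mapping from resources through activities to outcomes, so it can be captured in any plain data structure. Below is a minimal, purely illustrative sketch in Python; the webinar prescribes no particular tool, and the keys simply mirror the elements above.

```python
# Minimal sketch of the example logic model as a plain data structure.
# Illustrative only; nothing in the webinar requires a specific format.
logic_model = {
    "resources": ["funding", "personnel", "facilities"],
    "activities": ["develop lesson", "teach students", "assess learning"],
    "short_term_outputs": ["# students reached"],
    "long_term_outcomes": ["growth in learning", "change in behavior"],
}

for element, items in logic_model.items():
    print(f"{element}: {', '.join(items)}")
```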

Constructing a logic model: example program
– After-school program for at-risk youth
– Overall goal: help teens learn how to make good choices, stay out of trouble, and succeed
– Public library partnership with local law enforcement, juvenile court, family services agencies, and high school
– Book discussion, reflection, creative writing, information literacy for various life skills; 1x/week for 10 weeks
➪ Library role: teach information literacy for life skills such as how to find information on employment, GED study, parenting, health information

Resources (aka “inputs”)
– Grant funds
– Personnel
– Facilities
– Materials & supplies
– In-kind contributions (time, expertise) from partners

Activities
– Train leaders
– Develop curriculum
– Conduct program

Outcomes: short-term “outputs”
– Number of teens served
– Number of hours attended
– Number of leaders trained
– Number of hours of preparation time

Outcomes: long-term
– Growth in awareness of good decisions
– Growth in problem-solving skills
– Growth in life skills
– Change in behavior, attitude
➪ Growth in awareness of information resources, information literacy skills

Which leads to our evaluation questions...
What do I want to know?
Use the logic model to chart the evaluation plan

Evaluation questions
Activities → Evaluation questions
– Leader training & preparation: Was training adequate? Hours of preparation?
– Program activities: Were users satisfied with each activity? Did activities meet leader expectations?
– Short-term outputs: Were output objectives met? (program attendance, level of participation)
– Long-term outcomes: Change in teen self-awareness/knowledge/skills? Satisfaction? Change in behavior?

Make it measurable
Evaluation question → Measure
– Was training adequate? → Hours of training; effectiveness of training
– Were program activities appropriate? (specific activities) → User/leader satisfaction
– Did short-term outputs meet expectations? → Program attendance; level of participation (hours, accomplishments)
– Were long-term outcomes met? → Change in teen self-awareness/knowledge; change in behavior; change in information literacy skill level
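One way to keep the question-to-measure mapping actionable is to store it in the same kind of structure as the logic model. A hedged sketch, with the pairs taken from the table above and no claim that this is the webinar's method:

```python
# Hypothetical structure pairing each evaluation question with its measures,
# mirroring the "Make it measurable" table; not a prescribed format.
question_to_measures = {
    "Was training adequate?": [
        "hours of training", "effectiveness of training"],
    "Were program activities appropriate?": [
        "user/leader satisfaction"],
    "Did short-term outputs meet expectations?": [
        "program attendance", "level of participation (hours, accomplishments)"],
    "Were long-term outcomes met?": [
        "change in teen self-awareness/knowledge",
        "change in behavior",
        "change in information literacy skill level"],
}

# Print a simple planning checklist.
for question, measures in question_to_measures.items():
    print(question)
    for measure in measures:
        print(f"  measure: {measure}")
```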

Collecting data
What do I want to know?
– evaluation questions
What information do I need?
– measures
How will I get this information?
– data collection techniques

Data collection techniques
– Program records: activities, attendance, participation
– Surveys: users, leaders
– Interviews: users, other stakeholders
– Observation

Data collection matrix
Measure → Data collection tool
– Hours of preparation time → Leaders' reports
– User/leader satisfaction → Satisfaction surveys
– Program attendance; level of participation (hours, accomplishments) → Attendance logs, participation records
– Change in teen self-awareness/knowledge/skill level → Pre- and post-assessments (surveys); follow-up interviews
– Change in behavior (stay in school, graduate, get/keep a job, stay off drugs, stay out of the justice system) → School/probation/social services records; follow-up interviews

Surveys
Advantages:
– Collect a lot of information from a lot of people
– Cover a wide range of topics
– Inexpensive
– Analyze with a variety of software
Disadvantages:
– Self-reports may be biased
– Data may lack depth
– May not provide much context

Interviews
Advantages:
– Explore complex issues
– Opportunity for follow-up
– Establish rapport
Disadvantages:
– Interviewer bias
– Training and analysis are time-consuming
– Smaller samples

Developing user survey questions
– Tailor your survey to the specific goals of your program
– Address questions from funders
– Address issues in the proposal
In general, your survey should cover:
– Did you accomplish what you expected to accomplish?
– Did your users get what they wanted?
– What could be improved?

Some example user survey questions
Pre- & post-assessment on information literacy skills
Tell us how much you agree with the following (scale: 1 = agree strongly / 5 = disagree strongly):
1. I know how to find information on things I am interested in
2. I know how to find information on how to get a job
3. I know how to find information on finishing high school
4. I know how to find information on applying for community college
5. I wish I knew more about: ________________________
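Items on a 1-5 agreement scale can be compared before and after the program with a simple difference of means per item. A minimal sketch with invented scores (the webinar does not specify analysis software; on this scale, a negative change means stronger agreement, i.e. improvement):

```python
# Invented pre/post scores on a 1 (agree strongly) to 5 (disagree strongly) scale.
pre = {
    "find info on getting a job": [5, 4, 4, 3],
    "find info on finishing high school": [4, 5, 3, 4],
}
post = {
    "find info on getting a job": [3, 2, 2, 2],
    "find info on finishing high school": [2, 3, 1, 2],
}

def mean(values):
    return sum(values) / len(values)

for item in pre:
    change = mean(post[item]) - mean(pre[item])
    print(f"{item}: pre={mean(pre[item]):.1f} "
          f"post={mean(post[item]):.1f} change={change:+.1f}")
```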

Data collection tips
– Have a plan for collecting data
– Take advantage of the “captive audience”
– Build time into the program schedule for evaluation
On question preparation...
– Keep it simple! And short!
– Look for existing questions
– Get input from partners
– Use familiar language
– Use numbers in scales rather than adjectives
– Pretest!

Making sense of your data
Explore your data:
– mean (average)
– range (high, low values)
– create categories for qualitative data ➪ how do you create categories?
Be systematic
Experiment with different graphical displays
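The exploratory statistics named above need nothing more than a spreadsheet; in code they are one-liners. A short sketch with invented data, including one simple (hypothetical) way to start forming categories from open-ended answers by counting keyword hits:

```python
# Invented attendance data, for illustration only.
hours_attended = [10, 8, 9, 4, 10, 7, 10, 2]
mean = sum(hours_attended) / len(hours_attended)
print(f"mean={mean:.1f}, range={min(hours_attended)}-{max(hours_attended)}")

# One simple way to create categories for qualitative data: count keyword hits.
# The keywords are hypothetical; in practice, themes emerge from reading responses.
responses = ["liked the writing part", "hard to find job info", "writing was fun"]
themes = {"writing": 0, "job": 0}
for response in responses:
    for keyword in themes:
        if keyword in response:
            themes[keyword] += 1
print(themes)  # {'writing': 2, 'job': 1}
```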

Writing the report
First things first:
– Who is your audience?
– What do they want to know?

Outline of a formal report
– Summary
– Background
– Evaluation study questions
– Study methods
– Findings
– Conclusion and/or recommendations
Optional:
– References
– Appendix: survey instruments, data tables

Background
– Problem addressed
– Stakeholders and their information needs
– Participants
– Project objectives
– Activities and their components
– Resources used to implement the project
– Expected measurable outcomes

Evaluation study questions
– Identify stakeholders' specific information needs
– Draw from the proposal, updated as necessary
– Lots of questions are possible; limit this section to those questions actually addressed
– Point out questions that could NOT be addressed, and why

Study methods
– Who participated
– Sampling strategy (if applicable): how representative?
– Measures used
– Type of data collected
– How data were collected
– Instruments used
– How data were analyzed

Findings
– Results of the analyses
– Use the evaluation study questions to organize findings
– Address each question, whether or not the findings were satisfactory
– Include a summary at the end to present major conclusions

Conclusions and recommendations
– Summary of findings from a broader perspective
– Relate findings to overall program goals
– Recommendations for further study or future projects
– Base recommendations on solid data, NOT anecdotal evidence, no matter how persuasive!

Other sections
– Abstract
– Executive summary
– Appendix: survey instruments, data tables

Writing the report
Ahead of time: Background, Study questions, Study methods
– Include descriptions from the original proposal, updated as needed
Last steps: Findings, Conclusions, Summary
➪ Get feedback from colleagues

That's it!
For more information... see the Resources handout
Questions?
Thank you for attending this webinar!
Contact: