Evaluation Perspectives: Logic Models, Impacts and Budgeting
2011 Systems Science Grantsmanship Workshop
USDA National Institute of Food and Agriculture
August 9, 2011
Dr. Gary J. Skolits
Tiffany L. Smith
Institute for Assessment and Evaluation, University of Tennessee
Evaluation Presentation Topics
1. Logic model basics
2. Documenting project impacts
3. Evaluation do's and don'ts
4. Evaluation budgeting perspectives
1. Logic Model Basics
Logic Model Basics
A logic model is a visual depiction of how a project intervention is expected to produce a desired outcome.
General Logic of a Project
Social Need (Problem) → Project Intervention (Action) → Results (Change)
Logic Model Drivers
Needs → Project Interventions → Results
(Needs, Purpose) → (Inputs, Outputs) → (Outcomes)
Logic Model Elements
A. Needs
B. Purpose
C. Inputs
D. Outputs
E. Outcomes
A. Needs
What problems deserve attention?
o Always competing needs
o Select a critical need to address
o Define the need
B. Purpose
What do you seek to accomplish toward addressing this problem?
o Select aspects of the problem to address
o Define the specific purpose of your project
C. Inputs
What resources will you invest?
o Money
o People
o Materials
o Equipment
o Partners
D. Outputs
What will you do?
o Services
o Training
o Products
Whom will you reach?
o Clients (must define this group)
o Typically multiple stakeholders
E. Outcomes
What results will you achieve?
o Short-term (initial impact on participants)
o Medium-term (ongoing impact on participants)
o Long-term (impact on the problem)
2. Documenting Project Impacts
Outcomes – Short-Term
o Feedback
o Awareness
o Commitment
Outcomes – Medium-Term
o Knowledge/Skills
o Disposition
o Behavior
Outcomes – Long-Term
o Achievement of project purpose
o Favorable change in the initial need of concern
Impact: A Theory of Change
Short-term: If participant reaction is positive, then you theorize that participants are likely to learn the desired knowledge, skills, and attitudes.
Medium-term: If participant learning/growth occurs (i.e., knowledge, skills, and dispositions), then you theorize that participants will change their behavior.
Long-term: If behavior changes, then you theorize that there will be a positive change in terms of the initial need.
Logic Model Template
(Source: University of Wisconsin-Madison Extension)
PLANNING: start with the end in mind – What do you want to know?
EVALUATION: check and verify – How will you know it?
3. Evaluation Do's and Don'ts
A Few Evaluation Do's
o Talk with colleagues sponsoring similar projects for leads on good evaluators
o Engage an evaluator with a good reputation and experience with your type of evaluation need
o Engage an evaluator early in project planning
o Understand the roles of formative and summative project evaluation
o Recognize the importance of efficient, reliable data collection
o Build evaluation data collection into project operations (when possible)
A Few Evaluation Don'ts
o Don't be intimidated by evaluation – it is meant to enhance your project
o Don't fail to communicate with your evaluator on a regularly scheduled basis
o Don't forget to address IRB concerns – protect yourself and your clients/stakeholders
o Don't miss the opportunity to use project and evaluation data to add to the literature in your field
4. Evaluation Budgeting Perspectives
Budgeting for Evaluation
Some factors to consider:
o Evaluation costs depend on the required effort of the evaluator
o A general rule: 5% to 8% of the project budget
o Be prepared to negotiate with your evaluator to minimize evaluation costs
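As a quick illustration of the 5%–8% rule of thumb (the $400,000 total project budget here is a hypothetical figure, not one from the workshop):

```latex
0.05 \times \$400{,}000 = \$20{,}000
\qquad
0.08 \times \$400{,}000 = \$32{,}000
```

Under this rule, an evaluation line item of roughly $20,000 to $32,000 would fall within the expected range for such a project.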
Budgeting for Evaluation
o The more data project staff collect, the lower the evaluation costs. (However, some data should be collected by the evaluator.)
o Link any proposed evaluation budget to a specific evaluation plan and tasks.
o Stay on top of the deliverables promised by your evaluator.
Two Key Resources
o University of Wisconsin-Madison (Extension): odel.html
o Kellogg Foundation Logic Model Guide: center/resources/2006/02/WK-Kellogg-Foundation-Logic-Model-Development-Guide.aspx
Gary Skolits
Institute for Assessment and Evaluation
University of Tennessee, Knoxville
1122 Volunteer Boulevard; A503 Bailey Education Complex
Knoxville, TN