Evaluation Perspectives: Logic Models, Impacts and Budgeting. 2011 Systems Science Grantsmanship Workshop, USDA National Institute of Food and Agriculture.

Presentation transcript:

Evaluation Perspectives: Logic Models, Impacts and Budgeting
2011 Systems Science Grantsmanship Workshop
USDA National Institute of Food and Agriculture
August 9, 2011
Dr. Gary J. Skolits and Tiffany L. Smith, Institute for Assessment and Evaluation

Evaluation Presentation Topics
1. Logic model basics
2. Documenting project impacts
3. Evaluation do’s and don’ts
4. Evaluation budgeting perspectives

1. Logic Model Basics

Logic Model Basics
A logic model is a visual depiction of how a project intervention is expected to produce a desired outcome.

General Logic of a Project
Social Need (Problem) → Project Intervention (Action) → Results (Change)

Logic Model Drivers
Needs → Project Interventions → Results
Needs → Purpose → Inputs → Outputs → Outcomes

Logic Model Elements
A. Needs
B. Purpose
C. Inputs
D. Outputs
E. Outcomes

A. Needs
What problems deserve attention?
- There are always competing needs
- Select a critical need to address
- Define the need

B. Purpose
What do you seek to accomplish in addressing this problem?
- Select the aspects of the problem to address
- Define the specific purpose of your project

C. Inputs
What resources will you invest?
- Money
- People
- Materials
- Equipment
- Partners

D. Outputs
What will you do?
- Services
- Training
- Products
Whom will you reach?
- Clients (you must define this group)
- Typically multiple stakeholders

E. Outcomes
What results will you achieve?
- Short-term (initial impact on participants)
- Medium-term (ongoing impact on participants)
- Long-term (impact on the problem)
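
To see how the five elements fit together, here is a minimal sketch of a logic model captured as a data structure. This is an illustration only, not part of the workshop material: the class shape, field names, and every example value are hypothetical assumptions (Python).

```python
from dataclasses import dataclass


@dataclass
class LogicModel:
    """One logic model: need -> purpose -> inputs -> outputs -> outcomes."""
    need: str               # A. the problem the project addresses
    purpose: str            # B. what the project seeks to accomplish
    inputs: list[str]       # C. resources invested
    outputs: list[str]      # D. what you do and whom you reach
    short_term: list[str]   # E. initial impact on participants
    medium_term: list[str]  # E. ongoing impact on participants
    long_term: list[str]    # E. impact on the problem


# Hypothetical example for illustration only.
model = LogicModel(
    need="High rates of foodborne illness among county residents",
    purpose="Improve safe food-handling practices of home cooks",
    inputs=["grant funds", "extension educators", "training materials"],
    outputs=["12 hands-on workshops", "300 residents trained"],
    short_term=["participants aware of contamination risks"],
    medium_term=["participants adopt safe handling behaviors"],
    long_term=["reduced incidence of foodborne illness"],
)
```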

2. Documenting Project Impacts

Outcomes – Short-Term
- Feedback
- Awareness
- Commitment

Outcomes – Medium-Term
- Knowledge/Skills
- Disposition
- Behavior

Outcomes – Long-Term
- Achievement of the project purpose
- Favorable change in the initial need of concern

Impact: A Theory of Change
- Short-term: If participant reaction is positive, then you theorize that participants are likely to learn the desired knowledge, skills, and attitudes.
- Medium-term: If participant learning/growth occurs (i.e., knowledge, skills, and dispositions), then you theorize that participants will change their behavior.
- Long-term: If behavior changes, then you theorize that there will be a positive change in terms of the initial need.
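
The same reasoning can be made explicit by writing the theory of change as ordered if-then links. The sketch below (continuing the illustrative Python examples; the wording of each link paraphrases the slide above) simply prints the theorized chain:

```python
# The theorized causal chain behind short-, medium-, and long-term outcomes.
THEORY_OF_CHANGE = [
    ("participant reaction is positive",
     "participants learn the desired knowledge, skills, and attitudes"),
    ("participants learn the desired knowledge, skills, and attitudes",
     "participants change their behavior"),
    ("participants change their behavior",
     "the initial need of concern improves"),
]

for condition, consequence in THEORY_OF_CHANGE:
    print(f"If {condition}, then we theorize that {consequence}.")
```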

Logic Model Template: University of Wisconsin-Madison Extension

PLANNING: start with the end in mind (what do you want to know?)
EVALUATION: check and verify (how will you know it?)

3. Evaluation Do’s and Don’ts

A Few Evaluation Do’s
- Talk with colleagues sponsoring similar projects for leads on good evaluators
- Engage an evaluator with a good reputation and experience with your type of evaluation need
- Engage an evaluator early in project planning
- Understand the role of formative and summative project evaluation
- Recognize the importance of efficient, reliable data collection
- Build evaluation data collection into project operations (when possible)

A Few Evaluation Don’ts
- Don’t be intimidated by evaluation – it is meant to enhance your project
- Don’t fail to communicate with your evaluator on a regularly scheduled basis
- Don’t forget to address IRB concerns – protect yourself and your clients/stakeholders
- Don’t miss the opportunity to use project and evaluation data to add to the literature in your field

4. Evaluation Budgeting Perspectives

Budgeting for Evaluation
Some factors to consider:
- Evaluation costs depend on the required effort of the evaluator
- A general rule: 5% to 8% of the project budget
- Be prepared to negotiate with your evaluator to minimize evaluation costs
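
As a quick worked example of the 5% to 8% rule of thumb (the project figure below is hypothetical, and the rule itself is the general guideline stated above, not a requirement):

```python
def evaluation_budget_range(project_budget: float) -> tuple[float, float]:
    """Apply the 5%-8% rule of thumb to a total project budget."""
    return 0.05 * project_budget, 0.08 * project_budget


# A hypothetical $400,000 project: roughly $20,000 to $32,000 for evaluation.
print(evaluation_budget_range(400_000))  # (20000.0, 32000.0)
```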

Budgeting for Evaluation (continued)
- The more data project staff collect, the lower the evaluation costs. (However, some data should be collected by the evaluator.)
- Link any proposed evaluation budget to a specific evaluation plan and tasks.
- Stay on top of the deliverables promised by your evaluator.

Two Key Resources
- University of Wisconsin-Madison (Extension) logic model site: odel.html
- Kellogg Foundation Logic Model Guide: center/resources/2006/02/WK-Kellogg-Foundation-Logic-Model-Development-Guide.aspx

Gary Skolits
Institute for Assessment and Evaluation
University of Tennessee, Knoxville
1122 Volunteer Boulevard; A503 Bailey Education Complex
Knoxville, TN