Slide 1: PROGRAM IMPACT & OUTCOME EVALUATION: Is The Pain Worth The Gain?
Gary A. Hachfeld, Extension Educator, Ag Business Management, University of Minnesota Extension
RME Conference, March 2009, Reno, NV
Copyright 2009 University of Minnesota. All Rights Reserved.

Slide 2: Today's Outline:
– Definition of Evaluation
– Perspectives on Evaluation
– Evaluative Philosophy & Tips
– Evaluative Approaches
– Evaluative Framework
– Evaluation & Program Example
– Uses of Evaluative Data & Summary

Slide 3: Program Evaluation:
– Definition of evaluation*: the systematic process of determining the worth of a program.
– Continuous process*: evaluation is an essential part of all stages of educational programming: planning, design, implementation, and evaluation.
*Seevers, Barbara et al. (1997) Education Through Cooperative Extension. Delmar Pub.

Slide 4: Perspectives on Evaluation:
– Evaluation is a burden.
– Evaluation is boring and a huge waste of time.
– Evaluation requires too many resources to accomplish.
OR
– Evaluation is an integral part of program development and implementation. It is NOT an inconvenience or "add-on".
– Evaluative data can be a valuable tool when reported to funders, organizational administration, the public, etc.

Slide 5: Evaluative Philosophy: The Principle of Leadership and Mission.* Begin With the End in Mind!
*Covey, Stephen R. (2004) The 7 Habits of Highly Effective People. Free Press.

Slide 6: Evaluation Tips:
– Involve your program team.*
– Ask yourself the following questions*:
  – Do I have a program (a logical, sequential set of events or activities that accomplishes an agreed-upon set of goals or objectives)?
  – Is it worthy of evaluation?
*Krueger, Richard A. (2000) Think Like an Evaluator.

Slide 7: Evaluation Tips: Think about WHAT your program goals and purposes are ("proposed results").
– WHAT are you trying to accomplish with your program?
– Design your evaluation process to determine whether you have met WHAT it is you are trying to accomplish with the program.

Slide 8: Evaluation Tips: Think about WHO you are doing the evaluation for.*
– Grant and other financial funders
– Program participants
– Organizational administration
– Your program team
– General public
*Krueger, Richard A. (2000) Think Like an Evaluator.

Slide 9: Evaluation Tips: Think about WHY you are evaluating the program.*
– Demonstrate accountability to funders
– Identify consequences of the program
– Leverage funding for other programs
– Build organizational support
– Provide feedback to people: public, clients, sponsors
– Improve future programs
*Krueger, Richard A. (2000) Think Like an Evaluator.

Slide 10: Approaches to Evaluation: Formative Evaluation:
– Determine what works with a particular situation or audience.
– Determine how we are doing as educators.
– Increase the impact and effectiveness of the program.
– Gain insight into the evolution of the program.

Slide 11: Approaches to Evaluation: Summative Evaluation:
– Assess program impact and outcome.
– Supply evidence for reporting program impact and outcome.
– Support funder accountability, public awareness, etc.
– Build organizational support.
– Inform personal performance measures.

Slide 12: Evaluative Data:
– Quantitative evaluative data: numbers, statistics, measurements, etc.
– Qualitative evaluative data: comments, stories, quotes, interviews, descriptions, perceptions, case studies, journals, etc.

Slide 13: Evaluation & Program: Evaluative clarity:
– Start by thinking about what you want to accomplish programmatically.
– Think about the who & why of the evaluation process.
Evaluative clarity will result in a quality program curriculum and will enable you to report substantive impacts and outcomes.

Slide 14: Evaluation & Program: Many evaluation/program development models:
– Linear logic models: Newtonian Causality Logic Model; Nascent Feedback-Based Systems Logic Model
– Interdependent system relationship models: Web of Interconnections
– Complex Linear Dynamics
– Theory of Action Model

Slide 15: Evaluative Framework:*
1. Inputs
2. Activities
3. People Involvement
4. Reactions
5. KASA Change
6. Practice Change
7. End Results
*Bennett, Claude F. (1977) Hierarchy of Evidence for Program Evaluation.

Slide 16: Evaluative Framework:*
1. Inputs: staff/volunteer time, costs, resources.
2. Activities: program promotion, presentation, etc.
3. People Involvement: participants (number, gender, age, etc.).
4. Reactions: degree of interest, like or dislike for the program.
*Bennett, Claude F. (1977) Hierarchy of Evidence for Program Evaluation.

Slide 17: Evaluative Framework:* Program Impact/Outcome Levels:
5. KASA Change: Knowledge, Attitudes, Skills, Aspirations.
6. Practice Change: adoption & application of knowledge, attitudes, skills, and aspirations to work, lifestyle, etc.
7. End Results: social, economic, environmental, & individual consequences.
(Levels 5-6: impact; Level 7: impact & outcome.)
*Bennett, Claude F. (1977) Hierarchy of Evidence for Program Evaluation.

Slide 18: Evaluative Framework:* The same seven-level hierarchy frames both Program Development and Program Evaluation:
1. Inputs
2. Activities
3. People Involvement
4. Reactions
5. KASA Change
6. Practice Change
7. End Results
*Bennett, Claude F. (1977) Hierarchy of Evidence for Program Evaluation.
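The (B1)–(B7) codes on the evaluation slides that follow refer to these Bennett levels. As a minimal illustrative sketch (not part of the deck), one way to encode the hierarchy so each survey item can be tagged with its level; the enum name, item strings, and comments are assumptions for illustration only:

```python
# Sketch: Bennett's seven-level hierarchy as a data structure, so
# evaluation items can carry the "B1"-"B7" codes used on later slides.
from enum import IntEnum

class BennettLevel(IntEnum):
    INPUTS = 1              # staff/volunteer time, costs, resources
    ACTIVITIES = 2          # program promotion, presentations
    PEOPLE_INVOLVEMENT = 3  # participant counts and demographics
    REACTIONS = 4           # interest, like/dislike of the program
    KASA_CHANGE = 5         # knowledge, attitudes, skills, aspirations
    PRACTICE_CHANGE = 6     # adoption and application of KASA
    END_RESULTS = 7         # social, economic, environmental consequences

# Hypothetical survey items tagged by level, mirroring the (B#) codes.
survey_items = {
    "Increased knowledge (Likert scale)": BennettLevel.KASA_CHANGE,
    "Progress on transfer plan at 6 months": BennettLevel.PRACTICE_CHANGE,
}

for item, level in survey_items.items():
    print(f"B{int(level)}: {item}")
```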

Slide 19: Evaluation & Program Example: Farm Transition & Estate Planning

Slide 20: Evaluation Process:
End-of-meeting evaluation:
– Increased knowledge, 5-point Likert scale (B5)
– Status of transfer & personal estate plans (B3)
– Comments about the program (B4)
Post-meeting evaluation (6 months later):
– Progress on plans & barriers encountered (B6)
– Usefulness of workbook (B4)
– Quality of program (B4)
– Topics for a second meeting (B5)

Slide 21: End-of-Meeting Evaluation:
19 workshops (B1):
– October 2007 to April 2008
– 23 local sponsors
587 farm family members (B3):
– 247 farm business units
– Participants from 151 different communities
– Ages 22 to 89 (68.5% over age 55)
– 45.5% female; 54.5% male
301 surveys completed (51.3% response rate)

Slide 22: End-of-Meeting Evaluation: (B3)
                                                             Yes   No
Up-to-date estate plan?                                      18%   82%
Up-to-date farm transfer plan?                               11%   89%
If no, plan to start farm transfer process due to workshop?  98%    2%

Slide 23: End-of-Meeting Evaluation: (B5)
"Strongly agree or agree" that knowledge increased:
– Need for clear goals and communication: 100%
– Transfer strategies: 95%
– Importance of assessing financial strength: 99%
– Tax issues: 91%
– Estate planning strategies and issues: 98%
– How to write and utilize a transition plan: 90%
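The percentages above are top-two-box rates ("strongly agree or agree") from a 5-point Likert scale. A minimal sketch of how such rates can be tallied, assuming responses are coded 1 (strongly disagree) through 5 (strongly agree); the topic keys and sample data are hypothetical, not the actual survey file:

```python
# Sketch: top-two-box ("agree" or "strongly agree") rates per topic
# from 5-point Likert responses, coded 1-5.
from collections import defaultdict

# Hypothetical raw responses: (topic, rating) pairs.
responses = [
    ("tax_issues", 5), ("tax_issues", 4), ("tax_issues", 2),
    ("transfer_strategies", 5), ("transfer_strategies", 4),
]

def top_two_box(responses):
    """Return the share of ratings >= 4 for each topic."""
    counts = defaultdict(lambda: [0, 0])  # topic -> [agree_count, total]
    for topic, rating in responses:
        counts[topic][0] += rating >= 4   # bool adds as 0 or 1
        counts[topic][1] += 1
    return {t: agree / total for t, (agree, total) in counts.items()}

for topic, share in top_two_box(responses).items():
    print(f"{topic}: {share:.1%}")
```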

Slide 24: Post-Meeting Evaluation: Process:
– Mail-out survey with return envelope
– Conducted six months after the final program
– 247 farm units surveyed
– 107 farm units returned surveys (43.3% response rate)

Slide 25: Post-Meeting Evaluation:
Farm transfer plan progress (B6):
– 72.9% reported starting the process
– 15.7% had completed the process
Estate plan progress (B6):
– 79.2% reported starting the process
– 17.1% had completed the process

Slide 26: Outcome of Program:
Average balance sheet asset values for a Minnesota farm family (FINBIN 2007):
– Farm business assets: $1,373,612 per farm family
– Personal assets: $197,068 per farm family
Follow-up evaluative results:
– 15.7% of the 107 responding farms reported implementing a transfer plan
– 17.1% of the 107 responding farms reported implementing an estate plan

Slide 27: Outcome of Program: Actual Program Outcome (B7):
– Farm transfer plans implemented (15.7% of 107 farms): $22.0 million in farm business assets
– Estate plans implemented (17.1% of 107 farms): $3.5 million in personal assets
Total Actual Program Outcome: $25.5 million
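The deck does not show the rounding behind these figures, but assuming each percentage is first converted to a whole number of farms (16 and 18) before multiplying by the FINBIN average asset values, the reported $22.0 million, $3.5 million, and $25.5 million reproduce exactly. A sketch of that arithmetic:

```python
# Sketch of the outcome arithmetic behind the $25.5 million figure.
# Assumption (not stated in the deck): percentages are converted to a
# whole number of farms before multiplying by average asset values.
RESPONDING_FARMS = 107
FARM_ASSETS = 1_373_612    # avg farm business assets per family (FINBIN 2007)
PERSONAL_ASSETS = 197_068  # avg personal assets per family (FINBIN 2007)

transfer_farms = int(RESPONDING_FARMS * 0.157)  # 16 farms with transfer plans
estate_farms = int(RESPONDING_FARMS * 0.171)    # 18 farms with estate plans

transfer_value = transfer_farms * FARM_ASSETS   # ~$22.0 million
estate_value = estate_farms * PERSONAL_ASSETS   # ~$3.5 million

print(f"Transfer plan outcome: ${transfer_value / 1e6:.1f} million")
print(f"Estate plan outcome:   ${estate_value / 1e6:.1f} million")
print(f"Total program outcome: ${(transfer_value + estate_value) / 1e6:.1f} million")
```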

Slide 28: Outcome Evaluation: Think of it as the "so what" of your program efforts.
Change in Action or Participation* → Knowledge & Understanding Change → Practice Change → Results
(outcomes → impacts)
*Bennett, Claude F. (1977) Hierarchy of Evidence for Program Evaluation.

Slide 29: Outcome Evaluation:
Change in Action or Participation → Knowledge & Understanding Change → Practice Change → Results (outcomes → impacts)
Attend Workshop → Increased Understanding → Develop & Implement Plans → $25.5 million

Slide 30: Program Impact & Outcome Evaluation: Let's Practice!

Slide 31: Program Evaluation: Share evaluative findings with:
– Grant funders
– Decision makers
– Agency administration
– Local program sponsors
– Journals & publications
– Media

Slide 32: RESULTS: Farm Transition & Estate Planning:
Funding:
– Risk Management grants: $52,000
– Minnesota Corn Growers Assn.: $3,600
– Sponsorship fees: $26,400
Recognition:
– Extension Dean & Director Team Award
– NACAA Program Team Award

Slide 33: RESULTS: Winning the Game:
Funders:
– MN Soybean Research Promotion: $195,000
– Sponsorship fees: $78,000
Recognition:
– Extension Dean & Director Team Award
– AAEA Outstanding Program Award
– Featured in "Source" magazine

Slide 34: Summary:
– Think about evaluation and develop your own philosophy.
– Start with the end in mind: what do you want to accomplish?
– Think about the who & why of the evaluative process as it relates to your program.
– Involve the program team.
– Determine what evaluative approach and data type you are going to utilize.
– Ask yourself "so what?"

Slide 35: Hints:
Collect evaluative data/information using multiple methods & levels for the same program effort:
– Pre-test & post-test, post-test perceptions, surveys and questionnaires, face-to-face interviews, phone interviews, TurningPoint audience-response technology, SurveyMonkey, etc.
– This lets you report findings effectively at multiple levels.
Be creative; use results in multiple ways:
– Formal written reports (short & long), verbal reports, press releases, bullet items in marketing brochures, target-audience descriptions in grants, grant results, etc.

Slide 36: Hints: Avoid the "Myths of Reporting":*
– One written report is enough: we do not all learn the same way.
– People read written reports: know your audience; "leak" information to influentials.
– Complexity impresses the audience: know your audience.
– Build to the most important point and save it for last: not for evaluative data (time constraints, interruptions, etc.).
*Krueger, Richard A. (2000) Think Like an Evaluator.

Slide 37: Hints: Avoid the "Myths of Reporting":*
– A 15-minute report means a 15-minute presentation: audiences want time for questions, clarification, etc.
– The audience knows why they are getting the report: indicate why the report was given and any action that may be recommended.
– Everything should be reported: be brief and report the most critical information.
*Krueger, Richard A. (2000) Think Like an Evaluator.

Slide 38: PROGRAM IMPACT & OUTCOME EVALUATION: Is The Pain Worth The Gain? You have to decide!

Slide 39: Contact Information: Gary A. Hachfeld, Extension Educator, Ag Business Management. Phone: 507-389-6722. Email: hachf002@umn.edu

Slide 40: Questions? Comments?

