Approaches to Evaluation (1.5)


1 Approaches to Evaluation (1.5)
- What are the goals of our program?
- What do we expect to happen to participants?
- What do we want participants to gain, learn, etc.?

2 Keep in Mind: Purpose of Evaluation/Research
- Resources and skills available
- Who has the information
- How data will be analyzed and used

3 Approaches (Models)
- Intuitive Judgment
- Professional Judgment
- Goal-Attainment
- Logic Models
- Goal-Free
- Process or Systems

4 Goal-Attainment: Goals and Objectives
- Related to "needs," "skills," and "competencies"
- Helps to establish a direction for your program
- Helps to define what resources are needed and how (and where) resources will be used
- Helps to define what programs, facilities, and services will be provided (links customer needs with organizational services)
- Specifically links desired outcomes (benefits) to programs and services
- Usually connected to the mission statement of the youth development organization

5 Goals and Objectives
Goals: broad, long-range statements that define the programs/services that are going to be provided.
Objectives: specific statements, about the attainable parts of the goal, that are measurable and have some dimension of time.

6 Objectives (Review)
Program objectives should have five characteristics:
- Specific: must be clear and concrete
- Measurable: there must be some way to determine whether or not the desired results have been achieved
- Achievable: must be attainable, realistic, and reality-based
- Relevant: must be useful; must have value, worth, or benefit to your organization and to the youth and families who participate, and should be directly linked to their needs and the organization's mission
- Time-limited: must specify a time frame for accomplishment

7 Individual or Group
Remember that goals and objectives can be written for an individual (i.e., how you want one particular person to perform/succeed) or a group (i.e., how you want a group of participants to perform/succeed).

Recall that our program goal was "to improve youth's public speaking skills." By putting all four parts of our objective together, we get the objective: "Youth will demonstrate proficiency in giving an oral presentation by standing in front of a peer group and discussing a selected topic, which has an introduction, main argument, supporting facts, and conclusion, in 4-6 minutes." For this goal, any number of objectives could have been developed; this was just one example.

8 A Logic Model Is…
- A HOT TOPIC in (program) evaluation
- A depiction of a program showing what the program will do and what it is to accomplish
- A series of "if-then" relationships that, if implemented as intended, lead to the desired outcomes
- The core of program planning and evaluation

9 LOGIC and MODEL
LOGIC: the principles of reasoning; the relationship of elements to each other and to a whole.
MODEL: a small object representing another, often larger, object (represents reality); a preliminary pattern serving as a plan; a tentative description of a system or theory that accounts for all its known properties.
(The American Heritage Dictionary, 2nd ed. Adapted from University of Wisconsin-Extension, Program Development and Evaluation.)

10 Simplest Form of a Logic Model
INPUTS → OUTPUTS → OUTCOMES
In its simplest form, a logic model is a graphic representation that shows the logical relationships between:
- The resources that go into the program: INPUTS
- The activities the program undertakes: OUTPUTS
- The changes or benefits that result: OUTCOMES
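The three-part chain above lends itself to a small data-structure sketch. This is a hypothetical illustration only; the class and field names are ours, not part of the slides:

```python
from dataclasses import dataclass

@dataclass
class LogicModel:
    """Simplest form of a logic model: INPUTS -> OUTPUTS -> OUTCOMES."""
    inputs: list    # resources that go into the program
    outputs: list   # activities the program undertakes
    outcomes: list  # changes or benefits that result

    def describe(self):
        # Render the chain left to right, as in the slide's diagram.
        return " -> ".join([
            "INPUTS: " + ", ".join(self.inputs),
            "OUTPUTS: " + ", ".join(self.outputs),
            "OUTCOMES: " + ", ".join(self.outcomes),
        ])

# The headache example used later in this presentation:
headache = LogicModel(
    inputs=["get pills"],
    outputs=["take pills"],
    outcomes=["feel better"],
)
print(headache.describe())
# INPUTS: get pills -> OUTPUTS: take pills -> OUTCOMES: feel better
```

The point of the structure is only that the three lists are kept distinct: resources, activities, and results are different kinds of things.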

11 What will show that you've arrived? (EVALUATION!)
- Where are you going?
- How will you get there?
- What will show that you've arrived? (EVALUATION!)

12 A Logic Model May Also Be Called…
- Program action
- Model of change
- Outcome map
- Program logic

13 Everyday Example: Headache
Situation → INPUTS → OUTPUTS → OUTCOMES: Headache → Get pills → Take pills → Feel better

Let's take a simple example that we can all relate to. How many of us have had a headache at one time or another? The headache is the SITUATION. What do we do? Our experience may be that certain pills help. So we need to get the pills (INPUTS), then we take the pills (OUTPUTS). As a consequence, our headache goes away and we feel better (OUTCOME).

Note the embedded assumptions: that we can find/get the needed pills; that we take the pills as prescribed; and that the pills lead to improvement rather than a stomach ache or other negative side effect. All programs have such assumptions, and they are often the basis for failure or less-than-expected results. But you can see the logic of the diagram and the end result: the impact that is expected. What really matters isn't whether we get the pills and take the pills, but whether we feel better as a result.
(Slide adapted from University of Wisconsin-Extension, Program Development and Evaluation.)

14 Assumptions
Assumptions are the beliefs, principles, and ideas we have about the program, the people involved, and the way we think the program will operate. Assumptions underlie much of what we do; one benefit of logic modeling is that it helps us make our assumptions explicit. Examples of assumptions include:
- Community coalitions are an effective strategy for addressing community problems
- Our partners will participate actively in program delivery
- The funding will be adequate and available when needed
- Target participants want to learn and change their behaviors
In a study by Kaplan and Garrett (2005), assessing underlying assumptions was found to be one of the most important parts of logic modeling, yet it is often minimized or overlooked.

15 Youth and Community Service
INPUTS: staff, grant, youth ages 12-16, partners, time, adults
OUTPUTS: youth identify a project to work on; plan the project; carry out the project; evaluate how they did
OUTCOMES: youth improve skills in planning, decision making, and problem solving; learn about their community; gain confidence in doing community work; demonstrate leadership skills; successfully complete projects; are connected with and feel valued by their community; engage in additional community activities
This logic model illustrates the forward and backward connections (feedback loops) that are common in programs. Another chain of outcomes could be developed for the adults.
(Example from University of Wisconsin-Extension, Program Development and Evaluation.)

16 If-Then Relationships
Underlying a logic model is a series of "if-then" relationships.

17 How Will Activities Lead to Desired Outcomes?
A series of if-then relationships. Tutoring program example:
- IF we invest time and money, THEN we can provide tutoring 3 hrs/week for 1 school year to 50 children
- IF we can provide that tutoring, THEN students struggling academically can be tutored
- IF those students are tutored, THEN they will learn and improve their skills
- IF they learn and improve their skills, THEN they will get better grades
- IF they get better grades, THEN they will move to the next grade level on time
(Example from University of Wisconsin-Extension, Program Development and Evaluation.)
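The tutoring chain can be written out as ordered (if, then) pairs and checked for linkage, i.e., that each step's "then" is exactly the next step's "if". This is a hypothetical sketch; the function name and phrasing of the steps are ours:

```python
# Each tuple is one "if-then" link in the tutoring example.
chain = [
    ("we invest time and money",
     "we can provide tutoring 3 hrs/week to 50 children"),
    ("we can provide tutoring 3 hrs/week to 50 children",
     "students struggling academically can be tutored"),
    ("students struggling academically can be tutored",
     "they will learn and improve their skills"),
    ("they will learn and improve their skills",
     "they will get better grades"),
    ("they will get better grades",
     "they will move to the next grade level on time"),
]

def chain_is_linked(pairs):
    """True if each step's 'then' is the next step's 'if'."""
    return all(then == nxt_if
               for (_, then), (nxt_if, _) in zip(pairs, pairs[1:]))

for cond, result in chain:
    print(f"IF {cond}, THEN {result}")
print("linked:", chain_is_linked(chain))  # linked: True
```

A broken link in the printed chain is exactly the kind of hidden assumption the earlier slide on assumptions warns about.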

18 But Wait, There's More…
INPUTS (program investments: what we invest) → OUTPUTS (activities: what we do; participation: who we reach) → OUTCOMES (what results: short, medium, long-term)
So what? What is the VALUE?
(Adapted from University of Wisconsin-Extension, Program Development and Evaluation.)

19 Developing a System (1.6): The 5 Ps of Evaluation
- Personnel
- Policies
- Places
- Programs
- Participant Outcomes
Note: these are NOT discrete; they overlap!

20 Things to Think About When Developing a System
- Keep in mind the time and financial constraints
- Evaluations have political overtones
- Establish appropriate reasons for evaluation (staff want information, the funder requires it, it is needed for decisions about continuing a program)

21 Purposeful Planning for Evaluation
- Schedule time on your calendar
- Involve planning board, committees, colleagues, staff, volunteers, etc.
- Identify what you hope to achieve (desired outcomes)
- Identify your goals and objectives
- Identify the methods you will use to measure whether or not your program met your objectives

22 Personnel Evaluations (Performance Appraisal or Performance Eval)
- Can be formative or summative
- Based on the job description and/or plan of work
- Staff should receive feedback from the evaluation
- Used for decisions like compensation, promotion, training, and employee development
- Should occur at least annually, but not as a single activity; informal evaluations should happen as well
- Evaluate performance, not personality, to help the employee improve
The process of personnel evaluation:
1. Assign duties
2. Determine criteria based on the duties
3. Gather information on how well the employee performs
4. Appraise performance
5. Provide feedback to the employee
6. Make decisions and adjustments about performance (value of the employee, promotions, further training, etc.)

23 Policy Evaluation
- Administrative aspects include the way the agency is organized and operated
- Public policy and community surveys to measure support, opinions, or information
- Cost-benefit and cost-effectiveness evaluations
- Performance-based program budgets
- Economic impact assessments
Notes:
- Cost-benefit: compares costs to benefits, but not just in dollars
- Cost-effectiveness: easier, because it is a ratio of cost to revenue generated (break-even programs are 1:1)
- Performance-based budgets: break down work or program activities into detailed sub-units, then compare
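The cost-to-revenue ratio mentioned for cost-effectiveness is simple arithmetic. A minimal sketch, with hypothetical figures and a function name of our choosing:

```python
def cost_revenue_ratio(cost, revenue):
    """Ratio of program cost to revenue generated.
    A ratio of 1.0 is break-even (1:1); below 1.0, revenue exceeds cost."""
    if revenue <= 0:
        raise ValueError("revenue must be positive to compute the ratio")
    return cost / revenue

# Hypothetical program: $12,000 in costs, $15,000 in revenue
print(round(cost_revenue_ratio(12_000, 15_000), 2))  # 0.8
```

A ratio above 1.0 would mean the program costs more than it brings in, which may still be acceptable when the benefits are not monetary (the cost-benefit point above).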

24 Places (Areas and Facilities)
- Areas are evaluated on use, safety, and legal mandates
- Master planning (part of long-range planning)
- Area and facility evaluations are often done by applying standards (e.g., ACA and parks and recreation accreditation)
- GIS use is very popular

25 Program Quality and Participants (1.7)
INPUTS → OUTPUTS → OUTCOMES
Recall the simplest form of a logic model: the resources that go into the program (INPUTS), the activities the program undertakes (OUTPUTS), and the changes or benefits that result (OUTCOMES).

26 Hierarchy of Effects
Levels, from the base up:
- Participation: number and characteristics of people reached; frequency and intensity of contact
- Reactions: degree of satisfaction with the program; level of interest; feelings toward activities and educational methods
- Learning: changes in knowledge, attitudes, skills, aspirations
- Actions: changes in behaviors and practices
- Social-Economic-Environmental improvements
Source: Bennett and Rockwell, 1995, Targeting Outcomes of Programs.
Many Extension staff will remember the Bennett hierarchy of the 1970s, which was popular and widely used throughout Extension. The Bennett hierarchy is a precursor of the present-day logic model; you can see the similarities in this graphic. Rockwell and Bennett have since developed a toolkit titled Targeting Outcomes of Programs (TOP), available on the web.

27 Outcomes: What Results for Individuals, Families, Communities…
Outcomes are the changes or benefits for individuals, families, groups, businesses, organizations, and communities. They occur along a path, a chain of outcomes, from short-term achievements to longer-term end results (impacts):
- SHORT-term (Learning): changes in awareness, knowledge, attitudes, skills, opinions, aspirations, motivation, behavioral intent
- MEDIUM-term (Action): changes in behavior, decision-making, policies, social action
- LONG-TERM (Conditions): changes in social (well-being), health, economic, civic, and environmental conditions

Examples:
- Short-term: increased understanding of the purpose of a budget or of loan terms; increased skill in developing a spending plan; increased confidence to ask questions or to go to a bank and seek service; increased knowledge of poverty's impact on individuals and the community; increased skills in leading a group; greater intention to exercise
- Medium-term: participating youth use a spending plan; producers make informed decisions concerning farm transfer; the community installs bike paths
- Long-term: reduced debt; improved water quality; increased community safety
Sebstad provides illustrative outcomes for five thematic areas.

The ultimate result of a program is usually referred to as "impact." Impacts might be achieved in one year or take 10 or more years; such long-term impacts may or may not be reflected in the logic model, depending on the scope of the initiative and the purpose and audience of the logic model. (A goal, by contrast, represents a general, big-picture statement of desired results.)
(Example from University of Wisconsin-Extension, Program Development and Evaluation.)

28 Logic Model Helps with Participant-Outcomes Evaluation
- Provides the program description that guides the evaluation process
- Helps match the evaluation to the program
- Helps you know what and when to measure (are you interested in process and/or outcomes?)
- Helps focus on key, important information (what do we really need to know?)
(Adapted from University of Wisconsin-Extension, Program Development and Evaluation.)

29 "Outcome" (People) or "Output" (Program): BOTH ARE IMPORTANT
Output = tangible results of your program
- Number of participants; number who were trained
- Responses to your program (including satisfaction ratings)
Outcome = impacts (hopefully positive) on program participants
- Learning (knowledge, understanding, perceptions, attitudes)
- Skills (capabilities, specific behaviors)
- Conditions (increased security, stability, pride, etc.)

30 Find the Outcome(s)
1. Everyone in your beginning swimming class became comfortable with putting their faces in the water.
2. 80% of youth participants said that they enjoyed the ropes course.
3. 8 out of 10 kids in the after-school soccer program demonstrated the correct way to pass a soccer ball.
4. A mom called you to say that her son follows instructions better at home after being involved in your program.

31 Find the Outcome(s): Answers
1. Everyone in your beginning swimming class became comfortable with putting their faces in the water. (outcome: a change in comfort/attitude)
2. 80% of youth participants said that they enjoyed the ropes course. (not an outcome: satisfaction ratings are responses to the program, an output)
3. 8 out of 10 kids in the after-school soccer program demonstrated the correct way to pass a soccer ball. (outcome: a demonstrated skill)
4. A mom called you to say that her son follows instructions better at home after being involved in your program. (outcome: a behavior change)

32 Steps for Targeting Outcomes
- Explore readiness to evaluate
- Consider how you will use your results
- Identify your mission, goals, and objectives
- Learn to use a logic model to focus on change
- Complete a logic model
- Teach staff to implement
- Consider your outcome results
- Share outcome results with others
Now you want to think specifically about your program. If you are not ready, stay here, but keep these notes for next year; you will have a starting point. ACA has a resource for you.

33 Issues in Doing Research or Evaluation (1.11)
- Ethical
- Political
- Legal
- Moral

34 Broad Ethical Considerations
- Avoid bias (settings, questions, populations)
- Consider question content
- Protect confidentiality (if appropriate)
- Manage data/records
- Report the results of the evaluation
- Respect and report differing opinions

35 “Doing the Right Thing”
Political issues:
- Supports/refutes views and values
- Personal contacts
- Value-laden definitions
- Controversial recommendations
- Pressures to produce certain findings
Safeguards: know the organization's position, don't go beyond the data in your conclusions and recommendations, and have a clear purpose for the evaluation.

36 “Doing the Right Thing”- contd.
Legal issues: not many legal concerns except around illegal behaviors (underage drinking, drug use, illegal gambling, income tax evasion).
Ethical issues:
- Be open with people
- Don't promise more than you can do
- Protect the rights and privacy of your subjects
- Guard against coercion
- Get written consent and board approval
- Guard against harm
- Let participants know the results

37 “Doing the Right Thing”- contd.
Moral issues:
- Unintentional mistakes caused by bias or carelessness
- Cultural and procedural biases
- Letting bias, prejudice, or friendships influence outcomes
- Dealing with negative findings
- Taking too long to get the results out
- Being prepared to recognize the possibility of statistical errors and knowing how to explain them
To guard against political, ethical, legal, and moral issues: have humility; be patient; plan; focus on practical questions and feasible issues; and adopt a self-evaluation orientation (evaluate yourself!).
What are some dilemmas that you might face as an evaluator?

38 "Doing the Right Thing" for Institutional Review Boards (IRBs), and "Just Because It Is the Right Thing to Do"
- Voluntary participation
- No harm
- Anonymity and confidentiality
- Issues of deception
- Analysis and reporting

39 Anonymity and Confidentiality
Anonymity: the researcher/evaluator cannot identify a given response with a given respondent.
Confidentiality: the researcher/evaluator can identify a given person's responses but promises not to do so publicly.

40 Informed Consent
Participants in a study (research or evaluation) must base their voluntary participation on an understanding of the possible risks involved.

41 Institutional Review Board (IRB)
(Required for research; not usually required for evaluation projects unless they will be used outside the agency.)
Guidelines:
- Minimize the risk to subjects
- The risk must be reasonable relative to the benefits of the study
- Informed consent: adequate detail about the study should be provided in lay language
- Privacy must be maintained
- Freedom of choice: subjects are free to withdraw at any time

