Presentation transcript:

© 2009 University of Wisconsin-Extension, Cooperative Extension, Program Development and Evaluation

Slide 1: Planning your evaluation. This presentation provides an overview of the steps in planning a quality evaluation. Each step is covered in greater detail in other sections of the website.

Slide 2: An evaluation plan is your roadmap. How do you know which way to go if you don't know where you are going?

Slide 3: Use the booklet Planning a Program Evaluation and the accompanying worksheet as background for this set of slides and to help with your evaluation plan. Links to these resources are found on the web page.

Slide 4: There are five core steps in planning a program evaluation: (1) engage stakeholders, (2) focus the evaluation, (3) collect the data, (4) analyze and interpret, and (5) use and communicate the findings.

Slide 5: These are best-practice steps! Follow them whether you are evaluating a single workshop or a comprehensive program. However, the level of detail in your evaluation plan will depend on the scope and nature of your evaluation.

Slide 6: STEP 1. Begin by deciding who should be involved in helping design and implement the evaluation. Or will you do it alone?

Slide 7: Think about these 4-H Youth Development stakeholders as people who might be involved in your evaluation (your evaluation stakeholders):
– Youth participants
– Your program partners
– Parents
– Teachers
– Volunteers
– Your funder

Slide 8: How might they be involved? What roles might they play? They might:
– Help determine what the evaluation will focus on – what you really need to know
– Be part of an evaluation team
– Help write the questions
– Collect data
– Help enter data and/or do the analyses
– Raise funds
– Write a press release

Slide 9: Why would you want to do this?
– To build ownership of the evaluation and your work
– To develop skills in others, e.g., youth who learn how to conduct a survey or analyze data
– To ensure that the evaluation findings will be useful and used
– To bring talent and expertise to the evaluation
– To share the work!

Slide 10: STEP 2. Next, we focus the evaluation. What program, or part of a program, are you going to evaluate? Use a logic model to describe your program so you evaluate what is meaningful.

Slide 11: A logic model helps us clearly describe the program we want to evaluate. It helps us focus on what we want to collect information about.
[Logic model diagram] Situation → INPUTS (program investments) → OUTPUTS (activities, participation) → OUTCOMES (short-term, medium-term, long-term), resting on stated assumptions and subject to external factors.
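Because the logic model anchors the whole plan, it can help to write it down explicitly. Here is a minimal sketch, not part of the UW-Extension materials, of capturing a logic model as a simple data structure; the class name and all example entries are illustrative (they loosely restate the Money Quest example used later in the deck):

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class LogicModel:
    """One program's logic model: situation, inputs, outputs, outcomes."""
    situation: str
    inputs: List[str] = field(default_factory=list)         # program investments
    activities: List[str] = field(default_factory=list)     # outputs: what we do
    participation: List[str] = field(default_factory=list)  # outputs: who we reach
    short_term_outcomes: List[str] = field(default_factory=list)
    medium_term_outcomes: List[str] = field(default_factory=list)
    long_term_outcomes: List[str] = field(default_factory=list)
    assumptions: List[str] = field(default_factory=list)
    external_factors: List[str] = field(default_factory=list)

# Hypothetical entries for illustration only.
money_quest = LogicModel(
    situation="Youth participants lack money-management knowledge and skills",
    inputs=["staff time", "curriculum", "volunteers"],
    activities=["money-management sessions"],
    participation=["4-H youth participants"],
    short_term_outcomes=["increased knowledge about money management"],
    medium_term_outcomes=["youth use the recommended money-management practices"],
    long_term_outcomes=["improved financial well-being"],
)
print(money_quest.short_term_outcomes)
```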

Slide 12: Be clear about your purpose for evaluating. "We are conducting an evaluation of ______________ (name of program) because _________________________ in order to ________________________."
Example: We are conducting an evaluation of the Money Quest program (a three-part series) because we want to know to what extent youth participants increase their knowledge about money management and use the recommended practices, in order to report the value of this program to our funder.
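As a small illustration, the fill-in-the-blank statement can be treated as a reusable template. A minimal sketch; the filled-in values simply restate the slide's Money Quest example, and the variable name is invented:

```python
# Hypothetical template mirroring the slide's fill-in-the-blank statement.
PURPOSE_TEMPLATE = ("We are conducting an evaluation of {program} "
                    "because {reason} in order to {use}.")

print(PURPOSE_TEMPLATE.format(
    program="the Money Quest program",
    reason="we want to know to what extent youth participants increase their "
           "knowledge about money management and use the recommended practices",
    use="report the value of this program to our funder",
))
```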

Slide 13: Determine use and user. For each audience – you (staff), participants, funder, other stakeholders, others? – ask: WHO are they? WHAT do you/they want to know? HOW will they use the info?

Slide 14: What do you want to know – what data do you want to collect? Go back to your logic model – the description of your program – and think about what you really want to know about your program. What data do you want to collect? Do you want information about:
– Outcomes: to what extent changes occur in knowledge, skills, attitudes, opinions, behaviors, practices, policies, social-economic-environmental quality…
– Reactions: what the participants like/dislike; whether they will come again, stay involved, promote your program to others, etc.

Slide 15: Do you want to know about…
Participation:
– Who/how many attended? Who didn't attend, and why?
– What happens for different participants?
Activities:
– What was done and how well was it done (quality)?
– Did everything go according to plan? What worked well/not so well?
– Which activities link to different outcomes for different participants?
Inputs:
– What was invested (inputs)? By whom?
– Who collaborated? How? What resources were used?

Slide 16: Do you want to know about…
– Your teaching? How you did? How you might improve?
– Future programming? Future interests and needs?
– Costs and returns? How much it costs to put the program on and what the return on that investment is?
– Other questions?

Slide 17: Prioritize: we can't and don't want to collect information about everything!

Slide 18: How will you know it? INDICATORS. For each evaluation question, think about the specific information you need to collect to answer that question.
Example question: To what extent did the program increase youth-adult partnerships?
Indicators:
– #/% of boards with youth participating in meetings, before and after
– #/% of boards with youth on committees, before and after
– #/% of boards with youth in leadership positions, before and after
– Reported change in quality of the youth-adult interaction
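To make the indicator idea concrete, here is a minimal sketch with entirely made-up counts, turning the before/after board indicators above into numbers and percentages:

```python
# Hypothetical counts of boards meeting each indicator (out of 12 boards).
TOTAL_BOARDS = 12
before = {"youth in meetings": 4, "youth on committees": 2, "youth in leadership": 1}
after  = {"youth in meetings": 9, "youth on committees": 6, "youth in leadership": 3}

for indicator in before:
    b, a = before[indicator], after[indicator]
    # Report each indicator as both a count and a percentage, before and after.
    print(f"{indicator}: {b}/{TOTAL_BOARDS} ({b / TOTAL_BOARDS:.0%}) before -> "
          f"{a}/{TOTAL_BOARDS} ({a / TOTAL_BOARDS:.0%}) after")
```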

Slide 19: What is your "evaluation design"? Evaluation design is your overall approach to collecting data. Typical evaluation designs include:
– Single point in time (e.g., survey, end-of-program questionnaire)
– Pre-post program or retrospective post-then-pre (comparison of before to after)
– Multiple points in time (e.g., pre, post, and follow-up)
– Comparison group designs (two or more groups)
– Case study design (in-depth analysis of a single case)
– Mixed methods (e.g., the use of a survey, observation, and testimonials)
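For the pre-post and retrospective post-then-pre designs, the basic comparison is a change score per participant. A minimal sketch with made-up ratings; in a retrospective design, both ratings would be collected at the end of the program:

```python
# Hypothetical 1-5 self-ratings of knowledge for five participants.
pre  = [2, 3, 2, 4, 3]   # before the program (or recalled "before", if retrospective)
post = [4, 4, 3, 5, 4]   # after the program

# Change score = after minus before, for each participant.
changes = [p2 - p1 for p1, p2 in zip(pre, post)]
print("Change per participant:", changes)
print("Mean change:", sum(changes) / len(changes))
```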

Slide 20: STEP 3. Now think about from whom you will collect the information and what method you will use.

Slide 21: From where or from whom will you get the information you need (your "source")? Are you most likely to get the information you need from:
– Existing information – records, reports, program documents, etc.
– People – participants, parents, volunteers, etc.
– Pictorial records and observations – video or photos, observations of events

Slide 22: What method(s) of data collection are most appropriate?
– Survey (questionnaire)
– Case study
– Interview
– Observation
– Group assessment
– Expert or peer reviews
– Portfolio reviews
– Testimonials
– Tests
– Photographs, videotapes, slides
– Diaries, journals, logs
– Document review and analysis

Slide 23: Which method do you choose? As we know, there are many standard methods of data collection, and more creative ones. Each has its own strengths and weaknesses. The 'art and science' of data collection is to select the method appropriate for the purpose of your evaluation, the audience you are collecting information from, and your resources.

Slide 24: Test your understanding: match each method on the left to its definition on the right.
Methods: Case study / Questionnaire / Observation / Focus group / Interview
Definitions:
1. Collecting standardized information from people in a non-threatening way
2. Talking with and listening to people, either face-to-face or by telephone
3. Gathering information by viewing what is occurring
4. Exploring a topic in depth through group discussion
5. Gaining an in-depth understanding of someone's experience in the program

Slide 25: STEP 4. While the actual analysis comes once you have your data, you want to think about analysis when you are planning the evaluation. That way you will be sure to collect the type of information you need in order to report as you want.

Slide 26: What do you want to be able to report? Match what you will want to report to the type of analysis:
– Numbers, percents → Count; percentage
– Average number or score → Mean
– Range of scores → Range
– Changes from before to after → Change score
– Comparisons of one group to another → Cross-tabulation
– People's stories → Qualitative content analysis
– Comments → Qualitative content analysis
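Here is a minimal sketch, with made-up questionnaire data and hypothetical column names, pairing each quantitative row of the table above with a typical pandas call:

```python
import pandas as pd

# Hypothetical end-of-program questionnaire responses.
df = pd.DataFrame({
    "group":       ["4-H", "4-H", "school", "school", "school"],
    "will_return": ["yes", "yes", "no", "yes", "no"],
    "score_pre":   [2, 3, 2, 4, 3],
    "score_post":  [4, 4, 3, 5, 4],
})

print(df["will_return"].value_counts())                      # numbers
print(df["will_return"].value_counts(normalize=True) * 100)  # percents
print(df["score_post"].mean())                               # average score (mean)
print(df["score_post"].max() - df["score_post"].min())       # range of scores
print((df["score_post"] - df["score_pre"]).mean())           # change from before to after
print(pd.crosstab(df["group"], df["will_return"]))           # compare one group to another
```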

Slide 27: STEP 5. Last but not least is USING and COMMUNICATING your evaluation findings. Actually, we should be communicating and using our information throughout the evaluation, not just at the end. Think about the many ways you might use and share information both during and after your evaluation.

Slide 28: How will you report, use, and learn from the findings?
– WHO: to whom will you report?
– WHAT: what will you report/share?
– HOW: how will you share the information?
– WHEN: when will you communicate?

Slide 29: Managing the evaluation. We might think about a management plan to make sure everything gets done!
– Human subjects protection: check the decision tree and follow best-practice guidelines. Consult with the Human Subjects Administrator if you have questions.
– Timeline
– Responsibilities: who will do what?
– Budget: money needed to do the evaluation
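A minimal sketch of writing the management plan down as plain data, so timeline, responsibilities, and budget live in one place; all tasks, dates, owners, and amounts below are invented:

```python
# Hypothetical evaluation management plan.
management_plan = {
    "human_subjects": "check decision tree; consult Human Subjects Administrator if unsure",
    "timeline": {
        "design instruments": "March",
        "collect data":       "April-May",
        "analyze and report": "June",
    },
    "responsibilities": {
        "write questions":        "evaluation team",
        "collect data":           "volunteers",
        "enter and analyze data": "program staff",
        "report findings":        "county educator",
    },
    "budget_dollars": {"printing": 150, "postage": 80, "incentives": 200},
}

# Print who does what, then the total budget.
for task, owner in management_plan["responsibilities"].items():
    print(f"{task}: {owner}")
print("Total budget: $", sum(management_plan["budget_dollars"].values()))
```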

Slide 30: Check your evaluation plan against the Evaluation Standards to ensure a 'quality' evaluation:
– Utility: will it be useful and used?
– Feasibility: is it practical? Can it be accomplished given your resources?
– Propriety: is it respectful and ethical?
– Accuracy: is it likely to produce accurate information?

Slide 31: Remember… "The more attention you give to planning the evaluation, the more effective it will be." (The Program Manager's Guide to Evaluation, 2003)

Slide 32: Reflection time. What is one thing that you learned from this presentation that you didn't know before? Good luck with planning your next evaluation!