Extension Program Evaluation Michigan Planning and Reporting System (MI PRS) Winter 2011 Training

Part of Planning Process Evaluation is an upfront activity in the design or planning phase of a program. Evaluation is not an after-program activity.

Why Outcomes? Today, in a time of continued reduction in government funding, Extension professionals are challenged more than ever before to document outcomes of programs and address stakeholder demands for accountability.

Review of Part of Bennett's Hierarchy As one moves up the hierarchy, the evidence of program impact gets stronger. From the bottom up: Resources, Activities, People Involvement, Reactions, KASA (knowledge, attitudes, skills, aspirations), Practice Change, End Results.

Collecting impact data on programs is costly, time-consuming, and requires skill. (But not impossible!) Extension professionals are expected to evaluate a minimum of one program a year at the impact level.

Example A pre/post measure can assess short-term outcomes in knowledge, attitudes, skills, and aspirations (motivation to change). A plan for participant follow-up is required to assess behavior or practice change.
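
A minimal sketch of scoring such a pre/post measure, assuming matched responses from the same participants on a 1-5 scale; the data and variable names below are hypothetical:

```python
import numpy as np
from scipy import stats

# Hypothetical matched responses from the same participants before and after the program (1-5 scale).
pre = np.array([2, 3, 2, 4, 3, 2, 3, 3])
post = np.array([4, 4, 3, 5, 4, 3, 4, 5])

change = post - pre                            # per-participant change score
t_stat, p_value = stats.ttest_rel(post, pre)   # paired t-test on the matched responses

print(f"Mean change: {change.mean():.2f} scale points")
print(f"Paired t-test: t = {t_stat:.2f}, p = {p_value:.3f}")
```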

Plan early Plan early for the cost, time, skills (data collection, analysis, interpretation), and resources needed to evaluate an Extension program. Work with Institute teams/groups.

Evaluating programs at the lower levels (inputs, participation, collaboration, activities, and reactions) requires relatively little effort and expense. This is process evaluation.

Process Evaluation Process evaluation, also called formative evaluation, helps program staff assess ongoing programs for improvement and implementation. Examples: program fidelity, reaching target audiences.

Outcome Evaluation Documenting impact or community-level outcomes requires skills in questionnaire development, data collection and analysis, interpretation, and reporting.

Summative evaluation, also called impact or outcome evaluation, may require an understanding of evaluation designs, data collection at multiple points in time, and statistical analyses such as analysis of covariance (ANCOVA), which uses covariates such as pretest scores to adjust outcome comparisons.
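
As an illustration only, here is a minimal ANCOVA sketch expressed as a linear model, assuming a program group and a comparison group measured before and after; the data, group labels, and column names are hypothetical:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical pre/post scores for a program group and a comparison group.
df = pd.DataFrame({
    "group": ["program"] * 4 + ["comparison"] * 4,
    "pre":   [2.0, 3.0, 2.5, 3.5, 2.5, 3.0, 2.0, 3.5],
    "post":  [4.0, 4.5, 3.5, 4.5, 3.0, 3.5, 2.5, 3.5],
})

# ANCOVA as a linear model: post-score explained by group membership plus the pretest covariate.
model = smf.ols("post ~ C(group) + pre", data=df).fit()
print(model.summary())
```

The coefficient on the group term estimates the program effect after adjusting for pretest differences between the two groups.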

A Framework for Linking Costs and Program Outcomes Using Bennett's Hierarchy (Inputs, Activities, Participation, and Reactions fall under process/formative evaluation; KASA, Practice/Behavior Change, and SEEC fall under outcome evaluation.)

Cost & Outcomes | Inputs | Activities | Participation | Reactions | KASA | Practice/Behavior Change | SEEC
Short Term      | X      | X          | X             | X         | XX   | XXX                      | ----
Intermediate    | X      | X          | X             | ----      | XX   | XXX                      | XXXX
Long Term       | X      | X          | X             | ----      | XX   | XXX                      | XXXX

X = low cost, effort, and evidence; XX = requires questionnaire development, data collection, and analysis skills; XXX = requires understanding of evaluation designs, multiple data collections, additional analysis and interpretation skills; XXXX = all of the above, plus more time and increased costs, potentially resulting in stronger evidence of program impact.

Professional Development Plans for professional development are captured in MI PRS; consider building skills in evaluation. Work with Institute work teams to develop program evaluation plans that fit with logic models.

To make an Evaluation Plan:
1. Decide if the program is ready for formative/process or summative/outcome evaluation.
2. Link program objectives to evaluation questions that address community outcomes.

To make an Evaluation Plan, cont.:
3. Identify key indicators for evaluation (make sure they are measurable and relevant).
4. Consider evaluation costs (follow-up techniques and comparison groups used in summative designs are more expensive).
5. Develop a cost matrix (see the sketch below).
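
For step 5, a minimal sketch of what a cost matrix might look like; the evaluation levels are drawn from Bennett's hierarchy, while the methods, staff days, and dollar figures are hypothetical placeholders:

```python
import pandas as pd

# Hypothetical cost matrix linking evaluation levels to estimated effort and cost.
cost_matrix = pd.DataFrame({
    "evaluation_level": ["Reactions", "KASA", "Practice change", "End results (SEEC)"],
    "method": ["End-of-session survey", "Pre/post questionnaire",
               "6-month follow-up survey", "Comparison-group study"],
    "estimated_staff_days": [1, 3, 8, 20],
    "estimated_cost_usd": [100, 500, 2000, 8000],
})

print(cost_matrix.to_string(index=False))
```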

Tracking program and project processes and outputs, as well as outcomes, will require data collection and analysis systems outside of MI PRS. Link program costs and the cost of evaluation to the outcomes.

Conclusion In the end, evaluation questions that address the “so what” issue are connected to outcomes and costs, and ultimately justify the value of Extension programs to the public good.

Key Reference Radhakrishna, R., & Bowne, C. (2010). Viewing Bennett’s hierarchy from a different lens: Implications for Extension program evaluation. Journal of Extension, 48(6). Retrieved January 24, 2011.

MSUE Resources Organizational Development webpage – Planning, Evaluation, and Reporting section

Evaluation Resources will Grow!

Other Extension materials on Evaluation… with future MSU-specific resources to be released in 2011.

MSU Evaluation Specialist
- Assists work teams in developing logic model objectives and evaluation strategies
- Consults on evaluation designs
- Provides guidance on data analysis and selecting measures
- Develops and delivers educational programs related to Extension program evaluation
- Facilitates evaluation plan development or brainstorming for Institute work teams

Organizational Development team member: Dr. Cheryl Peters, Evaluation Specialist
Statewide coverage (Presque Isle)
Campus Office: Room 11, Agriculture Hall
Campus Phone / Fax:
Skype: cpeters.msue

MI PRS Resources