Evaluation Planning & Outcome Measurement Beyond the Logic Model. 2010 AAEA In-Service, Maricopa Agricultural Center. Al Fournier, Dan McDonald & Tom DeGomez.

Presentation transcript:

Evaluation Planning & Outcome Measurement Beyond the Logic Model. 2010 AAEA In-Service, Maricopa Agricultural Center. Al Fournier, Dan McDonald & Tom DeGomez

Why Evaluate?

Why Evaluate? (1) “Customer satisfaction”; program improvement (formative); did people learn? (knowledge, skills); did people change? (adoption); did the situation change? (outcome, impact)

Why Evaluate? (2) To document outcomes; for reporting; to prove our worth (APR, continuing); to keep $ flowing (grant success); because we have to!

What do you evaluate?

How often do you evaluate “customer satisfaction”? 1.Always 2.Most of the time (> 66%) 3.Sometimes (33-66%) 4.Occasionally (< 33%) 5.Never

How often do you evaluate clientele learning? 1.Always 2.Most of the time (> 66%) 3.Sometimes (33-66%) 4.Occasionally (< 33%) 5.Never

How often do you evaluate application of learning? 1.Always 2.Most of the time (> 66%) 3.Sometimes (33-66%) 4.Occasionally (< 33%) 5.Never

How often do you evaluate outcomes? 1.Always 2.Most of the time (> 66%) 3.Sometimes (33-66%) 4.Occasionally (< 33%) 5.Never

How often do you evaluate impacts? 1.Always 2.Most of the time (> 66%) 3.Sometimes (33-66%) 4.Occasionally (< 33%) 5.Never

Outcomes versus Impacts. Outcomes (short term): increase awareness, increase knowledge, change attitudes, apply knowledge, adopt a practice. Impacts (long term): the results of short-term outcomes “in the world”: increase profits, improve health, reduce pollution.

Taught growers how to reduce irrigation but maintain yields 1.Outcome 2.Impact

Increased parent awareness of cyber-bullying 1.Outcome 2.Impact

Reduced absentee rates in schools after adopting IPM 1.Outcome 2.Impact

Barriers to Evaluation

Lack of time; lack of $; lack of knowledge; lack of experience; low priority / not important; not interested; other?

Which program should I evaluate? (PSU tipsheet)

Why Evaluate? (2) To prove our worth (APR, continuing); to keep $ flowing (grant success); because we have to!

Example: Regional IPM Applications must provide detailed plans for evaluation of the project …. The evaluation plan should include specific evaluation objectives and measurement indicators (e.g., adoption rate, number of acres impacted, pesticide use, risk reduction, profitability) that will be used to measure impacts and outcomes resulting from the project. Evaluation plans that include surveys should indicate survey expertise of investigators and/or describe the survey methodology that will be used.

Evaluation Planning Evaluation Objectives: What change do you want to document? Measurement Indicators: What data will you use to document change? Methodology: How will you collect the data?

Evaluation Plan (1) Objective 1: To measure knowledge of natural enemies (I.D.) & their role in whitefly management. Data: Can they identify natural enemies? Do they know the role of specific fauna in whitefly management? (knowledge) How: audience response survey

Evaluation Plan (2) Objective 2: To collect baseline data on current whitefly management practices. Data: self-reported sampling practices and thresholds for treatment, do they consider natural enemies? How: survey implemented face-to-face & online.

Evaluation Plan (3) Objective 3: To measure the intention of clientele to adopt revised thresholds, sampling and management guidelines. Data: self-reported attitudes about usefulness of NE in WF control; willingness to adopt. How: the same survey implemented face-to-face & online.
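
To make “what data will you use to document change” concrete, the sketch below tallies audience-response results of the kind Objective 1 calls for. It is an illustration only; the responses and the percent_correct helper are hypothetical assumptions, not part of the project.

```python
# Illustrative sketch only (not code from the presentation): tallying hypothetical
# audience-response ("clicker") results for an objective like Objective 1 above,
# i.e., can clientele identify natural enemies before and after the session?

def percent_correct(responses):
    """Return the share of 'correct' answers as a percentage."""
    return 100.0 * sum(1 for r in responses if r == "correct") / len(responses)

# Hypothetical responses; real data would come from the audience response system.
pre_session = ["correct", "incorrect", "incorrect", "correct", "incorrect"]
post_session = ["correct", "correct", "incorrect", "correct", "correct"]

pre, post = percent_correct(pre_session), percent_correct(post_session)
print(f"Correct natural-enemy IDs: {pre:.0f}% before vs. {post:.0f}% after "
      f"({post - pre:+.0f} percentage points)")
```

The same tally, repeated on the face-to-face and online survey data of Objectives 2 and 3, would give the before/after numbers an impact statement needs.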

Example: Regional IPM Applications must provide detailed plans for evaluation of the project …. The evaluation plan should include specific evaluation objectives and measurement indicators (e.g., adoption rate, number of acres impacted, pesticide use, risk reduction, profitability) that will be used to measure impacts and outcomes resulting from the project. Evaluation plans that include surveys should indicate survey expertise of investigators and/or describe the survey methodology that will be used.

Example: Regional IPM Applications must provide detailed plans for evaluation of the project …. … such as logic models or other established methods.

Remember 3 Things: 1.What change do you want to document? (Evaluation Objectives, based on program goals, linked to program activities) 2.What data will you use to document change? (Measurement Indicators) 3.How will you collect the data? (Methodology)
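
For anyone who wants to keep those three answers side by side for each program, here is a minimal sketch of one way to record them, using the whitefly plan above as content. The PlanItem class and its field names are illustrative assumptions, not something from the presentation.

```python
# Illustrative sketch only: one way to keep the "3 things" together for each part
# of an evaluation plan. The class and field names are assumptions; the content is
# paraphrased from Evaluation Plan objectives 1 and 2 above.
from dataclasses import dataclass

@dataclass
class PlanItem:
    objective: str    # What change do you want to document?
    indicator: str    # What data will you use to document change?
    methodology: str  # How will you collect the data?

plan = [
    PlanItem(
        objective="Measure knowledge of natural enemies and their role in whitefly management",
        indicator="Share of clientele who can identify natural enemies and describe their role",
        methodology="Audience response survey",
    ),
    PlanItem(
        objective="Collect baseline data on current whitefly management practices",
        indicator="Self-reported sampling practices and treatment thresholds",
        methodology="Survey implemented face-to-face and online",
    ),
]

for item in plan:
    print(f"{item.objective} | {item.indicator} | {item.methodology}")
```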

What about your programs? Evaluation Objective, Measurement Indicator, Methodology.

Determining Evaluation Objectives Look at desired outcomes & impacts.

Determining Evaluation Objectives Look at desired outcomes & impacts. Be careful about commitments. Look at your outputs for who to target, what to focus on. Outputs should relate to desired outcomes.

Determining Evaluation Objectives Look at desired outcomes & impacts. Be careful about commitments. Look at your outputs for who to target, what to focus on. Outputs should relate to desired outcomes. Look at your inputs (resources).

Determining Evaluation Objectives Look at desired outcomes. Be careful about commitments. Look at your outputs for who to target, what to focus on. Outputs should relate to desired outcomes. Look at your inputs (resources). Keep it real & prioritize. (Needs, budget, abilities.)

What makes a good Measurement Indicator? Is it measurable? Are data obtainable? Can it be quantified? (or qualified) Does it relate directly to your program goals?

Data Sources (methods) Existing or common data (public sources, census data, Dept. of Education, etc.) Surveys (written, online, telephone) Interviews Observations Focus groups Other…?

Indicate your level of experience with written surveys 1.I am an expert 2.Lots of experience 3.Some experience 4.Tried it once or twice 5.No experience

Indicate your level of experience with online surveys 1.I am an expert 2.Lots of experience 3.Some experience 4.Tried it once or twice 5.No experience

Indicate your level of experience with telephone surveys 1.I am an expert 2.Lots of experience 3.Some experience 4.Tried it once or twice 5.No experience

Indicate your level of experience with focus groups 1.I am an expert 2.Lots of experience 3.Some experience 4.Tried it once or twice 5.No experience

Indicate your level of experience with advisory groups 1.I am an expert 2.Lots of experience 3.Some experience 4.Tried it once or twice 5.No experience

Indicate your level of experience with interviewing 1.I am an expert 2.Lots of experience 3.Some experience 4.Tried it once or twice 5.No experience

Indicate your level of experience using existing data sources for evaluation 1.I am an expert 2.Lots of experience 3.Some experience 4.Tried it once or twice 5.No experience

Indicate your level of experience with Human Subjects (IRB) 1.I am an expert 2.Lots of experience 3.Some experience 4.Tried it once or twice 5.No experience

Indicate your level of experience with evaluation planning 1.I am an expert 2.Lots of experience 3.Some experience 4.Tried it once or twice 5.No experience

Indicate your level of familiarity with the UA CE evaluation website 1.Used the site more than once 2.Used the site to find resources 3.Viewed the site 4.Heard of it, never visited 5.Never heard of it

Evaluation Planning (PSU tipsheets): evaluation strategy (8 steps); linking outcomes to program activities and writing impact statements.

Documenting Impacts

IPM Program Goals (desired impacts): reduce economic risk (profit), reduce risk to human health, reduce risk to the environment.

Cotton IPM Saves Millions: $201,000,000 in saved costs & yield loss; IGRs, Bt cotton & the AZ IPM plan; zero grower sprays for PBW (Ellsworth et al. 2008).

Lowest Costs in 30 years (inflation-adjusted to 2008 dollars): Lygus -35%, PBW -89%, whitefly -71%; fewer sprays in the last 7 years (Ellsworth 2008).
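
The comparison above is inflation-adjusted to 2008 dollars. Below is a minimal sketch of that kind of adjustment using a consumer price index ratio; the CPI values and per-acre costs are hypothetical placeholders, not figures from Ellsworth's analysis.

```python
# Illustrative sketch only: expressing a cost from an earlier year in 2008 dollars
# with a consumer price index (CPI) ratio. The index values and the per-acre costs
# below are hypothetical, not figures from the presentation.

CPI = {1998: 163.0, 2008: 215.3}  # hypothetical annual index values

def to_2008_dollars(nominal_cost, year):
    """Convert a nominal cost from `year` into 2008 dollars."""
    return nominal_cost * CPI[2008] / CPI[year]

cost_1998 = 120.0  # hypothetical $/acre control cost in 1998
cost_2008 = 45.0   # hypothetical $/acre control cost in 2008

real_1998 = to_2008_dollars(cost_1998, 1998)
pct_change = 100.0 * (cost_2008 - real_1998) / real_1998
print(f"1998 cost in 2008 dollars: ${real_1998:.2f}/acre; change vs. 2008: {pct_change:+.0f}%")
```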

Health & Environment: 1.7M lbs reduction in insecticide use; lowest usage in 30 yrs! (Ellsworth et al. 2008)

What is the likelihood you will use something you learned about today? 1.Very likely 2.Somewhat likely 3.Slightly likely 4.No way 5.Not sure

Resources for Evaluation: Extension Program Evaluation website; other internet resources; eXtension community of practice.

New! Evaluation Lending Library