Methods and Implications of Using Methods Ellen Taylor-Powell University of Wisconsin-Cooperative Extension

Our time today
- Overview: sources and methods
- Program examples
- Cultural considerations
- Attribution vs. contribution
- Application of evaluation standards as we think about methods and implications

Process
- Ask questions
- Interactivity
- Share examples

Methods = CHOICES So many choices, so many decisions

“Developing an evaluation is an exercise of dramatic imagination” (Cronbach, 1982: 239)

Let's get started by checking ourselves! Answer each statement with true or false:
a. There is one best way to collect data
b. Quantitative methods that collect numbers provide more useful information
c. Evaluation data collection involves any established social science research method
d. We often collect data from program participants
e. We should always collect data from as many participants as possible

Myths
- The choice of method is primarily a technical decision
- There is one best method
- There are established and known standards of what constitutes methodological quality and excellence
- More data is always better
- "Hard" data is better than "soft" data

Where do methods fall in the process of planning an evaluation?

Logic model: INPUTS (program investments) → OUTPUTS (activities, participation) → OUTCOMES (short-term, medium-term, long-term)
Evaluation questions: What questions do you want to answer?
Evaluation methods: How will you collect the information to answer your questions?
Match evaluation questions and methods to your PROGRAM.

Example: What do you (and others) want to know about the program? (Parent education logic model)
INPUTS: staff, money, partners, research
OUTPUTS: assess parent ed programs; design and deliver evidence-based program of 8 sessions; facilitate support groups; parents of 3-10 year olds attend
OUTCOMES (short- to long-term): parents increase knowledge of child dev; parents better understand their own parenting style; parents gain skills in new ways to parent; parents identify appropriate actions to take; parents gain confidence in their abilities; parents use effective parenting practices; improved child-parent relations; reduced stress; strong families
(Inputs → Process → Outcomes → Impact)

Possible evaluation questions, mapped onto the same logic model:
- Inputs: What amount of $ and time were invested?
- Outputs (activities): Were all sessions delivered? How well? Do support groups meet?
- Outputs (participation): Did all parents participate as intended? Who did/did not? Did they attend all sessions? Support groups? Level of satisfaction?
- Short-term outcomes: To what extent did knowledge and skills increase? For whom? Why? What else happened?
- Medium-term outcomes: To what extent did behaviors change? For whom? Why? What else happened?
- Long-term outcomes: To what extent is stress reduced? Relations improved?
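
One way to see how evaluation questions hang off the logic model is to write the model down as a simple data structure. The sketch below is a minimal, hypothetical Python representation of the parent education example above: the stage names and questions come from the slides, but the dictionary layout and variable names are only one illustrative choice, not a prescribed format.

```python
# Minimal sketch: the parent education logic model from the slides,
# with possible evaluation questions attached to each stage.
# The dictionary layout is illustrative, not a prescribed format.

logic_model = {
    "inputs": {
        "elements": ["staff", "money", "partners", "research"],
        "questions": ["What amount of $ and time were invested?"],
    },
    "outputs": {
        "elements": [
            "design & deliver evidence-based program of 8 sessions",
            "facilitate support groups",
            "parents of 3-10 year olds attend",
        ],
        "questions": [
            "Were all sessions delivered? How well?",
            "Did all parents participate as intended? Who did/did not?",
            "Level of satisfaction?",
        ],
    },
    "outcomes": {
        "elements": [
            "parents increase knowledge of child development",
            "parents use effective parenting practices",
            "improved child-parent relations, reduced stress, strong families",
        ],
        "questions": [
            "To what extent did knowledge and skills increase? For whom? Why?",
            "To what extent did behaviors change? What else happened?",
            "To what extent is stress reduced? Relations improved?",
        ],
    },
}

# Walk the model and list each stage with its evaluation questions.
for stage, content in logic_model.items():
    print(stage.upper())
    for question in content["questions"]:
        print("  -", question)
```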

Sources of evaluation information
- People: youth participants, parents, teachers, volunteers, leaders, judges…
- Pictorial records and observations: before-after photos; observations at events; artwork…
- Existing information: record books, plans of work, logs, journals, meeting minutes…

Data collection methods
- Survey
- Interview
- Focus group
- Observation
- Expert or peer reviews
- Portfolio reviews
- Testimonials
- Tests
- Photographs, videotape, slides
- Diaries, journals, logs
- Document review and analysis

Polling slide…How many use/have used these methods?

Creative methods…
- Creative expression: drawing, drama, role-playing
- Photography, videotape, slides
- Diaries, journals, logs
- Personal stories
- Expert review
- Buzz session
- Affinity diagramming
- ???
"There can be no definitive list of creative evaluation approaches. Such a list would be a contradiction in terms." (Patton: 346)

Pros and cons of different methods Insert slides OR connect to a pdf? Example re. choices

Quantitative information – Qualitative information
Quantitative: numbers, breadth, generalizability
Qualitative: words, depth, specificity
"Not everything that counts can be counted and not everything that can be counted counts." (Albert Einstein)

Often, it is better to use more than one data collection method… TRIANGULATION. Why?
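
As a concrete (and entirely hypothetical) illustration of triangulation, the sketch below checks a quantitative signal from an end-of-session survey against a qualitative signal from interview notes for the same outcome. The ratings, notes, variable names, and the crude keyword tally are all invented for illustration; the point is only that two independent sources are compared before drawing a conclusion.

```python
# Hypothetical triangulation sketch: do the survey ratings and the
# interview notes point in the same direction for one outcome
# ("parents gain confidence")? All data below are invented.

survey_ratings = [4, 5, 3, 4, 5, 4, 2, 5]  # 1-5 self-rated confidence, post-program

interview_notes = [
    "feels more confident handling tantrums",
    "still unsure what to do when kids argue",
    "tries new approaches and they mostly work",
    "more confident than before the sessions",
]

# Quantitative signal: mean rating.
mean_rating = sum(survey_ratings) / len(survey_ratings)

# Qualitative signal: crude count of notes mentioning confidence-related words.
keywords = ("confident", "confidence")
confidence_mentions = sum(
    any(word in note.lower() for word in keywords) for note in interview_notes
)

print(f"Mean survey rating: {mean_rating:.1f} / 5")
print(f"Interview notes mentioning confidence: {confidence_mentions} of {len(interview_notes)}")

# If both sources point the same way, the finding is more credible;
# if they diverge, that disagreement itself is worth investigating.
```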

Examples How might you mix sources of information in your evaluation? How might you mix data collection methods to evaluate your program?

Polling or quiz

EXAMPLE DESIGN
1. Focus: Whole farm phosphorus management – 11 Western counties
(Columns: 2. Questions, 3. Indicators, 4. Timing, 5. Data collection – sources, methods, sample, instruments)

Question 1: What did the phosphorus management program actually consist of? Who did what?
- Indicators: #, type of activities implemented (course developed, workshops conducted, on-farm work); #, who, role of partners
- Timing: at time of activity; at time of involvement
- Sources: staff; partners
- Methods: recording log/database; log; interview annually
- Sample: all activities; all partners
- Instruments: need form and system for ongoing recording; need recording form; interview questions

Question 2: Did the expected number of farmers attend the various activities? Who participated in what?
- Indicators: #, key characteristics of participating farmers per activity
- Timing: at time of activity (workshop, field day, on-farm visit)
- Sources: attendance logs
- Methods: record review
- Sample: all participants
- Instruments: need recording form and system for collecting data

Questions 3 and 4: What resulted? To what extent did participating farmers a) increase their knowledge? b) increase skills in tracking P levels? c) adopt recommendations? d) reduce P levels? e) save money? What else happened?
- Indicators: #, % of participants who a) report increased knowledge, b) demonstrate skill, c) report changes in feeding levels, d) record P reductions, e) report $ savings; amount of savings
- Timing: end of each workshop; ongoing; annually (4th quarter)
- Sources: participants; farmers; staff; partners; other stakeholders
- Methods: post-session survey; observations; record review; informal interviews; focus groups
- Sample: all participants; all participants; 5-7 selected in each grouping
- Instruments: questionnaire; TBD; recording logs and questions; TBD; develop focus group protocol for each group

Also consider: Baseline? Comparison group? External contingencies? Other outcomes?
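
A design matrix like the one above can also be kept in structured form, so that each evaluation question carries its indicators, timing, sources, methods, sample, and instruments with it. The Python dataclass below is a minimal sketch of that idea, filled in with the first row of the phosphorus example; the field names mirror the table's column headings, but the class itself and its layout are only an illustration.

```python
# Minimal sketch of one row of the evaluation design matrix above.
# Field names follow the table's column headings; the class layout
# is illustrative only.
from dataclasses import dataclass
from typing import List


@dataclass
class DesignRow:
    question: str
    indicators: List[str]
    timing: List[str]
    sources: List[str]
    methods: List[str]
    sample: List[str]
    instruments: List[str]


row1 = DesignRow(
    question="What did the phosphorus management program actually consist of? Who did what?",
    indicators=[
        "#, type of activities implemented (course, workshops, on-farm work)",
        "#, who, role of partners",
    ],
    timing=["at time of activity", "at time of involvement"],
    sources=["staff", "partners"],
    methods=["recording log/database", "interview annually"],
    sample=["all activities", "all partners"],
    instruments=["recording form and system for ongoing recording", "interview questions"],
)

# A complete design is then just a list of rows, one per question.
design = [row1]
for row in design:
    print(row.question)
    print("  methods:", ", ".join(row.methods))
```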

Contribution vs. attribution: "We need to accept the fact that what we are doing is measuring with the aim of reducing the uncertainty about the contribution made, not proving the contribution made." (Mayne, 1999: 10)

Culturally appropriate evaluation methods: How appropriate is the method given the culture of the respondent and the setting? Cultural differences include nationality, ethnicity, religion, region, gender, age, abilities, class, economic status, language, sexual orientation, physical characteristics, and organizational affiliation.

Is a written questionnaire culturally appropriate? Things to consider:
- Literacy level
- Tradition of reading and writing
- Setting
- Not the best choice for people with an oral tradition
- Translation (more than just literal translation)
- How cultural traits affect response – response sets
- How to sequence the questions
- Pretest questionnaire may be viewed as intrusive

Are interviews culturally appropriate? Things to consider:
- Preferred by people with an oral culture
- Language level proficiency; verbal skill proficiency
- Politeness – responding to authority (thinking it's unacceptable to say "no"), nodding, smiling, agreeing
- Need to have someone present
- Relationship/position of interviewer
- May be seen as interrogation
- Direct questioning may be seen as impolite, threatening, or confrontational

Are focus groups culturally appropriate? Things to consider:
- Issues of gender, age, class, and clan differences
- Issues of pride, privacy, self-sufficiency, and traditions
- Relationship to facilitator as prerequisite to rapport
- Same considerations as for interviews

Is observation culturally appropriate? Things to consider:
- Discomfort or threat of being observed
- Issue of being an "outsider"
- Observer effect
- Possibilities for misinterpretation

CHALLENGES
- Hard-to-reach populations
- Young children
- When to do follow-up
- Sensitive subject matter
- Reactivity
- Evaluation as an add-on

Insert – polling, quiz…some type of interactivity

Apply the evaluation standards to your methods decisions: Utility, Feasibility, Propriety, Accuracy

UTILITY Will the data sources and collection methods serve the information needs of your primary users?

FEASIBILITY Are your sources and methods practical and efficient? Do you have the capacity, time, and resources? Are your methods non-intrusive and non-disruptive?

PROPRIETY Are your methods respectful, legal, ethical, and appropriate? Does your approach protect and respect the welfare of all those involved or affected?

ACCURACY Are your methods technically adequate to: answer your questions? measure what you intend to measure? reveal credible and trustworthy information? convey important information?

When choosing methods, consider…
- The purpose of your evaluation – what do you want to know?
- Your use and users – what kind of data will your stakeholders find most credible and useful? (percents, comparisons, stories, statistical analysis)
- Your respondents – how can they best be reached, and how might they best respond?
- Your comfort level
- Level of burden to the program or participants
- Pros and cons of each method
- Resources

Resources: Ohio State University; Penn State