Evaluation Designs and Methods: Program Evaluation Basics Webinar Series. Mary E. Arnold, Ph.D., Professor and Youth Development Specialist, Oregon State University.


4-H Professional Development Webinar, February 14, 2013

Webinar Agenda
Building on last month's topic of focusing and planning your evaluation, we will:
- Explore the concept of rigor in evaluation and its particular place in Extension and 4-H evaluation
- Learn the role that evaluation questions play in determining evaluation design and data collection methods
- Explore common evaluation designs
- Explore common and innovative evaluation methods
- Learn of resources available to support evaluation efforts in 4-H and Extension

Elements of Rigor
- Evaluation design
- Conceptualization of program constructs and outcomes
- Measurement strategies
- Timeframe of the evaluation study
- Program integrity
- Program participation and attrition
- Statistical analyses
Braverman, M. T., & Arnold, M. E. (2008). An evaluator's balancing act: Maintaining rigor while being responsive to multiple stakeholders. In M. T. Braverman, M. Engel, R. A. Rennekamp, & M. E. Arnold (Eds.), Program evaluation in a complex organizational system: Lessons from Cooperative Extension. New Directions for Evaluation, 120.

Rigor and the 4-H Organization
- Who determines standards of rigor?
- How do decisions about evaluation methods get made?
- How, and to what extent, is the quality of a completed evaluation determined?

Post Only Design
TIME →
X O
X = "intervention" (the program); O = "observation" (data collection)
Evaluation Question Example: What skills do campers report developing at science camp?

Sample post-only survey items, rated 1 = Never, 2 = Sometimes, 3 = Usually, 4 = Always:
- I can use scientific knowledge to form a question
- I can ask a question that can be answered by collecting data
- I can design a scientific procedure to answer a question
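Because a post-only design yields a single round of data, about all an analysis can do is describe it. A minimal sketch in Python of summarizing such items; the ratings below are invented for illustration, not real camp data:

```python
# Minimal sketch: summarizing post-only Likert responses on a 1-4 scale.
# All ratings are invented for illustration.
from statistics import mean

responses = {
    "use scientific knowledge to form a question": [3, 4, 2, 4, 3],
    "ask a question answerable by collecting data": [2, 3, 3, 4, 4],
    "design a scientific procedure to answer a question": [2, 2, 3, 3, 4],
}

# With no pre-test or comparison group, only descriptive statistics apply.
item_means = {item: mean(r) for item, r in responses.items()}
for item, m in item_means.items():
    print(f"{item}: mean = {m:.2f}")
```

Note that nothing here supports a claim of change; the design only describes where campers ended up.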

Post Only Control Group Design
TIME →
E: X O
C: --- O
X = "intervention" (the program); O = "observation" (data collection)
E = experimental group (program participants); C = control group (non-participants, observed but not receiving the program)
Evaluation Question Example: Do youth who attend 4-H summer science camp have better science skills than youth who do not attend?
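One common way to analyze a post-only control group design is to compare the two groups' mean post-program scores, for example with Welch's t statistic. A minimal sketch with invented scores (not real camp data):

```python
# Minimal sketch: comparing post-only scores of an experimental (E) and a
# control (C) group with Welch's t statistic. Scores are invented.
from statistics import mean, variance
import math

e_scores = [3.2, 3.6, 2.9, 3.8, 3.4, 3.1]   # attended science camp
c_scores = [2.6, 3.0, 2.4, 2.9, 2.7, 3.1]   # did not attend

def welch_t(a, b):
    # Welch's t: mean difference over pooled standard error,
    # without assuming equal variances.
    va, vb = variance(a) / len(a), variance(b) / len(b)
    return (mean(a) - mean(b)) / math.sqrt(va + vb)

t = welch_t(e_scores, c_scores)
print(f"E mean = {mean(e_scores):.2f}, C mean = {mean(c_scores):.2f}, t = {t:.2f}")
```

Even with a clear group difference, if youth were not randomly assigned the difference still cannot be attributed to the camp itself.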

What is the level of rigor? What cannot be said?
Items rated 1 = Never, 2 = Sometimes, 3 = Usually, 4 = Always:
- I can use scientific knowledge to form a question
- I can ask a question that can be answered by collecting data
- I can design a scientific procedure to answer a question

One Group Pre-Test/Post-Test Design
TIME →
O X O
Evaluation Question Example: Do youth have higher levels of positive youth development (PYD) at the end of the program than they did at the beginning?

Items rated Strongly Disagree / Disagree / Agree / Strongly Agree:
- I feel good about my scholastic ability
- I feel accepted by my friends
- I can figure out right from wrong
- I can do things that make a difference
What is the level of rigor? What cannot be said?
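A one-group pre-test/post-test design is typically analyzed by pairing each participant's two scores and testing whether the average gain differs from zero. A minimal sketch with invented PYD scores:

```python
# Minimal sketch: paired t statistic for a one-group pre/post (O X O) design.
# Each position in the two lists is the same youth; scores are invented.
from statistics import mean, stdev
import math

pre  = [2.4, 2.8, 2.1, 3.0, 2.6, 2.5]
post = [2.9, 3.1, 2.6, 3.2, 3.0, 2.8]

# Work with each youth's change score, not the two group means.
diffs = [b - a for a, b in zip(pre, post)]
t_paired = mean(diffs) / (stdev(diffs) / math.sqrt(len(diffs)))
print(f"mean gain = {mean(diffs):.2f}, paired t = {t_paired:.2f}")
```

A positive gain shows change over the program period, but without a control group it cannot rule out maturation or other outside influences.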

Retrospective Pre-Test Design
TIME →
X O
(a single post-program observation asks youth to rate both how they were before the program and how they are now)
Evaluation Question Example: Do youth have higher levels of positive youth development at the end of the program than they did at the beginning?

For each of the following items, please indicate how you felt before participating in this program, and how you feel now, after participating in this program.
Scale: 1 = Strongly Disagree, 2 = Disagree, 3 = Agree, 4 = Strongly Agree
Each item is rated twice, once for Before and once for After:
- I feel accepted by my friends
- I can figure out right from wrong
- I can do things that make a difference
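Scoring a retrospective pre-test works the same way as a conventional pre/post comparison, except that both ratings come from the single post-program survey. A minimal sketch with invented ratings for one item:

```python
# Minimal sketch: scoring one retrospective pre-test item.
# Each tuple is one respondent's (before, after) rating on a 1-4 scale,
# both collected on the same post-program survey. Ratings are invented.
from statistics import mean

# "I feel accepted by my friends"
ratings = [(2, 3), (3, 4), (2, 4), (3, 3), (1, 3)]

before_mean = mean(b for b, _ in ratings)
after_mean = mean(a for _, a in ratings)
print(f"retrospective before = {before_mean:.2f}, after = {after_mean:.2f}")
```

The comparison of the two means (or of paired differences, as in the pre/post example) estimates perceived change.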

Control Group Pre-Test/Post-Test Design
TIME →
E: O X O
C: O --- O
Evaluation Question Example: Do youth in the program develop higher levels of PYD than youth who do not participate?
Items rated Strongly Disagree / Disagree / Agree / Strongly Agree:
- I feel good about my scholastic ability
- I feel accepted by my friends
- I can figure out right from wrong
- I can do things that make a difference
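With pre- and post-tests for both groups, a simple analysis compares each group's average gain, a basic difference-in-differences. A minimal sketch with invented scores:

```python
# Minimal sketch: gain-score comparison (difference-in-differences) for a
# control group pre/post design. All scores are invented for illustration.
from statistics import mean

e_pre, e_post = [2.4, 2.6, 2.2, 2.8], [3.0, 3.1, 2.7, 3.2]   # program youth
c_pre, c_post = [2.5, 2.3, 2.7, 2.6], [2.6, 2.4, 2.7, 2.8]   # non-participants

# Average change within each group, then compare the changes.
e_gain = mean(p2 - p1 for p1, p2 in zip(e_pre, e_post))
c_gain = mean(p2 - p1 for p1, p2 in zip(c_pre, c_post))
print(f"E gain = {e_gain:.2f}, C gain = {c_gain:.2f}, "
      f"difference-in-differences = {e_gain - c_gain:.2f}")
```

If the program group gains substantially more than the control group, maturation alone is a much less plausible explanation for the change.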

[Graph: Level of PYD (low to high) over time for the experimental (E) and control (C) groups]

Time Series Design with Control Group
TIME →
E: O O O O X O O O
C: O O O O --- O O O
[Graph: Level of PYD (low to high) over time for the E and C groups]
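A simple first look at time series data is to compare each group's average level before and after the point where the intervention occurred. A minimal sketch with invented observations:

```python
# Minimal sketch: pre- vs post-intervention segment means in a time series
# design with a control group. All observations are invented.
from statistics import mean

e_series = [2.4, 2.5, 2.4, 2.6, 3.1, 3.2, 3.3]   # X falls after the 4th observation
c_series = [2.5, 2.4, 2.6, 2.5, 2.5, 2.6, 2.5]   # no intervention

def pre_post_means(series, x_index=4):
    # Split the series at the intervention point and average each segment.
    return mean(series[:x_index]), mean(series[x_index:])

e_before, e_after = pre_post_means(e_series)
c_before, c_after = pre_post_means(c_series)
print(f"E: {e_before:.2f} -> {e_after:.2f}; C: {c_before:.2f} -> {c_after:.2f}")
```

A jump in the E series at the intervention point, with a flat C series, is the pattern this design is built to detect; a full interrupted time series analysis would also model trends within each segment.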

Choosing an Evaluation Data Collection Method
Some common methods:
- Archival data (records and documents)
- Surveys (mailed, electronic, phone)
- Interviews (phone, face-to-face, group)
- Focus group interviews
- Observation
- Tests (scenarios or skill/knowledge tests)

Important Steps
1. What is the key concept that must be measured in each evaluation question? Example: Did youth participants in the YA4-H! Teens as Teachers pilot program increase their own consumption of fruits and vegetables?
2. Who has knowledge of this potential change? Several sources may emerge: youth, parents, leaders, teachers, friends, records, observations.
3. What sources of data will be acceptable to stakeholders?
4. What expertise and funding are available to make a particular method practical?

Existing Documents and Records
Have you ever considered meeting notes, minutes, videos, registrations, test scores, forms, records, and reports as possible sources of data?
- Considerably more cost-effective than original data collection
- Data are not affected by the act of collecting them
- Programs collect lots of information that is never used, and too often we forget to look for existing data that can answer the question
What existing information could answer the question of increased fruit and vegetable consumption?

Surveys (Mailed, Electronic, Phone)
How could surveys help us assess an increase in youth consumption of fruits and vegetables?
Key resources:
- Designing Surveys: A Guide to Decisions and Procedures. Ronald Czaja and Johnny Blair (2005)
- Internet, Mail, and Mixed-Mode Surveys: The Tailored Design Method (3rd ed.). Don Dillman, Jolene Smyth, and Leah Melani Christian (2008)
- How to Conduct Surveys: A Step-by-Step Guide. Arlene Fink (2013)
- Survey Research Methods (4th ed.). Floyd Fowler (2009)

Interviews (Phone, Face-to-Face, Group)
Key resources:
- Designing and Conducting Your First Interview (textbook). Bruce Friesen (2010)
- Qualitative Interviewing: The Art of Hearing Data (3rd ed.). Herbert and Irene Rubin (2011)
- Focus Groups: A Practical Guide for Applied Research (2nd ed.). Richard Krueger (1994)
- Interviewing as Qualitative Research: A Guide for Researchers in Education and the Social Sciences (4th ed.). Irving Seidman (2012)
Why might interviews be a good method for collecting data about youth consumption of fruits and vegetables?

Direct Observation
Useful in situations where you want direct information:
- May be more reliable than asking people whether they are using new practices
- When you are trying to understand an ongoing behavior, process, or unfolding situation or event (e.g., observing camp counselors before, during, and after a training program)
- When there is physical evidence, products, or outcomes that can be readily seen (e.g., inspecting project records, newsletters, signs)
- When written or other data collection procedures seem inappropriate (e.g., programs for vulnerable or underserved audiences, or when language or literacy is a barrier)
Key resource: Collecting Evaluation Data: Direct Observation. Ellen Taylor-Powell and Sara Steele. Booklet available on the State 4-H website.
Could we use direct observation to measure youth consumption of fruits and vegetables?

Tests and Scenarios
- Useful when assessing learning that requires specific knowledge that must be turned into action for the program to be considered successful
- Requires the ability to do a real pre-post test process
- Can be very creative! Think skits and role plays, for example in camp counselor training, youth leadership programs, or risk management training
Can you think of how a scenario evaluation might be a useful method for assessing youth consumption of fruits and vegetables?

Summary
Your evaluation question determines your design, which in turn determines your methods.
Stay tuned on March 6th for our next webinar, which will focus in more depth on creating high-quality questionnaires!

That's all for now! Join in next month for: Creating High Quality Questionnaires. Don't forget to complete an evaluation of today's webinar at: