Balancing Rigor and Reality Evaluation Designs for 4-H Youth Development Programs Mary E. Arnold, Ph.D. 4-H Youth Development Specialist Program Planning and Evaluation Oregon State University

Elements of Rigor
- Evaluation design
- Conceptualization of program constructs and outcomes
- Measurement strategies
- Timeframe of the evaluation study
- Program integrity
- Program participation and attrition
- Statistical analyses

Braverman, M. T., & Arnold, M. E. (2008). An evaluator's balancing act: Maintaining rigor while being responsive to multiple stakeholders. In M. T. Braverman, M. Engel, R. A. Rennekamp, & M. E. Arnold (Eds.), Program evaluation in a complex organizational system: Lessons from Cooperative Extension. New Directions for Evaluation, 120.

Rigor and the 4-H Organization
- Who determines standards of rigor?
- How do decisions about evaluation methods get made?
- How, and to what extent, is the quality of a completed evaluation determined?

Post Only Design (X O)
O = "Observation" (data collection)
X = "Intervention" (program)
E = Experimental group (program participants)
C = Control group (non-participants)

Evaluation Question Example: What life skills do campers report developing at 4-H camp?

Attending this camp gave me the opportunity to:
(1 = Strongly Disagree, 2 = Disagree, 3 = Agree, 4 = Strongly Agree)
- To work with others as a team
- To feel good about myself
- To be independent
- To make me want to try new things
- To be responsible
- To cooperate with others
- To talk to others more easily
- To work through disagreements

What can be said?
- Percentage of youth ratings for each item (frequencies)
- Mean ratings for each item
- Ranking of items from highest to lowest mean rating

What cannot be said? What is the level of rigor?
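A minimal sketch of the analyses a post-only design supports: frequencies, item means, and a ranking by mean. The items and ratings below are invented for illustration, using only the Python standard library.

```python
from collections import Counter
from statistics import mean

# Hypothetical post-only camp ratings: item -> list of 1-4 Likert responses
ratings = {
    "Work with others as a team": [4, 3, 4, 4, 2, 3],
    "Feel good about myself":     [3, 3, 4, 2, 3, 3],
    "Be responsible":             [4, 4, 3, 4, 4, 3],
}

for item, scores in ratings.items():
    freqs = Counter(scores)  # how many youth chose each rating
    pct = {r: 100 * n / len(scores) for r, n in freqs.items()}
    print(item, round(mean(scores), 2), pct)

# Ranking of items from highest to lowest mean rating
ranking = sorted(ratings, key=lambda i: mean(ratings[i]), reverse=True)
print(ranking)
```

Note that everything here is descriptive; with no pre-test or comparison group, nothing about change or program impact can be claimed.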

Post Only Control Group Design
E (X O)
C (X O)

Evaluation Question Example: Do youth who attend 4-H summer science camp have better science skills than youth who do not attend?

Please fill in the circle that tells how much you currently can use each of the following skills when you work on a science investigation:
(Never / Sometimes / Usually / Always)
- I can use scientific knowledge to form a question
- I can ask a question that can be answered by collecting data
- I can design a scientific procedure to answer a question

What can be said?
- Percentage of youth ratings for each item, for each group (frequencies)
- Mean ratings for each item, for each group
- Ranking of items from highest to lowest mean rating, for each group
- Comparisons between groups for each of the above
- With enough cases, a statistical test for significant differences between the two groups can be conducted

What cannot be said? What is the level of rigor?
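With enough cases, the between-group comparison can be made with a two-sample t-test. Below is a stdlib-only sketch of Welch's t statistic on invented scale scores; in practice a statistics package would also report degrees of freedom and a p-value.

```python
from math import sqrt
from statistics import mean, variance

def welch_t(a, b):
    """Welch's t statistic for two independent samples (unequal variances)."""
    va, vb = variance(a) / len(a), variance(b) / len(b)
    return (mean(a) - mean(b)) / sqrt(va + vb)

# Hypothetical mean science-skill scores: camp attendees vs. non-attendees
camp    = [3.2, 3.8, 3.5, 4.0, 3.6, 3.9]
no_camp = [3.0, 3.1, 3.4, 2.8, 3.3, 3.2]
t = welch_t(camp, no_camp)
print(round(t, 2))
```

Even a significant difference here only shows the groups differ after the program; without pre-tests or random assignment, it cannot rule out that attendees already had stronger skills.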

One Group Pre-Test/Post-Test Design (O X O)

Evaluation Question Example: Do youth have higher levels of positive youth development at the end of the program than they did at the beginning?

Please indicate your level of agreement with each item:
(Strongly Disagree / Disagree / Agree / Strongly Agree)
- I feel good about my scholastic ability
- I feel accepted by my friends
- I can figure out right from wrong
- I can do things that make a difference

What can be said?
- Percentage of youth ratings for each item, before and after the program (frequencies)
- Mean ratings for each item, before and after the program
- Pre- and post-program comparisons for each of the above
- With enough cases, a statistical test for significant differences between pre- and post-program ratings can be conducted

What cannot be said? What is the level of rigor?
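Because the same youth respond twice, the pre/post comparison is a paired test rather than an independent-groups test. A minimal sketch of the paired-samples t statistic on invented scale scores:

```python
from math import sqrt
from statistics import mean, stdev

def paired_t(pre, post):
    """Paired-samples t statistic: mean change / standard error of change."""
    diffs = [b - a for a, b in zip(pre, post)]
    return mean(diffs) / (stdev(diffs) / sqrt(len(diffs)))

# Hypothetical positive youth development scores for five participants
pre  = [2.5, 3.0, 2.8, 3.2, 2.6]
post = [3.1, 3.4, 3.0, 3.5, 3.2]
t = paired_t(pre, post)
print(round(t, 2))
```

A significant result shows scores changed, but with no control group, maturation or outside events remain plausible explanations for the change.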

Retrospective Pre-Test Design ( O X O)

Evaluation Question Example: Do youth have higher levels of positive youth development at the end of the program than they did at the beginning?

For each of the following items, please indicate how you felt before participating in this program, and how you feel now, after participating in this program.
(1 = Strongly Disagree, 2 = Disagree, 3 = Agree, 4 = Strongly Agree; each item is rated once for "Before" and once for "After")
- I feel accepted by my friends
- I can figure out right from wrong
- I can do things that make a difference

What can be said?
- Percentage of youth ratings for each item, before and after the program (frequencies)
- Mean ratings for each item, before and after the program
- Pre- and post-program comparisons for each of the above
- With enough cases, a statistical test for significant differences between pre- and post-program ratings can be conducted

Control Group Pre-Test/Post-Test Design
E (O X O)
C (O --- O)

Evaluation Question Example: Do youth have higher levels of positive youth development at the end of the program than they did at the beginning?

Please indicate your level of agreement with each item:
(Strongly Disagree / Disagree / Agree / Strongly Agree)
- I feel good about my scholastic ability
- I feel accepted by my friends
- I can figure out right from wrong
- I can do things that make a difference

What can be said?
- Percentage of youth ratings for each item, for both groups, before and after the program (frequencies)
- Mean ratings for each item, for both groups, before and after the program
- Pre- and post-program comparisons between groups
- With enough cases, a statistical test for significant differences between groups, pre- and post-program, can be conducted
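One common way to combine the pre/post and between-group comparisons is to compare gain scores: how much each group changed, on average. A stdlib-only sketch on invented data (more rigorous analyses, such as ANCOVA, adjust for pre-test scores instead):

```python
from statistics import mean

# Hypothetical pre/post scores per participant (invented for illustration)
prog_pre,  prog_post = [2.8, 3.0, 2.6, 3.1], [3.4, 3.5, 3.2, 3.6]
ctrl_pre,  ctrl_post = [2.9, 3.0, 2.7, 3.2], [3.0, 3.1, 2.8, 3.2]

prog_gain = [b - a for a, b in zip(prog_pre, prog_post)]
ctrl_gain = [b - a for a, b in zip(ctrl_pre, ctrl_post)]

# The quantity of interest: how much more the program group changed
effect = mean(prog_gain) - mean(ctrl_gain)
print(round(effect, 3))
```

Because the control group experiences the same passage of time, a larger gain in the program group is harder to explain away by maturation or outside events.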

Time Series Design with Control Group
E (O O X O O)
C (O O --- O O)

Evaluation Question Example: Do youth have higher levels of positive youth development at the end of the program than they did at the beginning?

Please indicate your level of agreement with each item:
(Strongly Disagree / Disagree / Agree / Strongly Agree)
- I feel good about my scholastic ability
- I feel accepted by my friends
- I can figure out right from wrong
- I can do things that make a difference

What can be said?
- Percentage of youth ratings for each item, for both groups, before and after the program (frequencies)
- Mean ratings for each item, for both groups, before and after the program
- Pre- and post-program comparisons between groups
- With enough cases, a statistical test for significant differences between groups, pre- and post-program, can be conducted
- In some cases, more sophisticated analyses, such as latent growth curve modeling
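Latent growth curve modeling requires specialized software, but the core idea can be illustrated simply: with several measurement occasions, each participant has a trajectory, and the analysis compares trajectories rather than single pre/post differences. A minimal sketch of the building block, the least-squares slope of one invented participant's scores over time:

```python
from statistics import mean

def slope(times, scores):
    """Ordinary least-squares slope of scores over measurement occasions."""
    tbar, sbar = mean(times), mean(scores)
    num = sum((t - tbar) * (s - sbar) for t, s in zip(times, scores))
    den = sum((t - tbar) ** 2 for t in times)
    return num / den

times = [0, 1, 2, 3]                 # four measurement occasions
participant = [2.6, 2.9, 3.3, 3.5]   # hypothetical scores over time
g = slope(times, participant)
print(round(g, 3))
```

In a growth model, slopes like this are estimated for every participant, and the evaluation question becomes whether the program group's average slope is steeper than the control group's.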

Time Series Design with Control Group
O O O O X O O O
O O O O X O O O