Evaluating Life Skills Development Through 4-H Challenge Course Outcomes. VanderWey, S., & Cooper, R. North American Association for Environmental Education.

Presentation transcript:

Evaluating Life Skills Development Through 4-H Challenge Course Outcomes. VanderWey, S., & Cooper, R. North American Association for Environmental Education National Conference, Spokane, WA, October 12, 2018.

Introductions: Scott VanderWey, Associate Professor and Director of 4-H Adventure Education, Washington State University; Robby Cooper, Clinical Assistant Professor, Washington State University.

A Land Grant Research University

Learning Objectives Gain awareness of challenge course evaluation findings. Gain understanding of effective program evaluation implementation and research methods. Develop ideas for your own program evaluation with the assistance of workshop facilitators.

Reaching Urban Youth: Camp Long is a 67-acre park in West Seattle. It was already the site of the oldest artificial rock climbing area in North America, and the 4-H Challenge Course added to the rich history of adventure education within Seattle Parks.

Camp Long Challenge Course

Think-Pair-Share. Consider: What are you doing, or what would you like to be doing, to evaluate your programs? Directions: Think to yourself. Discuss with a partner. Share with your group.

The beginning… "We know it works." How can we show it? Actually, no we don't. No matter how good it is, it can be better. We can empirically demonstrate that it works, and specifically what it does, to other people (funders, supervisors, peers, potential clients). Now we have an instrument for us. We design the programs; we decide if we have time. Is it important to know whether our programs are doing what we intend them to do? Is it important to demonstrate that to stakeholders, bosses, and potential clients? Then it is worth the time, and that is what we will do today.

The beginning… Do you think "it works"? What impacts would you expect?

The Development Process: not just creating a survey… Program Mission, Goals, Outcomes, Measures, Resources; Procedures, Training, Implementation, Analysis, Revision. The article sought to create measures of desired outcomes of Positive Youth Development programs; researchers found both face and concurrent validity through interviews with a pilot sample and by measuring the correlation of survey scales with associated variables.

Program Mission: What is your program mission? What do you and your people value?

Goals. Process: Am I running my program well? Outcome: Is my program making an impact?

Adventure Education Outcomes: Targeting Life Skills Model

Program Outcomes (* = self-efficacy)

Measures: Communication (Klein et al., 2006; Adolescent Health); Decision-making (Klein et al., 2006; Adolescent Health); Teamwork (Annett et al., 2000; Ergonomics); Self-efficacy (Sherer et al., 1982; Psychological Reports).

Procedures & Resources What can we realistically and reasonably accomplish that will provide a valid and reliable measurement? Time Resources Qualifications administration analysis Setting Ethics

Measures: the main concerns are reliability and validity.

Procedures: A notice to parents is sent out with program registration materials. The facilitator administers surveys using a packet containing facilitator instructions, a recruitment script, a processing page, surveys, and an envelope.

Procedures: Pretest-posttest design. 12-16 item pretest; 21-25 item posttest (satisfaction, demographics). Completed surveys are sealed in an envelope and mailed to the WSU Pullman campus. Data are entered and analyzed by faculty and graduate assistants.
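To make the data-entry and scoring step concrete, here is a minimal sketch (not the authors' actual pipeline) of averaging survey items into pretest and posttest scale scores and computing change scores. The file name, item names, and three-items-per-scale layout are hypothetical.

```python
import pandas as pd

# Hypothetical item-level data: one row per participant, Likert items rated 1-5.
df = pd.read_csv("camp_long_surveys.csv")  # file name is illustrative only

# Hypothetical item groupings; the real instrument's items and counts differ.
scales = {
    "communication": ["comm1", "comm2", "comm3"],
    "decision_making": ["dm1", "dm2", "dm3"],
    "teamwork": ["team1", "team2", "team3"],
    "self_efficacy": ["se1", "se2", "se3"],
}

for scale, items in scales.items():
    # Average the pretest items and the posttest items into one score each,
    # then take posttest minus pretest as the change score for that scale.
    df[f"{scale}_pre"] = df[[f"{item}_pre" for item in items]].mean(axis=1)
    df[f"{scale}_post"] = df[[f"{item}_post" for item in items]].mean(axis=1)
    df[f"{scale}_change"] = df[f"{scale}_post"] - df[f"{scale}_pre"]
```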

Training & Implementation Clear Procedures for Implementation Fidelity Realistic Expectations Frequent Check-ins and Revision

Analysis & Revision Procedural Fidelity Training Audience Realistic Expectations Survey

Analysis: Reliability of Measures

Scale reliability (Cronbach's alpha)*, Version 1:
Communication: .30
Decision-making: .48
Teamwork: .34
Self-efficacy: .33

*Averaged pretest/posttest alpha. Alpha > .70 is considered sufficient.

Analysis: Reliability of Measures

Scale reliability (Cronbach's alpha)*: Version 1 / Version 2
Communication: .30 / .69
Decision-making: .48 / .79
Teamwork: .34 / .68
Self-efficacy: .33 / .61

*Averaged pretest/posttest alpha. Alpha > .70 is considered sufficient.
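For readers who want to reproduce this kind of reliability check, the sketch below shows a generic Cronbach's alpha calculation on made-up item responses; it illustrates the statistic itself and is not the authors' analysis code.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for a 2-D array: rows = respondents, columns = scale items."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)       # sample variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the summed scale score
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Made-up responses to a 3-item scale (ratings 1-5) from five respondents.
responses = np.array([
    [4, 5, 4],
    [3, 3, 4],
    [5, 4, 5],
    [2, 3, 2],
    [4, 4, 5],
])
print(round(cronbach_alpha(responses), 2))  # alpha > .70 is commonly treated as sufficient
```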

Results. Camp Long, Seattle, WA. Sample: 696 participants. Satisfaction: 85% of participants said they enjoyed their experience at Camp Long.

Results

Scale: N, mean score change, t, Sig. (p), effect size (d)
Communication: 576, .181, 7.78, .000*, .26**
Decision-making: 693, .089, 2.92, .14
Self-efficacy: 386, .194, 4.52, .32**
Teamwork: 696, .103, 4.32, .17

*Statistical significance: p < .05. **Small effect size: d > .2.

Table columns: the name of the scale variable; the number of participants included in the pretest-posttest comparison for that scale; the mean change in score from pretest to posttest; the t statistic from the paired t-test (a comparison of mean scores in a related group); and the p-value, the measure of significance, where p < .05 indicates a significant change in mean score from pretest to posttest.

Results: Communication showed a highly significant change from pretest to posttest. Decision-making was not significant, but might be for repeat groups of participants, which we will examine as we survey more groups, and specifically more repeat groups. Teamwork showed a significant change from pretest to posttest. Self-efficacy was not significant (the sample size is too small).
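As a rough illustration of how the t, p, and d columns are produced, the sketch below runs a paired t-test and computes Cohen's d on simulated pretest/posttest scores. The numbers are random stand-ins for the Camp Long data, and the d formula used (mean change divided by the standard deviation of the change) is one common convention, not necessarily the one used in the study.

```python
import numpy as np
from scipy import stats

# Simulated pre/post scores standing in for one scale (e.g., communication);
# the real analysis uses the Camp Long survey data, not these random values.
rng = np.random.default_rng(42)
pre = rng.normal(3.8, 0.5, size=576)
post = pre + rng.normal(0.18, 0.6, size=576)

t_stat, p_value = stats.ttest_rel(post, pre)   # paired (related-samples) t-test
diff = post - pre
cohens_d = diff.mean() / diff.std(ddof=1)      # effect size of the paired change

print(f"mean change = {diff.mean():.3f}, t = {t_stat:.2f}, "
      f"p = {p_value:.4f}, d = {cohens_d:.2f}")
```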

Results

Results

Scott VanderWey, vanderwey@wsu.edu; Robby Cooper, robby.cooper@wsu.edu; Kevin Wright, wrightkc@wsu.edu