Camping & Environmental Education Institute 2/25/15

Presentation transcript:

Simple Evaluation Templates: Gathering Data for Program Improvement & Impact Reporting
Camping & Environmental Education Institute, 2/25/15
Jeff Buckley, Jennifer Cantwell, Casey Mull, Whitney Cherry, Julie Dillard, Nicole Crawson

You don't have to be an expert. The people who read our reports and media are not experts either. Just make one, hand it out, and see what you get!

Take-a-stand exercise:
I have a basic knowledge of program evaluation.
I have an advanced knowledge of program evaluation.
I am comfortable developing my own surveys.
I am comfortable tabulating my own data.

Topics we will cover:
What and Why Do You Evaluate?
Survey Templates
Impact Statements
Data Tabulation Templates
Tools for Creating Surveys
Questions
Jennifer: We will talk about all of these things today.

You will leave with:
Resources for developing effective survey questions
Survey templates
Resources/tools for collecting and analyzing data
Sample impact statements
Jennifer: We will talk about all of these things today.

Disclaimers
Adapted from previous presentations by Jeff Buckley, Jenna Daniel, & Casey Mull.
Materials informed by best practices from Extension professionals at the University of Florida.
Resources developed in collaboration with Nick Fuhrman, Ph.D., Associate Professor & Graduate Coordinator, Dept. of Agricultural Leadership, Education, & Communication, UGA.
Consult with your state's Evaluation Specialist for additional, more advanced guidance on program evaluation.
We welcome your constructive feedback as we work to provide useful resources in the area of program development.
Jennifer: Note – these resources were developed in collaboration with Dr. Nick. The templates we will discuss today are useful for gathering short-term data on a single day of programming.

Why is it important?
As a result of shrinking state budgets, the need to effectively document program impacts is more critical than ever.
Many states do not have a full-time Evaluation Specialist.
County Agents and State Specialists must develop their own evaluation instruments.
Many County Agents and State Specialists may not feel qualified or confident enough to develop their own instruments and/or to tabulate and report data.
Jeff

Evaluation Templates
Purse Strings and Heart Strings
Qualitative Data – Domain Analysis
Post-test Only
Retrospective Post-then-Pre
Quiz-like, Post-test Only
Audience Poll
Documentation Form
Jeff: We will go over some of these today, by topic area. Talk about qualitative data here. We wanted to briefly mention a few others.

What and Why?
Why do you want to gather data?
What types of programs do you provide?
What do you want to know?
What will you do with the data?
Who will you share data with?
What are your goals and objectives?
Jeff: Ask them why they want to gather data. Who is it for? What do they plan to do with it? These are all big questions we won't answer today. Today we're focusing on a simple method of gathering data that is adjustable to a variety of programs.

Things to Consider
Who is your audience?
Process Evaluation – internal use
Outcome Evaluation – external audience
How will you share the results?
Impact statement
News article
Journal article – be sure to check with IRB!
Other
Jeff: Begin with the end in mind. Is this to publicize your positive impact or to help improve your program? Process – how did we do? Outcome – did your learner/stakeholder/participant learn as a result of it?

Institutional Review Board (IRB)
A research oversight committee charged with ensuring that human subjects research is conducted in compliance with the applicable federal, state, and institutional policies and procedures.
Jennifer

When is IRB approval needed?
Required when using results for:
Research
You plan to generalize the information learned
A scholarly article or report
Not required for:
Evaluation
Internal use – program improvement, etc.
Reports to funders, stakeholders, media, impact statements, news articles, etc.
Jennifer: Hypothesis – you have a question you want to answer: if kids participate in this program, they'll have this result. Generalize – if you implement this program exactly the way we did, your kids will have these outcomes.

Types of Outcomes
Short-term: new knowledge, increased skills, changes in attitude or values
Intermediate: modified behavior
Long-term: improved condition, altered status
Jennifer: Examples – Short-term: gained or demonstrated knowledge, changed a belief, or say they plan to do something different (they now know exercise is important and they plan to do it). Intermediate: actually changed behavior (increased exercise). Long-term: they are healthier (lost weight, better blood pressure, etc.).

What can our surveys measure? Short-term outcomes:
Knowledge
  Increase in knowledge
  Demonstration of knowledge
Attitude or values (behavioral intent) – "As a result of the intervention, I plan to…"
Skills
Jennifer: Short-term, not long-term. But we can tie short-term outcomes to the research from Tufts showing that 4-H is beneficial. We are getting a snapshot of one effective day of programming. Develop one question from each of the above areas, share briefly, go on. The templates we will share with you address the first two bullets. Example of skills – use the rubric example.

Camp & EE Surveys
Let's take the survey! Let's look at the results!
Jennifer: Have them take the HOP survey with clickers. Show how results are saved.

Camp Surveys – Impact Examples
After participating in the 4-H Health is Our Pledge (HOP) class at 4-H Summer Camp, participants indicated they… (% of youth)
  Learned new things they can do to be healthier – 92%
  Have gotten more exercise at 4-H camp than they normally do – 80%
  Plan to drink fewer soft drinks and more water or milk in the future – 75%
Participants correctly identified best practices for developing a healthy lifestyle 83% of the time.
Because of participation in the Herpetology class at 4-H Summer Camp, participants indicated they… (% of youth)
  Learned new things about reptiles and amphibians – 84%
  Would like to learn more about herpetology and other science-related subjects – 88%
Jennifer
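Percentages like these come from simple counts of agreement on a post-test-only survey. As a rough illustration only (not one of the presenters' templates), the Python sketch below tabulates "% of youth who agreed" figures from a survey export; the file name and column names are hypothetical and would need to match your own spreadsheet.

```python
import csv

AGREE_VALUES = ("Yes", "Agree", "Strongly Agree")

def percent_agree(rows, question):
    """Percentage of respondents whose answer to `question` counts as agreement.
    Blank answers are skipped."""
    answers = [r[question].strip() for r in rows if r.get(question, "").strip()]
    if not answers:
        return 0
    agreed = sum(1 for a in answers if a in AGREE_VALUES)
    return round(100 * agreed / len(answers))

# Hypothetical file and column names -- adjust to match your own survey export.
with open("hop_camp_survey.csv", newline="") as f:
    rows = list(csv.DictReader(f))

for question in ("learned_healthier", "more_exercise", "plan_fewer_sodas"):
    print(f"{question}: {percent_agree(rows, question)}% of youth agreed")
```

The same counts can of course be done in a data tabulation spreadsheet; the point is only that an impact percentage is the share of respondents who chose an "agree" response.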

Camp & EE Surveys
What are we measuring? How can we share the data?
Other examples:
Herpetology – Survey
Tidelands – Survey & Data Tabulation Spreadsheet
Note: Ask what kinds of things we are measuring – knowledge, behavior, etc. What can we do with this data? Click through the Herpetology PowerPoint as an example. Hand out the Tidelands surveys.

Educational Club Meeting Surveys
Survey examples:
Career Awareness & Exploration: Government and Public Administration field
States of Matter: Physical vs. Chemical Changes
Data Tabulation Spreadsheets
Impact Statements
Different type – one or more constructs, combining questions
Jennifer: Show the survey. Show the data tabulation spreadsheet. Show examples of impact statements.

Project Achievement Surveys
Survey examples:
Junior/Senior Project Achievement
This survey also includes Likert questions and constructs
Data Tabulation Spreadsheets – in progress!
Impact Statements – in progress!
Jennifer: Show the survey. Show the data tabulation spreadsheet. Show examples of impact statements. Talk about constructs; introduce Likert questions.

A little more advanced…
Constructs: multiple questions designed to measure the same concept.
Helps:
  Identify poorly worded questions
  Validity and reliability (negatively worded questions)
Jeff: Building a construct means coming up with multiple questions designed to measure the same concept. If you have only one question and it is poorly worded, you might not get valid data because the question was confusing. So ask several questions targeting the same outcome/construct/learning objective.
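As a rough sketch of what "several questions targeting the same construct" looks like at tabulation time (not part of the presenters' templates), the example below reverse-codes a negatively worded item, averages three items into one construct score, and checks internal consistency with Cronbach's alpha. All item names and scores are made up.

```python
def reverse_code(score, scale_max=5):
    """Flip a negatively worded item on a 1..scale_max Likert scale."""
    return scale_max + 1 - score

def cronbach_alpha(items):
    """items: one list of scores per question, all the same length
    (one score per respondent). Returns the internal-consistency estimate."""
    k = len(items)
    n = len(items[0])
    def var(xs):
        mean = sum(xs) / len(xs)
        return sum((x - mean) ** 2 for x in xs) / (len(xs) - 1)
    totals = [sum(item[i] for item in items) for i in range(n)]
    return (k / (k - 1)) * (1 - sum(var(item) for item in items) / var(totals))

# Three made-up "confidence in public speaking" items, four respondents each.
q1 = [4, 5, 3, 4]
q2 = [5, 5, 4, 4]
q3 = [reverse_code(s) for s in [2, 1, 3, 2]]  # negatively worded item, flipped

construct = [(a + b + c) / 3 for a, b, c in zip(q1, q2, q3)]
print("Mean construct score:", round(sum(construct) / len(construct), 2))
print("Cronbach's alpha:", round(cronbach_alpha([q1, q2, q3]), 2))
```

Higher alpha suggests the items hang together as one construct; a confusing or poorly worded item tends to pull it down, which is one way to spot questions that need rewording.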

Examples of Constructs (draft stage)
Junior/Senior Project Achievement Survey:
JOBS – better prepared to apply for jobs
MASTERY – correctly identify best practices in public speaking
CONFIDENCE – indicate an increase in confidence in public speaking
GENEROSITY/BELONGING – Project Achievement gave me the opportunity to value and practice service for others and to be part of a safe and inclusive community
Jeff & Jennifer: Jeff – maybe explain the survey; Jennifer – explain the spreadsheet.

Likert Style Templates
Standard Likert Survey
Data Tabulation Spreadsheet
Retrospective Post-Then-Pre
Jeff: Look at the surveys. Show the templates – maybe just the retro. The real trick to creating these evaluations is coming up with a set of questions that can all use the same rating scale. It takes some time and effort. Ask for feedback from a colleague. Maybe develop two different surveys with slightly different wordings to see which one gets better responses.

Likert Style Templates – Sample Impact Statements
The percentage of program participants who agreed that they were confident in their ability to develop measurable objectives increased from 38% before to 100% after the program.
Overall, respondents indicated they agreed they are more likely to get involved in their community as a result of the GPK Leadership Adventure weekend.
Jeff
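A retrospective post-then-pre statement like the first one above can be tabulated in a few lines. This is only an illustrative sketch with made-up responses (a 1-5 agreement scale, where 4 or higher counts as "agree"), not the data tabulation spreadsheet itself.

```python
# Made-up retrospective post-then-pre responses: each person rates the same
# statement for "before the program" and "now" on a 1-5 agreement scale.
responses = [
    {"before": 2, "after": 5},
    {"before": 3, "after": 4},
    {"before": 4, "after": 5},
    {"before": 2, "after": 4},
]

def pct_agreeing(key, cutoff=4):
    """Percent of respondents rating at or above the 'Agree' cutoff."""
    agreed = sum(1 for r in responses if r[key] >= cutoff)
    return round(100 * agreed / len(responses))

print(f"Agreement rose from {pct_agreeing('before')}% before "
      f"to {pct_agreeing('after')}% after the program.")
```

Because both ratings come from the same single survey administered after the program, there is no pre-test to match up; that is what makes the retrospective post-then-pre format so easy to tabulate.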

Points to Consider when Writing Likert Style Questions/Statements
Consist of a statement and a rating scale.
Need to conform to one rating scale.
One data point per question.
Is what you're measuring important to the intended audience of the evaluation?
Good for older youth and adults.
Jeff: One minute for this slide. Present the template for the Standard Questionnaire. When using Likert style questions, you must conform to the format. Refer to the Sample Rating Scales handout. Standard Questionnaire versus Retrospective Post-then-Pre – both have their pros and cons.

Resources for Writing Questions
Verbs for Writing Measurable Objectives
Sample Rating Scales for Likert Questions

QUESTIONS??

To access templates, go to www.georgia4h.org/evaluationresources
For more information contact:
University of Georgia, State 4-H Office
Jeff Buckley, jbuckley@uga.edu
Jennifer Cantwell, jecantw@uga.edu
State 4-H Office – 706-542-4444

For more information contact:
University of Florida Extension
Whitney Cherry, cherryw@ufl.edu
Julie Dillard, juliepd@ufl.edu
Nicole Crawson, ncrawson@ufl.edu