Simple Survey Resources: Templates, Tabulation & Impact. Jeff Buckley, Jenna Daniel, & Casey Mull.

Overview – Situation – Need Being Met – Audience (Agents & Associates) – Basic Overview of the Training – How Can This Be Utilized in Your State?

Situation As a result of shrinking state budgets, the need to effectively document program impact is more critical than ever. Many states do not have a full-time Evaluation Specialist. County Agents and State Specialists must develop their own evaluation instruments. Many County Agents and State Specialists may not feel qualified or confident enough to develop their own instruments and/or to tabulate and report data.

Need Being Met – A 4-H program evaluation training was created – Implemented by Georgia 4-H State Specialists & CAES Faculty – Piloted at the 2012 Winter School with 4-H Agents & Associates

Simple Survey Training Training Format: – 90 minute session – Interactive – 4-H Agents and Associates – Participants created their own program evaluation

Simple Survey Training Why is it Important to Gather Data? – Who is your audience? – Process Evaluation – Internal Use – Outcome Evaluation – External Audience How will you share the results? – Impact Statement – News Article – Journal Article – Other

Simple Survey Training What Can a Survey Measure? – Knowledge – Attitude – Behavior Designing Measurable Objectives – Refer to the handout “Verbs for Writing Measurable Objectives.” – Measurable objectives will translate into more effective constructs and questions.

Simple Survey Training Consider… – What you want your participants to know. – How you want their attitude to change. – What sort of behavior changes you want to see. Creating A Construct – Pick two outcomes/constructs and develop three “questions” each, OR – Pick three outcomes/constructs and develop two “questions” each. – (The template has space for six “questions”.)

Simple Survey Training Creating A Construct – The food in the cafeteria was hot. – The dining area was clean and comfortable. – There were a variety of fruits and vegetables. – Beverage selections included healthy options. – Vegetables were not over-cooked. – The servers were on task and pleasant.

Simple Survey Training Likert Style Questions – These templates are designed for use with Likert style questions. – Pick the rating scale that works best for your objectives/program. – This is the most challenging part of developing your survey. Allow time. Ask for feedback.

Simple Survey Training Likert Style Questions – Each consists of a statement and a rating scale. – All items should conform to one rating scale. – Use measurable verbs that fit the appropriate cognitive domain. – Limit each question to one data point. – Is what you’re measuring important to the intended audience of the evaluation?

Simple Survey Training Instrument Templates – Traditional – Retrospective post-then-pre – 6 quantitative items – 1 qualitative item – Feedback on the program

Simple Survey Training Tabulation – Excel Spreadsheet – Directions – Worksheet for both instrument templates – “Locked” format
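
The locked Excel workbook handles tabulation for both templates; for readers who prefer a script, the same arithmetic can be sketched in a few lines of Python. This is a minimal illustration, not the training's actual workbook: the 4-point scale, item, and response data below are hypothetical, and it shows the two statistics the deck reports in impact statements (a mean with standard deviation, and the percentage of respondents whose retrospective post-then-pre rating increased).

```python
# Hypothetical sketch of tabulating one retrospective post-then-pre
# Likert item. Each tuple is one respondent's (before, after) rating
# on a 4-point scale: 1 = Strongly Disagree ... 4 = Strongly Agree.
from statistics import mean, stdev

responses = [(2, 4), (1, 3), (3, 3), (2, 3), (3, 4)]

after = [a for _, a in responses]

# Summary statistics for the "after" ratings, as reported in impact
# statements (e.g. "mean score of 3.1, SD = .994").
after_mean = round(mean(after), 2)
after_sd = round(stdev(after), 3)

# Percentage of respondents whose rating increased, i.e. who
# "indicated an increase in their confidence".
gained = sum(1 for before, a in responses if a > before)
pct_gain = round(100 * gained / len(responses), 1)

print(after_mean, after_sd, pct_gain)
```

With the sample data above, the script reports the post mean and standard deviation and the share of respondents showing a learning gain; swapping in a real column of survey responses is the only change needed.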

Now You’re Ready to Gather Data!

Results Presentations Conducted – Galaxy – Georgia 4-H Winter Conference – Georgia Association of 4-H Agents Conference – Southern Region Volunteer Conference

Results Impact – The percentage of program participants who stated that they could… – develop measurable objectives increased from 60% before to 100% after the program. – design evaluation instruments for their programs increased from 69% before to 91% after. – are likely to develop their own evaluation tools in the future increased from 46% to 89%. – A domain analysis of the qualitative responses revealed the following themes… – Participants were very glad to have the templates. – Participants plan to use the tools and knowledge they gained to gather data. – Participants plan to use the data they collect to develop higher quality impact statements.

Results Extension Agent Quotes & Impact Statements – With a mean score of 3.1 (SD = .994), overall respondents on Question 5 indicated they "agreed" they are more likely to get involved in their community as a result of the GPK Leadership Adventure weekend. – Retrospective post-then-pre: [In Question 2], 84.6% of respondents indicated an increase in their confidence concerning their ability to “design or construct a map of the distribution of major world languages”.

Simple Surveys in Florida 4-H Tabulation of Data – Dropping the data in the sheets is so simple and saves a ton of time. It is much easier than manually tabulating learning gains. – Data is easy to interpret from the sheets. How Data Has Been Used – Data has been used with stakeholders to secure program support and has been used in annual reports of accomplishment. – Data will be used in presentations at professional conferences to show outcomes and projected impacts of programming.

Training Evaluation Training Program Evaluation Exercise – Confidence in ability – Knowledge gain – Training evaluation

How Can You Utilize this Program? Resources: – State Meetings – Webinars – New Agent Training

Questions? For more information contact: Jeff Buckley, Jenna Daniel, & Casey Mull – University of Georgia, State 4-H Office –