Simple Survey Resources
Jeff Buckley, Jennifer Cantwell, Casey Mull, Whitney Cherry, Julie Dillard, Nicole Crawson

Topics we will cover…
- General Evaluation Overview
  - Why and what do we evaluate?
  - Outcomes → Constructs → Questions
- Review of Templates
- Create your own (hands-on)
- Tabulation
- Impact Statements

You will leave with…
- Resources for developing effective survey questions
- 3 survey templates
- Resources and tools for collecting and analyzing data
- Sample impact statements

Disclaimers
- Adapted from previous presentations by Jeff Buckley, Jenna Daniel, & Casey Mull.
- Resources developed in collaboration with Nick Fuhrman, Ph.D., Associate Professor & Graduate Coordinator, Dept. of Agricultural Leadership, Education, & Communication, UGA.
- Consult your state’s Evaluation Specialist for additional, more advanced guidance on program evaluation.
- We welcome your constructive feedback as we work to provide useful resources in the area of program development.

Why is it important?
- With shrinking state budgets, the need to effectively document program impacts is more critical than ever.
- Many states do not have a full-time Evaluation Specialist; County Agents and State Specialists must develop their own evaluation instruments.
- Many County Agents and State Specialists may not feel qualified or confident enough to develop their own instruments or to tabulate and report data.

Things to consider
- Who is your audience?
  - Process evaluation – internal use
  - Outcome evaluation – external audience
- How will you share the results?
  - Impact statement
  - News article
  - Journal article – be sure to check with IRB!
  - Other

Institutional Review Board (IRB)
A research oversight committee charged with ensuring that human subjects research is conducted in compliance with applicable federal, state, and institutional policies and procedures.

When is IRB approval needed?
Required when using results for:
- Research – you plan to generalize the information learned
- A scholarly article or report
Not required for:
- Evaluation – internal use, program improvement, etc.
- Reports to funders, stakeholders, or the media: impact statements, news articles, etc.

Important Terms
- Outcome – a benefit to participants, related to knowledge, skills, attitudes, behavior, condition, or status. We might also refer to this as a desired outcome, or the objective being measured via the survey.
- Construct – a grouping of questions that measures an outcome.
- Impact statement – a summary of a program outcome as measured by the survey.

Types of Outcomes
- Short-term – new knowledge, increased skills, changes in attitude or values
- Intermediate – modified behavior
- Long-term – improved condition, altered status

What can our surveys measure?
Short-term outcomes:
- Knowledge
  - Increase in knowledge
  - Demonstration of knowledge
- Attitude or values (behavioral intent)
  - “As a result of the intervention, I plan to…”
- Skills

Outcomes
Examples from your programs?

Measurable Outcomes
- Refer to the handout “Verbs for Writing Measurable Objectives.”
- Measurable objectives will translate into more effective constructs and questions.
- Constructs are sets of questions designed to measure a single idea.

Simple Example
The program teaches youth to make healthier food choices and to increase activity.
Short-term outcomes:
- Youth are able to identify healthy food choices.
- Youth plan to increase exercise and decrease screen time.

Creating a Construct: Don’t put all your eggs in one basket
1. Youth are able to identify healthy food choices:
   - I know more about what foods are healthy.
   - I can identify healthy food options.
   - I have learned new ways to eat to be healthier.
2. Youth plan to increase exercise and decrease screen time:
   - I learned new ideas for being active.
   - I plan to exercise more.
   - Decreasing screen time is a healthy choice.

Creating a Construct: Don’t put all your eggs in one basket
On the survey itself, the items from the two constructs are mixed together rather than presented as two blocks:
1. I know more about what foods are healthy.
2. I learned new ideas for being active.
3. I can identify healthy food options.
4. I have learned new ways to eat to be healthier.
5. I plan to exercise more.
6. Decreasing screen time is a healthy choice.
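A minimal Python sketch of this grouping and mixing, assuming the two example constructs above (the names `healthy_food_choices` and `activity_intentions` are illustrative only, not part of the templates):

```python
import random

# Hypothetical construct definitions for the healthy-living example.
# Each construct groups the items that measure a single outcome.
constructs = {
    "healthy_food_choices": [
        "I know more about what foods are healthy.",
        "I can identify healthy food options.",
        "I have learned new ways to eat to be healthier.",
    ],
    "activity_intentions": [
        "I learned new ideas for being active.",
        "I plan to exercise more.",
        "Decreasing screen time is a healthy choice.",
    ],
}

# On the printed survey, mix the items so no single outcome is
# measured by one block of adjacent questions.
items = [item for group in constructs.values() for item in group]
random.shuffle(items)

for number, item in enumerate(items, start=1):
    print(f"{number}. {item}")
```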

Evaluation Templates: Purse Strings and Heart Strings – Quantitative and Qualitative Data
- Post-test only
- Retrospective post-then-pre
- Quiz-like, post-test only → Audience Poll Documentation Form

Likert-Style Questions
- These templates are designed for use with Likert-style questions.
- Pick the rating scale that works best for your objectives and program (see handout).
- This is the most challenging part of developing your survey: allow time, and ask for feedback.

Points to Consider when Writing Likert-Style Questions/Statements
- Each item consists of a statement and a rating scale.
- All items need to conform to one rating scale.
- One data point per question (avoid double-barreled statements).
- Is what you’re measuring important to the intended audience of the evaluation?
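As a quick illustration, here is one way to code the responses to a single Likert item and report the share who agreed; the scale labels and responses below are hypothetical:

```python
# Responses to one Likert item, using a hypothetical five-point
# agreement scale; pick one scale and use it for every item.
responses = ["Agree", "Strongly Agree", "Neutral", "Agree", "Disagree"]

# Percent of respondents who agreed or strongly agreed.
agreed = sum(1 for r in responses if r in ("Agree", "Strongly Agree"))
print(f"{100 * agreed / len(responses):.0f}% agreed or strongly agreed")
```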

Activity – Build Your Own Survey
- Pick a survey template.
- Develop 6 Likert-style questions:
  - 2 outcomes (constructs) with 3 questions each, or
  - 3 outcomes (constructs) with 2 questions each
- Consider:
  - What do you want your participants to know?
  - How do you want their attitudes to change?
  - What sort of behavior changes do you want to see?

Process Activity
- How did it go? Was it difficult?
- What did you learn?
- Do you think this is a helpful template for producing a usable survey for certain programs?

Demonstration of Knowledge

Now, gather your data!
- Hand out your questionnaires.
- Collect them.
- Open your tabulation template.

Analyzing, Tabulating, and Reporting Impact

[Screenshot: tabulation spreadsheet template]
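A minimal sketch of what the tabulation template does behind the scenes, assuming hypothetical question names and responses (the actual template is a spreadsheet, not code):

```python
from collections import Counter

# Hypothetical raw data: one row per returned questionnaire,
# one column per Likert item.
rows = [
    {"Q1": "Agree", "Q2": "Strongly Agree", "Q3": "Agree"},
    {"Q1": "Strongly Agree", "Q2": "Agree", "Q3": "Neutral"},
    {"Q1": "Agree", "Q2": "Disagree", "Q3": "Agree"},
]

# Count responses per question, the same summary a tabulation
# spreadsheet produces.
for question in ("Q1", "Q2", "Q3"):
    counts = Counter(row[question] for row in rows)
    total = sum(counts.values())
    summary = ", ".join(
        f"{label}: {n} ({100 * n / total:.0f}%)"
        for label, n in counts.most_common()
    )
    print(f"{question}: {summary}")
```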

Sample Impact Statements
Overall, respondents agreed that they are more likely to get involved in their community as a result of the GPK Leadership Adventure weekend.

Sample Impact Statements
In general, respondents disagreed that they would maintain their involvement in the GPK program. 70% of respondents agreed or strongly agreed that they had made friends with other kids in the GPK program.

Sample Impact Statements
For the retrospective post-then-pre questionnaire, you could use responses like: 84.6% of respondents indicated an increase in their confidence in their ability to design or construct a map of the distribution of major world languages.

Sample Impact Statements The percentage of program participants who agreed that they were confident in their ability to develop measurable objectives increased from 38% before to 100% after the program.
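A sketch of how both kinds of retrospective post-then-pre figures could be computed, using hypothetical (before, after) ratings on a 1–5 agreement scale:

```python
# Hypothetical retrospective post-then-pre ratings: each pair is one
# respondent's (before, after) confidence on a 1-5 scale.
ratings = [(2, 4), (3, 5), (4, 4), (1, 3), (3, 4)]

# Percent of respondents whose "after" rating exceeds their "before"
# rating, as in the 84.6% statement above.
increased = sum(1 for before, after in ratings if after > before)
print(f"{100 * increased / len(ratings):.0f}% indicated an increase")

# Percent who agreed (rated 4 or 5) before vs. after, as in the
# 38%-to-100% statement above.
agreed_before = sum(1 for before, _ in ratings if before >= 4)
agreed_after = sum(1 for _, after in ratings if after >= 4)
print(f"agreed before: {100 * agreed_before / len(ratings):.0f}%, "
      f"after: {100 * agreed_after / len(ratings):.0f}%")
```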

Sample Impact Statements
Outcomes with one question per construct:
- 68% of students correctly identified the United States government as the largest employer in the world.
- 72% of students stated they thought about different careers they might like when they are older.

Sample Impact Statements
Outcomes with multiple questions per construct:
- 92% of students correctly matched at least 3 of the Government/Public Administration jobs to the appropriate department.
- 91% of students correctly identified at least 3 of the government departments with the type of assistance provided.
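A sketch of how a "matched at least 3" figure could be tallied, using hypothetical quiz scores:

```python
# Hypothetical matching-quiz results: number of correct matches
# (out of 5) for each student.
correct_counts = [5, 3, 4, 2, 5, 3, 4, 4, 1, 5]

threshold = 3  # report "at least 3" correct matches
passed = sum(1 for n in correct_counts if n >= threshold)
print(f"{100 * passed / len(correct_counts):.0f}% of students matched "
      f"at least {threshold} items correctly")
```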

Simple Surveys in Florida 4-H

To Access Templates, go to…
For more information, contact:
University of Georgia, State 4-H Office – Jeff Buckley, Jennifer Cantwell

For more information, contact:
University of Florida Extension – Whitney Cherry, Julie Dillard, Nicole Crawson

Questions or suggestions for improvement?
Thank you for coming.