The Process of Evaluation: Program Evaluation and Quality Assurance, HPR 322, Chapter 13


Reasons to evaluate - from participants' standpoint
- Participant satisfaction
- Measures changes in participant leisure behavior
- Solicits participant input
- Encourages participant support
- Establishes communication between participant and organization
- Participant recommendations act as a 'needs assessment'

Reasons to evaluate - from programmer's standpoint
- Promotes the relationship between recreation leader and participants
- Enables programmers to develop sensitivity to participants
- Enables programmers to determine program design effectiveness
- Reveals the need for program improvements and enhancements

Reasons to evaluate - from organization's standpoint
- Means of linking program performance (success or lack thereof) to budgetary allocations
- May provide specific, measurable objectives (more specific than organization mission/vision statement goals) - this depends on how the evaluation is constructed
- Helps to determine program priorities
- Assists with quality control of service delivery

Terminology relating to evaluation
- Formative - during the programming; frequency usually predetermined
- Summative - at the end of a program
- Assessment - determining the value of a program relative to the overall leisure delivery system; also refers to measures of participants (especially in therapeutic recreation)
- Measurements - ways to obtain quantitative or qualitative data
- Standards - statements of desirable practice or performance
- Evaluative research - testing of hypotheses to examine effect or change

Types of evaluative questions
- Questions relating to goals (what goals were chosen; what alternative goals might have been chosen)
- Questions relating to strategies (similar questions)
- Questions relating to program elements (design, implementation)
- Questions relating to results (results, long-term effects, etc.)

The 5 P's of evaluation
- Participants
- Personnel
- Place
- Policies/administration
- Program

Difficulties in evaluation
- Unsubstantiated claims, including sweeping generalizations
- Too much subjectivity (participants had a 'good time')
- Careless or no data collection; trying to gather data after the fact
- Bias anywhere in the process
- Presenting or tracking only some of the data
- Failing to communicate the point of the process to all who are involved

Evaluation vs. Research

Evaluation:
- Concern is primarily program quality and attainment of goals
- Used to help plan or alter programming - results are specific to the program
- Priority placed on participants' opinions
- Subjective/objective

Research:
- Concern is proving or disproving a hypothesis
- Used to determine the success of an experiment or intervention - results can be generalized
- Priority placed more on changes in participants
- Mostly objective

Quality assurance/quality control
- There is a lot of concern about 'quality'
- Applies to programming, programmers, experience, facilities, etc.
- What makes a leisure experience 'high quality'? What makes one 'low quality'?
- Quality is a subjective term, so objective measures must be devised

Ways to gather evaluation information
- Written (mail or in person), telephone, in-person interview, online, group, or a combination (call ahead and mail)
- Cost and return percentage vary by type; sometimes incentives are offered to get responses
- Items can be open-ended, multiple choice, Likert (strongly disagree - strongly agree), or a combination
- My experience: easiest when people are there (pass out surveys during the last program meeting). Next best is requiring completion for a course grade!
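The cost-and-return point above can be sketched as a quick tally. This is a minimal illustration only; the method names and counts below are invented, not taken from any real survey.

```python
# Hypothetical counts of surveys distributed and returned per method.
distributed = {"in-person": 40, "mail": 120, "online": 200}
returned = {"in-person": 36, "mail": 30, "online": 58}

def response_rate(sent: int, back: int) -> float:
    """Return the response rate as a percentage, rounded to one decimal."""
    return round(100 * back / sent, 1)

# Compute a rate for each collection method.
rates = {m: response_rate(distributed[m], returned[m]) for m in distributed}
# Surveys handed out in person at the last program meeting typically
# come back at a much higher rate than mailed or online ones.
```

Comparing the resulting rates side by side makes it easy to justify (or rethink) the cost of a given collection method.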

Developing a program evaluation instrument
- Evaluate participants' feelings about the program
- Determine whether you met program goals/objectives
- You could write an entire instrument to measure each of these
- Include some elements of each in your assessment instrument
- The needs assessment and program goals/objectives provide a starting point - they may supply most of what you need to write items

What types of items are most effective?
- Always provide some place for comments - otherwise it looks like you don't care
- Open-ended items take longer to complete and may result in incomplete responses
- Choices are good (multiple choice, Likert) when possible. (Before you use Likert scales, think about what you will do with your results; you may not need to know the 'degree' of feeling)

A little more on 'Likert'
- Most are 5-point scales - from strongly disagree to strongly agree (with 'neutral' in the middle)
- Developed by Rensis Likert for his thesis in the 1930s
- Agreement/disagreement is important, as is the degree of agreement/disagreement (how important is the item to you, or how strongly do you feel about it). This level of information may be important in research but not for evaluating how well your program was received
- A 5-point poor-to-excellent scale may be better for this type of program assessment
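Scoring a 5-point Likert item usually means mapping the labels to 1-5 and averaging. A minimal sketch, with hypothetical labels and responses (not from the slides):

```python
# Conventional 1-5 mapping for a 5-point agreement scale (hypothetical labels).
SCALE = {
    "strongly disagree": 1,
    "disagree": 2,
    "neutral": 3,
    "agree": 4,
    "strongly agree": 5,
}

# Invented responses from five participants to a single item.
responses = ["agree", "strongly agree", "neutral", "agree", "disagree"]

# Convert labels to numbers and average them.
scores = [SCALE[r] for r in responses]
mean_score = sum(scores) / len(scores)  # a value on the 1-5 scale
```

A mean near 5 signals strong agreement; whether you actually need that level of granularity for a simple program evaluation is exactly the question the slide raises.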

How should your evaluation look?
- Overly long evaluations may be intimidating
- Evaluations that are too short may not give you enough information
- While consistency is good, forms probably should evolve over time
- Be prepared to use (or at least respond to) the information you receive

Useful information
- As with the needs assessment - name, address, age, etc. You should be able to gather personal information from participants more easily
- Identify the program being assessed. If multiple times/sections are offered, identify which one
- Rating/ranking/opinion of the program, instructor, equipment, staff, facility, etc.
- Recommendations/suggestions about the above
- Typical use patterns (are the opinions of regular users more important?)

More useful information
- Other areas of interest or possible participation - existing programs or recommendations
- Interests/participation of other family members
- "Would you like to be contacted regarding this?" or "May we contact you regarding this?"
- Add to mailing/e-mail list
- Anything else that might help you with future programming
- Information for grant/fund providers

Back to my example
- This is a new program, so everyone is a first-time participant (no need to ask that). I could ask whether individuals have done similar programs
- I have some funding, so I need to be certain I gather information for the funders (they should have told me what they wanted to know when they awarded the funding)
- I know that I would like to continue, so I can ask about possible future participation
- I needed financial help to keep from operating at a loss, but I may not receive it next year. I can ask whether participants would be willing to pay more, or I can phrase a question about value for their money

More about my water exercise program
- I probably won't ask a lot about the facility - I should be gathering that data from pool attendees. However, I may want input from people who had not attended the pool regularly before my class
- Can I get this completed during the last meeting? It depends on the usual behavior of participants (do they shower and change, hang around, or just get out of the pool and leave?). Observing their behavior during regular sessions should give me some idea of how and when to administer the evaluation

Program Evaluation Assignment
- Participant information - name, address, etc.
- 10 items evaluating the program - facility, instruction, type/format, etc. You can re-use needs assessment items - rephrase if necessary (not 'What time is convenient?' but 'Was the scheduled time for this program convenient?')
- At least one item relating to one of the objectives you outlined
- Tell me your plan for getting the evaluations completed (during classes, mail, phone and mail, going door to door, e-mail, etc.)

Assignment
- One copy of the blank evaluation form
- Directions for completion, if necessary (if you are mailing, include directions about how/where to mail it back; same for e-mail. You can make up addresses, e-mail addresses, and phone numbers)
- An indication of your plan for getting as many evaluations back as possible
- Due November 17th with the complete program