Nathan Lindsay & Larry Bunce February 19, 2014


"Developing Surveys to Measure Student Satisfaction and Learning Outcomes"
Nathan Lindsay & Larry Bunce, February 19, 2014
Adapted from presentations given by Jennie Gambach, Assistant Director of Assessment Programs at Campus Labs; Annemieke Rice, Director of Campus Success for Campus Labs; and Amy Feder, Assessment Consultant at Campus Labs.
www.campuslabs.com/blog @CampusLabsCo #labgab Like us on Facebook!

There are many types of surveys to consider:
- Satisfaction surveys
- Learning outcome surveys
- Needs assessments
- Exit surveys
- Alumni surveys
- User surveys
- Non-user surveys
- Student/faculty/staff/general public surveys
- Other?

Steps in survey design (a cycle that returns to the first step):
1. Outline topic(s) and draft items
2. Write and edit items
3. Choose response formats
4. Determine item sequence
5. Determine physical characteristics of the survey
6. Review and revise the survey
7. Pilot test the survey and revise

D: Determine your purpose

Begin with the end in mind…
- What do you want/need to show?
- Why do you need to show it?
- Who is the source of your data?
- How will you use the data?
- Who will need to see the results?

The purpose of this assessment is…
- To better understand the needs of our veteran students, and how the new Veteran's Center can meet them
- To evaluate whether students achieved the stated learning outcomes of our workshop, and what additional training needs they have
- To demonstrate to stakeholders the impact that living in the residence halls has on student development
- To assess student awareness of services in order to develop our marketing and communications plan

D: Determine your purpose
E: Examine past assessments

Examine past assessments
"Those who cannot remember the past are condemned to repeat it." (George Santayana, 1905)
Did you use the data? If not, what kept you from examining it?
If you used the data:
- What was useful?
- Was any of the data difficult to analyze?
- Were there questions you wished you had asked?
- Did any question wording make you unsure of what the data meant?
- What feedback did you receive from those who participated?

D: Determine your purpose
E: Examine past assessments
S: Select the appropriate method

- What type of data do you need?
- Has someone already collected the information you are looking to gather?
- (How) can you access the existing data?
- (How) can you use the existing data?
- Is there potential for collaboration with another individual, program, or department?
- How can you best collect this data?

Select an appropriate method
- Indirect vs. direct
- Quantitative vs. qualitative
- Formative vs. summative
- Population vs. sample

Quantitative
- Focus on numbers/numeric values
- Who, what, where, when
- Match with outcomes about knowledge and comprehension (define, classify, recall, recognize)
- Allows for measurement of variables
- Uses statistical data analysis
- May be generalized to a greater population with larger samples
- Easily replicated

Qualitative
- Focus on text/narrative from respondents
- Why, how
- Match with outcomes about application, analysis, synthesis, and evaluation
- Seeks to explain and understand
- Ability to capture "elusive" evidence of student learning and development

Sampling
Population: the whole group
- Example: the survey goes to the entire campus
- If surveying the entire campus: use sparingly and coordinate with Institutional Research
Sample: a subsection of that group
- Example: the survey goes to 30% of campus

Sampling is a way to obtain information about a large group by examining a smaller selection (the sample) of group members. If the sampling is conducted correctly, the results will be representative of the group as a whole; it is therefore just as effective as sending your assessment to the whole group, with the added benefit of limiting the number of surveys individual students receive.

Sampling strategies
There are different types of samples you can pull:
- Simple Random Sample: gives everyone in the sampling population an equal chance of selection; a probability sample. Example: asking your registrar's office to pull a random sample.
- Stratified Random Sample: breaks the total sample into subpopulations (strata) and then selects randomly from each stratum. Example: splitting a demographic file by academic year and randomly choosing from each year (a strategy often used for our Consortium projects). This can be done by any number of demographic areas depending on your needs.
- Systematic Sample: Example: from a list of 300 people, start with the 5th person and take every 3rd person after that.

The type of sample you need depends on your particular assessment project. Your assessment consultant at Campus Labs is happy to give you advice on sampling, and there are often in-house experts in research-oriented academic departments or in Institutional Research who can support you if you need assistance in this area.
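The three strategies above can be sketched in a few lines of Python. This is an illustrative sketch, not part of any survey tool mentioned here; the function and field names are invented, and the systematic example mirrors the slide's "300 people, start on the 5th, every 3rd" case.

```python
import random

def simple_random_sample(population, n, seed=None):
    """Every member has an equal chance of selection (a probability sample)."""
    return random.Random(seed).sample(population, n)

def stratified_random_sample(population, stratum_of, n_per_stratum, seed=None):
    """Split the population into strata, then sample randomly within each."""
    rng = random.Random(seed)
    strata = {}
    for member in population:
        strata.setdefault(stratum_of(member), []).append(member)
    sample = []
    for members in strata.values():
        sample.extend(rng.sample(members, min(n_per_stratum, len(members))))
    return sample

def systematic_sample(population, start, step):
    """Start at a fixed 1-based position, then take every `step`-th member."""
    return population[start - 1::step]

# The slide's example: a list of 300 people, start on the 5th, every 3rd after.
people = [f"person_{i}" for i in range(1, 301)]
chosen = systematic_sample(people, start=5, step=3)
```

A registrar's office would typically do the random draw for you; the point of the sketch is only that each strategy is a different, simple selection rule.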

Sample suggestions (based on a 5% margin of error; source: Assessing Student Learning by Linda Suskie):

Number of students in population    Suggested random sample size
1,000                               278
500                                 217
350                                 184
200                                 132
100                                 80
50                                  44

Sample size is the desired number of respondents, NOT the number of individuals invited to participate. As we all know, if you email a survey to 278 students, it's not very likely that all 278 will respond. For this reason, if your population is relatively small, sampling may be less of a factor, and the number of participants you give the survey to will vary depending on the method of administration. For example, if you have a population of 100 student leaders and you give them an assessment in person at the first student leadership conference of the year, which they are all required to attend, just giving the assessment to 80 individuals might work. On the other hand, if you are emailing the survey to them after the fact, and you know your email survey response rate is usually only around 50%, you are probably going to send the survey to everyone in that population.
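The suggested sizes in the table are consistent (to within one respondent) with Cochran's sample-size formula plus a finite-population correction, assuming a 95% confidence level (z = 1.96) and maximum variability (p = 0.5). That formula is my inference from the numbers; Suskie's table does not state it explicitly.

```python
import math

def suggested_sample_size(population, margin=0.05, z=1.96, p=0.5):
    """Cochran's n0 = z^2 * p * (1 - p) / e^2, corrected for a finite
    population as n = n0 / (1 + (n0 - 1) / N), rounded up.
    Values agree with the table above to within one respondent."""
    n0 = (z ** 2) * p * (1 - p) / (margin ** 2)
    return math.ceil(n0 / (1 + (n0 - 1) / population))
```

For example, `suggested_sample_size(1000)` yields 278 and `suggested_sample_size(100)` yields 80, matching the table; a wider margin of error shrinks the required sample.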

Direct methods: any process employed to gather data which requires subjects to display their knowledge, behavior, or thought processes.
- Example: "Where on campus would you go, or who would you consult with, if you had questions about which courses to register for in the fall?"

Indirect methods: any process employed to gather data which asks subjects to reflect upon their knowledge, behaviors, or thought processes.
- Example: "I know where to go on campus if I have questions about which courses to register for in the fall." (Strongly agree / Moderately agree / Neither agree nor disagree / Moderately disagree / Strongly disagree)

Formative
- Conducted during the program
- Purpose is to provide feedback
- Used to shape, modify, or improve the program

Summative
- Conducted after the program
- Makes a judgment on quality or worth, or compares to a standard
- Can be incorporated into future plans

(Interaction: In our example, what do we need? Talk about pros/cons.)

Is a survey right for you?
Pros:
- Can include large numbers of respondents
- Relatively fast and easy to collect data
- Lots of resources available
- Requires minimal resources
- Fast to analyze
- Good for surface-level or basic data
Cons:
- Survey fatigue and low response rates
- Non-response bias
- Limited in the types of questions that can be asked
- Lacks depth in data
- Requires skill in both designing questions and analyzing data properly

Focus Groups
Group discussions where the facilitator supplies the topics and monitors the discussion. The purpose is to gather information about a specific (or focused) topic in a group environment, allowing for discussion and interaction by participants.
- Similar to interviews, but use them when group interaction will contribute to a richer conversation
- Really about getting the group talking amongst themselves

Is a focus group right for you?
Pros:
- Helps to understand perceptions, beliefs, and thought processes
- Small number of participants
- Encourages group interaction and building upon ideas
- Responsive in nature
- Relatively low costs involved
Cons:
- Recruiting participants (think of times/places)
- Data collection and analysis take time
- Data is only as good as the facilitator
- Beware of bias in analysis and reporting
- Meant to tell a story; may not help if numbers are needed
- Data is not meant to be generalizable

Quick, 1-Minute Assessments
- On a notecard, write a real-world example of how you can apply what you learned.
- Pass an envelope containing notecards with quiz questions. Students pick one, have 60 seconds to answer, and pass it along.
- At the end of a workshop, ask students to write down 1 thing they learned and 1 lingering question.

Is a quick assessment right for you?
Pros:
- Provides a quick summary of takeaways from the student perspective
- Quickly identifies areas of weakness and strength for formative assessment
- Can track changes over time (short-term)
- Non-verbal (provides classroom feedback from all students)
- Captures the student voice
- Short time commitment
- Provides immediate feedback
Cons:
- Non-responsive
- Short (so you may lose specifics)
- Sometimes hard to interpret
- Need very specific prompts in order to get "good" data
- Must plan logistics ahead of time and leave time during the program/course
- May need to be collected over time

Mixed Methods
- Look for the same results across multiple data collections
- Build upon or relate results from one assessment to another
- Use data from one method (e.g., a survey) to inform another method (e.g., a focus group)
- Able to increase the scope, number, and types of questions

D: Determine your purpose
E: Examine past assessments
S: Select the appropriate method
I: Identify ethical/logistical considerations

Identify ethical/logistical considerations
- Do you have the necessary resources and brain power?
- Do you need to go through the IRB?
- Do you need to identify respondents for follow-up, merging of data, tracking of cohorts, or pre/post analysis?
- Do you need to include demographic questions to drill down or separate data?
- Who needs to be involved at the planning stage to avoid problems when results are in?
- Does anyone need to approve the project?
- Are there any political issues to be aware of?

D: Determine your purpose
E: Examine past assessments
S: Select the appropriate method
I: Identify ethical/logistical considerations
G: Generate the best question and answer format

What to consider
- Scales that match the question text
- Mutually exclusive answer choices
- Exhaustive answer choices
- Neutral/not applicable/non-response options, e.g.: Choose not to respond; Don't know; Not applicable; Unable to judge; No opinion; Neutral; Neither ___ nor ___

Pairing Question Text with Answer Choices
Question text should be compatible with the answer choices.
- e.g., "How satisfied were you with the following?" does not pair with an agreement scale (Strongly agree / Somewhat agree / Somewhat disagree / Strongly disagree)
- e.g., "Did you enjoy the Black History Month speaker?" does not pair with a rating scale (Excellent / Good / Fair / Poor)
Grid items such as "Meals at the conference," "Location of the conference," and "Date of the conference" likewise need a scale that fits the question stem.

Mutually Exclusive Answer Choices
Response options should never overlap.
- e.g., "How many hours per week do you work?" with choices 0-10, 10-20, 20-30, 30-40 (10, 20, and 30 each fall into two ranges)
Response options should exist independently of one another.
- e.g., "Which of the following statements describes your peer mentor?" He/she is helpful and supportive; He/she is difficult to get a hold of
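Overlap and exhaustiveness in numeric answer choices can be checked mechanically. The helper below is a hypothetical illustration (not part of any survey tool mentioned here): it flags boundary values that appear in two ranges and gaps that leave some respondents with no valid choice.

```python
def check_numeric_choices(ranges):
    """Given answer choices as (low, high) pairs sorted by low bound,
    return (overlaps, gaps): integer boundaries shared by two ranges,
    and integer spans not covered by any choice."""
    overlaps, gaps = [], []
    for (lo1, hi1), (lo2, hi2) in zip(ranges, ranges[1:]):
        if lo2 <= hi1:
            overlaps.append(lo2)            # e.g., "10" sits in both 0-10 and 10-20
        elif lo2 > hi1 + 1:
            gaps.append((hi1 + 1, lo2 - 1)) # values no respondent can select
    return overlaps, gaps

# The slide's overlapping example: 0-10, 10-20, 20-30, 30-40
overlaps, gaps = check_numeric_choices([(0, 10), (10, 20), (20, 30), (30, 40)])
```

Rewriting the choices as 0-10, 11-20, 21-30, 31-40 makes both lists empty, which is the property the slide is asking for.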

Exhaustive Answer Choices
Respondents should always be able to choose an answer.
- e.g., "How often do you use the University website?" Daily / 2-3 times a week / Weekly / Monthly (a respondent who uses it rarely or never has nothing to select)

Non-response options
Always consider a non-response option: Choose not to respond; Don't know; Not applicable; Unable to judge; No opinion; Neutral; Neither ___ nor ___
Customize the non-response option when possible.
- e.g., "How would you rate the leadership session?" Excellent / Good / Fair / Poor / Did not attend

Pitfalls to avoid
- Socially desirable responding: answering based on social norms. Can never be eliminated; consider sensitive topics like race, drug and alcohol use, sexual activity, and other areas with clear social expectations.
- Leading questions: suggesting there is a correct answer. e.g., "Why would it be good to eliminate smoking on campus?"
- Double-barreled questions: asking more than one question at once. e.g., "What were the strengths and weaknesses of orientation?"
- Double negatives: negative phrasing that makes responding difficult. e.g., "I do not feel welcome in my residence hall."

Response Formats
Open-ended responses:
- Free response (text)
- Numeric
- Yes/No with "please explain"
Types of multiple choice responses:
- Yes/No
- Single response
- Multiple response (e.g., check all that apply, select 3)
- Ranking
- Scales

Yes/No
When to use:
- There is no response between "Yes" and "No." e.g., "Have you ever lived on campus?"
- You consciously want to force a choice even if other options might exist. e.g., "Would you visit the Health Center again?"
When not to use:
- There could be a range of responses. e.g., "Was the staff meeting helpful?"

Single response
When to use:
- All respondents would have only one response. e.g., "What is your class year?"
- You consciously want to force only one response. e.g., "What is the most important factor for improving the Rec Center?"
When not to use:
- More than one response could apply to respondents. e.g., "Why didn't you talk to your RA about your concern?"

Multiple response
Options: "Check all that apply" or "Select (N)"
When to use:
- More than one answer choice might be applicable. e.g., "How did you hear about the Cultural Dinner?" (Check all that apply)
- You want to limit/force a certain number of responses. e.g., "What were your primary reasons for attending?" (Select up to 3)
When not to use:
- It's important for respondents to be associated with only one response. e.g., "What is your race/ethnicity?"

Ranking
When to use:
- You want to see the importance of items relative to one another. e.g., "Please rank how important the following amenities are to you in your residence (1 = most important)."
- You are prepared to do the analysis and interpretation!
When not to use:
- You want to see the individual importance of each item. e.g., "How important are the following amenities to you?"

Scales
When to use:
- You want to capture a range of responses. e.g., "How satisfied were you with your meeting?"
- You would like statistics. e.g., 4 = strongly agree, 3 = agree, 2 = disagree, 1 = strongly disagree
When not to use:
- The question is truly a Yes/No question. e.g., "My mother has a college degree."

Scales
Bipolar: positive and negative poles, with or without a midpoint. e.g., Very safe / Somewhat safe / Somewhat unsafe / Very unsafe
Unipolar: no negative pole. e.g., A great deal / Considerably / Moderately / Slightly / Not at all
Consider:
- Number of points
- Inclusion of a neutral option
- Whether labels are needed
- Order (e.g., 1, 2, 3, 4, 5 or 5, 4, 3, 2, 1)

Recommended Scales

D: Determine your purpose
E: Examine past assessments
S: Select the appropriate method
I: Identify ethical/logistical considerations
G: Generate the best question and answer format
N: Note the purpose for each data point

Note the reason for each data point
- Jot the purpose in a bubble next to each question
- Compare against your stated purpose to identify gaps
- Look for overlap
- Eliminate "nice to know" questions
- Helps with ordering
- Retain the notes for the analysis step

DATA COLLECTION METHODS
After thinking through these pre-considerations, the next step is choosing how to collect the data.

Paper surveys
Things to consider:
- Captive audience
- Administrator available for questions
- No technology issues or benefits
- Data entry necessary

With all the technology available to us, it's easy to overlook paper surveys, but sometimes they can be a really great option. If you have a captive audience, perhaps students taking your class, employers at a career fair, or students attending a workshop, passing out a paper survey can really raise your response rate. Other benefits are that you don't need to worry about technology issues, and you will often be there if there are any questions. In terms of cons, it's not as eco-friendly, and the big detriment is that data entry will be necessary; however, we do try to make this as easy as possible for you in the Baseline system.

Web surveys
Things to consider:
- No data entry
- Technology issues and benefits
- Immediate results
- Can be anonymous or identified
- Not a captive audience

As opposed to paper or mobile surveys, you don't necessarily have a captive audience, but you get the benefit of having the data automatically entered into the system, and responses can be either anonymous or identified, as we discussed earlier.

Data collection methods: pros and cons

Web
- Pros: no data entry; accuracy is excellent; technology benefits (e.g., display rules, required questions); immediate results; can be anonymous
- Cons: audience is not usually captive; possible misinterpretation (can't ask questions); technology issues; response sample may be unrepresentative

Mobile
- Pros: accuracy is good; captive audience; administrator is available for questions
- Cons: limited formatting; anonymity is questionable

Paper
- Pros: no technology issues
- Cons: no benefits of technology; accuracy can be compromised; data entry necessary

SURVEY FATIGUE
So let's talk about survey fatigue, because everything I've said about survey administration so far doesn't mean much if no one agrees to take your survey.

General information
- Survey response rates have been falling: it is difficult to contact people, and refusals to participate are increasing
- Strategies for correcting low response rates: weight the data for non-response; implement strategies to increase response rates

We know that survey response rates have been falling throughout the past decades. There are strategies for correcting low response rates after data has been collected, such as weighting the data for non-response, but that requires some sophisticated statistical analysis, so it's generally better to implement strategies that increase responses in the first place.
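As a sketch of what "weighting the data for non-response" means, here is a simplified post-stratification example. The group names and counts are invented, and real adjustments are more involved (multiple variables, trimming extreme weights), so treat this only as an illustration of the idea:

```python
def nonresponse_weights(population_counts, respondent_counts):
    """Post-stratification: weight each respondent so the weighted share
    of each group among respondents matches its share of the population.
    weight = (population share of group) / (respondent share of group)"""
    pop_total = sum(population_counts.values())
    resp_total = sum(respondent_counts.values())
    return {
        group: (population_counts[group] / pop_total)
               / (respondent_counts[group] / resp_total)
        for group in respondent_counts
    }

# Invented example: women are 60% of campus but 75% of respondents, so
# each female respondent counts a little less and each male a little more.
weights = nonresponse_weights({"women": 600, "men": 400},
                              {"women": 75, "men": 25})
```

After weighting, the 100 respondents contribute the same group shares as the population (60/40), which is the correction the slide alludes to.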

Non-response may not be random
A correlation exists between demographic characteristics and survey response. Higher response has been found among certain sub-populations:
- Women
- Caucasians
- Students of high academic ability
- Students living on campus
- Math or science majors

The research here is inconsistent, but some studies do suggest a correlation between demographic characteristics and survey response.

IMPROVING RESPONSE RATES
So, from what we know of the research and theories of survey response, what can we do to improve response rates?

Specific techniques
- Survey length
- Preannouncement
- Invitation text
- Reminders
- Timing of administration
- Incentives
- Confidentiality statements
- Salience
- Request for help
- Sponsorship
- Deadlines

We'll go through a number of these techniques, from moderating survey length to including deadlines.

Survey length
Greater attrition occurs after about 22 questions or 13 minutes.
What to consider:
- Excluding "nice to know" questions
- Eliminating what you already know
- Outlining how results will be used
- Number of open-ended questions
- Number of required questions

As a caveat, this is very dependent on the individual assessment and campus in question, but looking holistically, research has shown greater attrition around the 22-question or 13-minute mark. This is an either/or type of scenario: a 22-question survey that is half open-ended questions could take someone 30 minutes to complete, and you will experience greater attrition. If you are looking to cut down the length of your survey, a really good exercise is to go through each question one by one and outline how you plan to use the results, so that you can cut the questions that are non-essential. You can also limit the number of open-ended questions and the number of required questions, if it is more important for you to have a student get through to the end, even if it means they skip a couple of questions.

Invitations
- Importance/purpose
- Relevancy to the respondent
- Request for help
- How and by whom results are used
- How long it will take to respond
- Deadline
- Incentives/compensation
- Contact information

Timing of contact/administration
- Avoid busy times and holidays
- Send an email preannouncement 2-3 days prior to the survey mailing
- The first half of the semester/term may be better if you are surveying in an academic environment

In general it is best to avoid the extremely busy times, and likewise the slower holiday times when students might not be on campus or checking their email. We have also found that the first half of the semester is usually more effective in the academic environment, which makes sense, because that is before things get too hectic. This is where it is really important to know your own campus culture, however, because the best time to send surveys does vary; for example, we know of one campus where sending assessments in the last week after finals, but before graduation, has worked pretty well.

Piloting
1. Take the survey as if you were a respondent
2. Seek reviews from colleagues with no prior knowledge of it
3. Administer it to a sample of the actual population being studied, using:
- A focus group
- Questions at the end of the survey
- Observation

Reliability & Validity
Reliability: yielding the same results repeatedly
- Test/re-test: consistency over time
- Inter-rater: consistency between people
Validity: accurately measuring a concept
- Internal: confidence that the results are due to the independent variable
- External: the results can be generalized
- Face validity: does this seem like a good measure?
If a survey is valid, it is almost always reliable!

USING THE SURVEY RESULTS

PR or advertising campaign

Results in survey invitations

Some Final Advice
- Google your area to see what other surveys have been conducted
- Contact Larry Bunce (Director of Institutional Research) or Nathan Lindsay for help in designing your survey
- Online surveys should be coordinated through the Office of Institutional Research

QUESTIONS?
Nathan Lindsay
Assistant Vice Provost for Assessment
University of Missouri-Kansas City
lindsayn@umkc.edu
816-235-6084