Week 6 Survey
Objectives
- Assignment review
- Identify the content and elements of the cover letter
- Identify the elements of the survey
- How to prepare your own survey
Announcements
- The Introduction assignment is graded
- The Lit Review assignment is due today
Cover Letters for Survey Research Studies
The cover letter in a survey research study is used to satisfy IRB requirements and federal laws pertaining to the protection of human subjects.
Areas to Address in the Cover Letter
- Purpose of the study
- Return date and return process
- Confidentiality of data
- Contact information: name, address, and telephone number of the researcher
- Informed consent
Informed Consent
Informed consent requires the researcher to ensure that:
- Participants are informed of potential risks
- Participation is voluntary
- Participants can withdraw at any time without penalty
- Confidentiality of information is maintained
- The means by which confidentiality will be kept are explained
Participants are required to sign and date the form prior to participation.
Informed Consent for Surveys
A signed consent form is not required in a survey study as long as:
- The informed consent information is provided in the cover letter
- The researcher does not require the respondent to place their name or any other identifier on the survey
- There are no risks to the participant
Survey Research Plan
Before a survey research study can be conducted, plans should be constructed to determine:
- The process for implementing the study
- Selection of survey subjects
- Number of subjects to survey
- Anticipated response rates
- Costs for administering the survey
- Data collection and analysis
Determining the survey sample
- The survey sample must be representative of the total population
- Characteristics of subjects that could influence results must be identified and controlled
- The number of subjects to survey can be determined using a defined confidence interval and a margin of error
- Confidence intervals are calculated differently for averages, proportions, etc.
Determining the survey sample
For example, suppose the researcher selects a 95% confidence level. Loosely, this means that if the same survey were administered to 100 different samples drawn in the same way, the resulting estimates would fall within the stated interval about 95 times out of 100. Using this confidence level, a margin of error can be defined. Say the researcher wishes to maintain a 4% margin of error for an item and obtains a result of 60% yes and 40% no. With a 4% margin of error, the true "yes" proportion could plausibly lie anywhere between 56% and 64% (see the sketch below).
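To make the arithmetic concrete, here is a minimal Python sketch (no survey package assumed) of how a proportion's margin of error relates to sample size. The critical value 1.96 corresponds to 95% confidence, and the 60%/4% figures mirror the example above; the printed sample size is only an illustration of the calculation, not a recommendation.

```python
import math

def margin_of_error(p, n, z=1.96):
    """Margin of error for a proportion p from a sample of n,
    using the normal approximation (z = 1.96 for 95% confidence)."""
    return z * math.sqrt(p * (1 - p) / n)

def sample_size_for_proportion(p, e, z=1.96):
    """Sample size needed so that a proportion near p is estimated
    to within margin of error e at the given confidence level."""
    return math.ceil((z ** 2) * p * (1 - p) / (e ** 2))

# The example above: a 60% "yes" result held to a 4% margin of error
print(sample_size_for_proportion(p=0.60, e=0.04))  # about 577 respondents
print(margin_of_error(p=0.60, n=577))              # about 0.04
```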
Sample Size Example Formula
This formula can be used for an average that follows the bell-shaped (normal) curve:

    n = ( z_(α/2) · σ / E )²

where z_(α/2) is the critical value, the positive value at the vertical boundary for an area of α/2 in the right tail of the standard normal distribution; σ is the population standard deviation; n is the sample size; and E is the margin of error. The formula is used when you know the standard deviation and want to determine the sample size necessary to estimate the mean, at a known confidence level, to within E. As a general rule of thumb, if your sample size is greater than 30, you can replace σ with the sample standard deviation s.
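A minimal Python sketch of the same calculation; the σ and E values below are hypothetical, chosen only to illustrate the formula, and scipy is used solely to look up the critical value.

```python
import math
from scipy.stats import norm  # only used to look up the critical value

def sample_size_for_mean(sigma, e, confidence=0.95):
    """n = (z_(alpha/2) * sigma / E)^2, rounded up.
    sigma: population (or, for n > 30, sample) standard deviation
    e: desired margin of error for the mean
    """
    alpha = 1 - confidence
    z = norm.ppf(1 - alpha / 2)  # critical value z_(alpha/2)
    return math.ceil((z * sigma / e) ** 2)

# Hypothetical example: sigma = 15, margin of error E = 3, 95% confidence
print(sample_size_for_mean(sigma=15, e=3))  # about 97
```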
Sample Size for Various Types of Procedures
- A number of tables have been constructed to determine sample sizes using the power of the test and the anticipated outcome.
- The power of a statistical test is the probability, assuming the null hypothesis is false (i.e., an effect really exists), of obtaining a result that allows the null hypothesis to be rejected.
- From these tables, constructed for various types of statistical procedures, the researcher can determine the number of subjects required to detect an anticipated outcome while meeting a certain power level.
- Using a correlation power table as an example: with an alpha level of .05 and an anticipated correlation of .80, a minimum of 19 subjects would be needed to be 99% sure of identifying a significant finding if it does exist (see the sketch below).
- Note: We will cover the power of a test and Type I and Type II errors in a later class.
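As an illustration of where such a table entry can come from, the sketch below uses the standard Fisher z approximation for correlation power analysis (an assumption; it is not necessarily how the table referenced above was built). With a two-tailed alpha of .05, power of .99, and r = .80, it arrives at roughly 19 subjects.

```python
import math
from scipy.stats import norm

def n_for_correlation(r, alpha=0.05, power=0.99):
    """Approximate sample size to detect a correlation of size r with the
    given two-tailed alpha and power, using Fisher's z transformation:
    n = ((z_(alpha/2) + z_power) / C)^2 + 3, where C = 0.5 * ln((1+r)/(1-r))."""
    c = 0.5 * math.log((1 + r) / (1 - r))
    z_alpha = norm.ppf(1 - alpha / 2)
    z_power = norm.ppf(power)
    return math.ceil(((z_alpha + z_power) / c) ** 2 + 3)

print(n_for_correlation(r=0.80))  # about 19
```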
Determining the survey sample
- A minimum number of responses is needed to maintain the confidence interval and the margin of error.
- The number of subjects can be calculated using the desired confidence interval and margin of error.
- Remember, this is the number of respondents, not the number of people in the sample.
- If a researcher expects a response rate of 20%, they need to survey many more people than the required sample size in order to obtain it (see the sketch below).
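A one-function sketch of that adjustment; the 20% rate is the example above, while the target of 100 completed responses is hypothetical.

```python
import math

def contacts_needed(required_responses, expected_response_rate):
    """How many people must be surveyed to end up with the required
    number of completed responses, given an expected response rate."""
    return math.ceil(required_responses / expected_response_rate)

# Hypothetical: 100 completed responses needed, 20% expected response rate
print(contacts_needed(100, 0.20))  # 500 people must be surveyed
```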
Procedures for collecting the data
- How will the data be collected? Online, paper, etc.
- Who will enter the data?
- Is training required (e.g., for interviews)?
Activity: Cover Letter
Develop a cover letter that will be attached to the survey instrument. Be sure to include the vital information required by the University Institutional Review Board.
Guidelines for Writing Surveys
A number of resources are available:
- Survey Research by Backstrom and Hursh-César
- Survey Research Methods by Fowler
Survey Terms
- Closed question: A survey question that offers response categories.
- Context effects: The effects that prior questions have on subsequent responses.
- Open question: A survey question that does not offer response categories.
- Recency effect: Over-reporting events in the most recent portion of a reference period, or a tendency to select the last-presented response alternative in a list (respondents focus on the most recent thing they heard).
- Response effects: The effects of variations in question wording, order, instructions, format, etc., on responses.
- Screening questions: Questions designed to identify specific conditions or events.
General Guidelines
- Keep the survey as short as possible; ask only what you need.
- The more white space on the survey instrument, the better.
- Typically, surveys begin with demographic items and then move into the specific research areas.
- Save the more "delicate" questions for the end of the survey.
Consistency
- Instructions should be provided for the survey as a whole and for unique types of items.
- Provide definitions if necessary.
- Group like types of items together on the survey instrument.
- Keep scales in the same direction.
Closed Response Questions
- Agree–disagree
- Forced choice
- Ordered response categories or scales
Question Order
Question order changes the context in which a particular question is asked. Prior questions can influence answers to subsequent questions through several mechanisms. For example, an obscure "monetary control bill" was more likely to be supported when a question about it appeared after questions on inflation, which presumably led respondents to infer that the bill was an anti-inflation measure.
Terminology
"Avoid ambiguity" is a truism of questionnaire design. Language is inherently ambiguous, and seemingly simple words may have multiple meanings. Research by Belson and others demonstrates that ordinary words and phrases, such as "you," "children," and "work," are interpreted very differently by different respondents.
Terminology
In a national sample, respondents were randomly assigned to be asked one of two questions:
1. "Do you think the United States should allow public speeches against democracy?"
2. "Do you think the United States should forbid public speeches against democracy?"
Support for free speech is greater, by more than 20 percentage points, if respondents answer question 2 rather than question 1. That is, more people answer "no" to question 2 than answer "yes" to question 1; "not allowing" speeches is not the same as "forbidding" them, even though it might seem to be the same.
Leading Questions
A leading question is one that, either by its form or content, suggests to the subject what answer is desired or leads him or her to that answer (Loftus & Palmer, 1974). In the question "How fast were the cars going when they hit each other?", the word "hit" was replaced with other words such as "smashed," "collided," "bumped," and "contacted." When the word "smashed" was used, subjects estimated that the cars were traveling at a faster speed than when the word "bumped" was used.
Don't Know
Should "don't know" be offered as an explicit response option? On one hand, this has been advocated as a way of filtering out respondents who do not have an opinion and whose responses might therefore be meaningless. On the other hand, it increases the number of respondents who say "don't know," resulting in a loss of data.
Likert Scales
- There are correct and incorrect ways of setting up a Likert scale.
- Likert scales are used to collect data about respondents' feelings or attitudes.
- The number at one end of the scale represents least agreement ("Strongly Disagree"), and the number at the other end represents most agreement ("Strongly Agree").
- The ends of a Likert scale are opposites.
- Label each scale item; don't place a single label key at the top of the page and require the respondent to refer back to it.
- At a minimum, descriptors should appear at both ends of the scale and, if the scale has an odd number of points, at the middle.
Odd Versus Even Number Likert Scales
- When using an odd number of points on the scale (e.g., a 5-point scale), the ends are opposites and the middle point is "neutral." "Don't know" or "N/A" is not neutral.
- When using an even number of points on the scale (e.g., a 4-point scale), the end points are opposites. Even-numbered scales force respondents to make a decision.
Available Choices
- When asking a question with boxes to check, provide instructions such as "check one box" or "check all that apply."
- Be sure to eliminate overlapping categories and gaps. For example, if "How many years have you worked here?" offers ranges such as 0–1 year followed by ranges that skip or repeat values, someone who has worked 7 years may have no possible answer to check.
Thank You and Return Instructions
- Thank the participant
- Provide instructions on how to return the survey
Follow-up for non-respondents
- Non-respondents need follow-up.
- Send out reminders or a second copy of the survey.
- In true survey studies, a follow-up analysis is conducted to determine whether significant differences exist between those who responded to the initial survey and those who required follow-up (see the sketch below).
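One common way to make that comparison is to treat late (follow-up) respondents as a proxy for non-respondents and test whether their answers differ from those of the initial respondents. A minimal sketch with made-up data, using an independent-samples t-test from scipy:

```python
from scipy import stats

# Hypothetical scores on one survey item (e.g., a 5-point Likert item)
initial_respondents  = [4, 5, 3, 4, 4, 5, 3, 4, 5, 4]
followup_respondents = [3, 4, 2, 4, 3, 3, 4, 2, 3, 4]

# Welch's t-test: do the two groups differ on this item?
t_stat, p_value = stats.ttest_ind(initial_respondents,
                                  followup_respondents,
                                  equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
# A small p-value would suggest possible non-response bias on this item.
```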
Pilot Testing
Before a survey instrument is ever used in a real study, it must be pilot tested!
Planning
Develop a plan for administering a short survey instrument. Determine the survey population, the procedures for collecting the data, etc. Answer the following questions:
- Who will be selected, and how?
- How will the survey be distributed?
- What will the costs be to administer the survey?
- What will the timeline be?
- What will be done to stimulate the response rate?