1 Surveys

2 How do you feel about surveys? Annoying? Intrusive? Frustrating? Ambiguous? Boring? Fun? Exciting? Great use of my time? Too short? Clear? How do you feel about lectures on surveys?

3 Surveys
A survey is an instrument that collects data in the form of a questionnaire or an interview.
Surveys can be:
Cross-sectional – a moment in time, the “snapshot”
Longitudinal – perceptions over time
Trends – follow a topic over time
Cohorts/panels – follow a group or small sample over time

4 Sampling
Surveys are either “sample” or “census.”
We can't always reach the whole population – we may instead look at a sample of that population.
If we want to generalize to the population, we will need to randomly select the sample, or systematically select a sample that looks just like the population.
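
A minimal Python sketch of the two selection strategies mentioned above, simple random and systematic sampling. The population list, the seed, and the sample size of 400 are hypothetical, invented for illustration rather than taken from the slides.

```python
import random

# Hypothetical population: an ID for every member we could possibly survey.
population = [f"teacher_{i}" for i in range(10_000)]

# Census: attempt to survey everyone (often impractical).
census = population

# Simple random sample: every member has an equal chance of selection,
# which is what lets us generalize back to the population.
random.seed(42)  # fixed seed so the sketch is reproducible
sample = random.sample(population, k=400)

# Systematic sample: every k-th member from a random starting point.
k = len(population) // 400
start = random.randrange(k)
systematic_sample = population[start::k]

print(len(sample), len(systematic_sample))  # 400 400
```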

5 Sampling Example
For example, let's say we want to know what special educators think are the most important factors keeping them on the job. What do you think they will say?

6 Sampling Example
For example, let's say we want to know what special educators think are the most important factors keeping them on the job. What do you think they will say?
We probably can't survey ALL special educators (the census).
We can randomly select teachers from all over the country…
We can randomly select teachers from randomly selected districts…
We can stratify the sample and then randomly select within each stratum (small, medium, and large districts, for example).
Another alternative is to never generalize. If we select a sample from our district, it is simply the opinion of a few.
If we don't account for sampling problems, we introduce bias.
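
Below is a minimal Python sketch of the stratified option. It assumes a hypothetical sampling frame in which each teacher is tagged with a district-size stratum; the stratum sizes and the total sample of 400 are made up for illustration.

```python
import random

# Hypothetical sampling frame: each teacher tagged with a district-size stratum.
frame = (
    [("small", f"t{i}") for i in range(2_000)]
    + [("medium", f"t{i}") for i in range(2_000, 6_000)]
    + [("large", f"t{i}") for i in range(6_000, 16_000)]
)

random.seed(1)
total_n = 400
sample = []
for stratum in ("small", "medium", "large"):
    members = [t for s, t in frame if s == stratum]
    # Proportional allocation: each stratum's share of the sample mirrors
    # its share of the frame, so the sample "looks like" the population.
    n = round(total_n * len(members) / len(frame))
    sample.extend(random.sample(members, n))

print(len(sample))  # 400
```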

7 Retention Factors

8 Made-Up Example of Bias
We want to know how our parents feel about year-round schooling. There are 200 parents at our school, so we try to survey them all.
We don't know it, but about 20% (40) of the population is vehemently in favour of it – and will stop at nothing to voice their opinion.
The other 80% (160) are fairly ambivalent, but would probably prefer things the way they are now.
We send out surveys and get a 30% (60) return rate.
Results indicate 66% of people surveyed would prefer year-round schooling, so we make the change.
BUT – all of the 40 replied, and only 20 of the 160 replied.
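
The arithmetic behind that flipped result can be checked in a few lines of Python; the numbers below are the ones from the slide.

```python
# Numbers from the slide: 200 parents, 40 strongly in favour, 160 ambivalent.
in_favour, ambivalent = 40, 160
returned_in_favour, returned_ambivalent = 40, 20   # who actually replied

returned = returned_in_favour + returned_ambivalent        # 60 surveys back (30%)
observed_support = returned_in_favour / returned            # 40 / 60
true_support = in_favour / (in_favour + ambivalent)          # 40 / 200

print(f"observed support: {observed_support:.1%}")  # observed support: 66.7%
print(f"true support:     {true_support:.1%}")      # true support:     20.0%
# The decision flips because of who chose to reply, not what the population thinks.
```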

9 Real Example of Bias
34 master's students must submit their written comprehensive exam by Monday at noon. One student sends it 24 hours late.
The comps coordinator sends an email out to faculty asking if we should accept the paper. 4 reply – absolutely not. 7 do not reply.
A later lunch conversation revealed that the 7 didn't vote because they are newer faculty and hesitated to take a stand. The majority of them would have accepted the paper.
The student “failed out” of the program.

10 Overview: Steps to Create a Survey
1. Define the purpose/delineate broad issues you will address
2. Choose a format and create a survey plan
3. Construct the questions
4. Pilot test the questions
5. Administer the survey
6. Analyze the data

11 Step 1. Define the Purpose and Population
The purpose of your survey comes directly from the purpose of your program evaluation – it is simply a component of the larger evaluation.
Be sure to isolate specific topics you want to address with the survey.
For example – do parents read to kids at home? This topic area might turn into 6 questions on a survey.
The population will be made up of those who have the information you need. There may be multiple perspectives, so your population may not be homogeneous.
For example – parents, students, teachers, and admin all have something to say about year-round school.

12 Step 2. Choose a Format/Make a Plan
Questionnaires can be delivered by web, email, phone, or in person (individually or in a focus group).
Each has advantages and disadvantages.

13 What are the good and the bad? (1 pt per box where you hit)
Method    | The Good | The Bad
Mail      |          |
Email     |          |
Phone     |          |
In-person |          |

14 Format Comparison
Method    | The Good | The Bad
Mail      | Inexpensive; Can be confidential; Standardized items/procedures | Low response rate; Cannot probe further; Reaches only respondents who can read; Can be slow to get responses
Email     | Same as mail, plus: Fast; Easy to give to a lot of people | Same as mail, plus: Need email addresses; Possibility of ballot-box stuffing
Phone     | High response rate; Quick data collection; Can reach remote areas | Need phone numbers; Comprehensive administrator training needed; People don't like it!
In-person | Ability to probe deeply; High return rate; Can be recorded for later; Flexible format possible | People need to be close; Time-consuming; No anonymity; Possible interviewer bias; Comprehensive administrator training needed

15 Advantages of Web Surveys
Low costs – only software; no printing, envelopes, or postage needed.
No data entry costs. Minimal data entry errors.
Easy to correct problems during survey administration.
Quicker and cheaper than other methods.
Response rates are good, maybe somewhere around 40–60%, depending on the topic and population.
Most are familiar with the process (you are now!).

16 Step 2. Choose a Format/Make a Plan
No matter your delivery mode, you will need to prepare a cover letter. This explains the survey, announces the upcoming delivery, and lets respondents know how important their input is.
On a web survey, your cover letter is an email, repeated on the first page of the survey.
Make a plan for when the cover letter goes out, when the survey goes out, and how you will follow up with non-responders (reminders).

17 Example Web Survey Response Rate with Follow-ups

18 Step 3. Construct Questions
Structured/closed (multiple choice) or semi-structured/open-ended – what are the advantages and disadvantages of each?
Do you require demographics?
Design issues:
Keep it simple: avoid complicated designs with lots of colors.
Make the survey one (screen) page in length, unless grouping makes sense, in which case group items by concept.
For drop-down boxes, make sure a real response category is not the default first option (“Select” usually works).
Keep the survey as short as possible. Nobody likes a long survey.

19 Step 4. Pilot Test
You get an awful feeling after you send out a survey when people let you know they didn't understand the questions.
It is hard to pilot test all the items. It is harder to un-administer a bad survey.
The pilot sample should not just be your friends – use a variety of people that might loosely look like your sample.
It should also include your friends, because they will take the time and be honest (usually).
It should not be your partner, because they really won't care and will lie to you.

20 Step 5. Administer the Survey
The “tailored design” approach (Dillman, 2000): different levels of follow-up depending on what happens.
Pre-notice (cover letter), survey, reminder, follow-up with the survey again for non-responders, and so on.
For web surveys this is simply multiple emails.
With multiple emails you need to remove respondents from your email list, so follow-up emails can be tailored to non-responders.
If you do only one survey mailing, you WILL get a low response rate.
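
As a rough illustration of that bookkeeping, here is a tiny Python sketch; the email addresses and the set-based tracking are hypothetical, not a prescribed tool.

```python
# Hypothetical invitation list and respondent tracking for tailored follow-ups.
invited = {"a@example.org", "b@example.org", "c@example.org", "d@example.org"}
responded = {"b@example.org"}          # updated as completed surveys come in

# The reminder goes only to non-responders, so each follow-up email
# can be tailored to the people who have not yet replied.
non_responders = invited - responded
for address in sorted(non_responders):
    print(f"send reminder to {address}")
```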

21 User Tracking
You have two choices:
1. Ask for name and other identifying information.
Multiple responses can be eliminated. Responses can be linked to existing data. No evidence this affects response rates. Important to state that only aggregate data will be released.
2. Allow anonymous responses.
In theory, anyone can answer the survey. Multiple responses will vary with topic and survey length. Cannot use incentives.
Research indicates that promises of anonymity and/or confidentiality do not affect response rates except for sensitive topics such as sexual behavior.
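
If you do collect an identifier (option 1), duplicate submissions can be dropped afterwards. The following is a minimal sketch; the response records and field names are made up for illustration.

```python
# Hypothetical responses keyed by the identifying information from option 1.
responses = [
    {"id": "parent_017", "q1": 4},
    {"id": "parent_017", "q1": 5},   # the same person answered twice
    {"id": "parent_102", "q1": 2},
]

# Keep only the first submission per identifier; later duplicates are dropped.
seen, deduplicated = set(), []
for response in responses:
    if response["id"] not in seen:
        seen.add(response["id"])
        deduplicated.append(response)

print(len(deduplicated))  # 2
```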

22 Finer Points of Web Administration
The email should be concise, with the hyperlink to the survey visible when the participant opens the email.
Provide a time estimate up front.
Avoid sending emails on Monday – I don't know why. I would send my first on a Tuesday, and my second on a Saturday.
How many? 2 or 3 reminders are fine, especially if you allow participants to opt out.
A paper-based invitation asking for web participation does not work as well as an email invitation.

23 Step 6. Analyze the Data
You may need to merge with other data sources.
Frequencies, descriptives (e.g., mean and SD), group comparisons (e.g., t-tests and ANOVAs), relationships (e.g., correlations and regression).
And finally – write the report.
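
A minimal Python sketch of those analyses, assuming pandas and SciPy are available; the data frame, column names, and values are invented for illustration.

```python
import pandas as pd
from scipy import stats

# Invented survey data: a satisfaction rating, a respondent group, and
# years of experience, already merged into one data frame.
df = pd.DataFrame({
    "group":  ["parent", "teacher", "parent", "teacher", "parent", "teacher"],
    "rating": [4, 2, 5, 3, 4, 2],
    "years":  [6, 3, 10, 4, 8, 2],
})

print(df["rating"].value_counts())              # frequencies
print(df["rating"].agg(["mean", "std"]))        # descriptives (mean and SD)

parents  = df.loc[df["group"] == "parent", "rating"]
teachers = df.loc[df["group"] == "teacher", "rating"]
print(stats.ttest_ind(parents, teachers))       # group comparison (t-test)

print(df["rating"].corr(df["years"]))           # relationship (correlation)
```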

24 Sources
Dillman, D. A. (2009). Mail and Internet Surveys: The Tailored Design Method. New York: John Wiley & Sons.
Porter, Stephen, & Roy, Michael. Wesleyan University.
Gay, Mills, & Airasian (2009). Educational Research. New York: Pearson.
Donohue, E. (2004). Survey Techniques and Tactics. Google it.

