Questionnaire Design and Data Collection
- A specialized part of quantitative research
- Only a few people in the world do it well
- Often done by corporate research organizations (RAND, for example)
Data Collection
- Standardized "survey instrument"
- Training in unbiased collection techniques
- Data cleaning and compiling
- Question testing and interviewer training
- Flow and instructions
Types of Data Collection
- Interview: phone or in person
- Self-administered: mailed or web
Mailed Surveys
- Must include a letter of introduction and an inducement
- Must include a means of return mail
- Need clear instructions and flow
- Require a mailing address, so the sample has an inherent bias
- Risk of a small response rate (often treated as junk mail)
Mail Surveys
- Graph return rates as questionnaires come back (see the sketch below)
- Must mail out larger numbers to get an adequate number of responses
- Response rates are low (around 50%)
- Solutions for poor returns: follow-up mailings and postcard reminders
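A minimal sketch of the return-rate graph described above, assuming a hypothetical mail-out of 1,000 questionnaires and illustrative daily return counts; Python with matplotlib is just one way to draw it.

```python
# Minimal sketch (assumed, illustrative numbers): tracking the cumulative
# return rate of a mailed survey so follow-up mailings can be timed.
import matplotlib.pyplot as plt

mailed_out = 1000                      # total questionnaires mailed (assumed)
daily_returns = [40, 55, 60, 35, 20,   # questionnaires returned each day
                 15, 10, 8, 5, 3]      # (illustrative counts only)

cumulative = []
total = 0
for day_count in daily_returns:
    total += day_count
    cumulative.append(100 * total / mailed_out)  # response rate in percent

plt.plot(range(1, len(cumulative) + 1), cumulative, marker="o")
plt.xlabel("Days since mailing")
plt.ylabel("Cumulative response rate (%)")
plt.title("Mail survey returns")
plt.show()
```

A flattening curve like this is the usual cue that it is time for a postcard reminder or a second mailing.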
Interview Surveys
- Add a layer of interpersonal interaction that has both positive and negative attributes
- Negatives: interviewer bias, poor measurement, uneven interviewer quality, and other errors of interaction
- Positives: most people will respond, bias decreases if the interview is done well, and the sample can be better balanced
Interview Surveys
- A 90% response rate is considered good
- "Don't know" and "not answered" rates tend to be higher, though
- Explanation by the interviewer may produce bias
- Interviewer neutrality and training in administering the survey are key to eliminating these negatives and reducing sampling error
Interview Surveys
- The interviewer must be familiar with the instrument
- Follow the question wording exactly
- No prompting
- "Probing": a form of helping that can be neutral, but it must be taught exactly or it will produce bias
Interviewer Training
- Practice
- Pilot studies: these help train interviewers and work out the kinks in the survey instrument itself
- Backup supervision
- Constant reassessment of the process
Pretests and Pilot Studies
- Ensure good training
- Test question validity
- Test the collection and coding processes
- Allow for changes in format before spending lots of money
Phone Surveys
- Anonymous
- Respondents must have a phone (bias)
- Often do not include cell phones (bias)
- Lower cost; the same instrument, but all stimuli are auditory, which makes the process harder to standardize and more likely to produce interviewer bias and nonresponse (a 70% response rate is considered good)
Web Surveys
- Respondents must have a computer (bias)
- Need a specific format that makes answering easy and fast
- Fast answering invites errors
- Used for opinion polls rather than "real" surveys: too much bias
Data Processing
- Store responses in a database format
- Assign each respondent a number and keep the key that links numbers to people locked and secret (confidentiality: it must not be possible to tell who said what; see the sketch below)
- Convert responses to a statistically usable format
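One way to keep the locked key separate from the answers is sketched below; the file names, fields, and ID scheme are illustrative assumptions, not part of the lecture.

```python
# Minimal sketch (hypothetical files and fields): separating identifying
# information from survey answers so no one can tell who said what.
import csv
import secrets

respondents = [
    {"name": "Jane Doe", "phone": "555-0100", "q1": "agree", "q2": "no"},
]  # illustrative raw records

with open("key.csv", "w", newline="") as key_file, \
     open("responses.csv", "w", newline="") as data_file:
    key_writer = csv.writer(key_file)     # kept locked: links IDs to people
    data_writer = csv.writer(data_file)   # given to analysts: IDs only
    key_writer.writerow(["id", "name", "phone"])
    data_writer.writerow(["id", "q1", "q2"])
    for person in respondents:
        rid = secrets.token_hex(4)         # random, non-identifying ID
        key_writer.writerow([rid, person["name"], person["phone"]])
        data_writer.writerow([rid, person["q1"], person["q2"]])
```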
Coding and Storage
- The process of changing word responses into a statistically manageable format is called coding
- See the GSS for an example of assigned numbers
- Store data on disk and in paper form, always in two places: you cannot afford to lose data
Coding
- Optical coding may be the wave of the future
- Codebooks provide the key for converting answers to specific numbers (see the sketch below)
- Example: the MHSS discussed in class
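A minimal sketch of how a codebook converts word answers to numbers; the question name and code values, including the 8/9 codes for "don't know" and "not answered", are assumptions for illustration, not the actual GSS or MHSS codebook.

```python
# Minimal sketch (hypothetical question and codes): applying a codebook that
# maps word responses to the numbers stored in the data file.
codebook = {
    "q1_agreement": {
        "strongly agree": 1,
        "agree": 2,
        "disagree": 3,
        "strongly disagree": 4,
        "don't know": 8,       # assumed "don't know" code
        "not answered": 9,     # assumed missing-data code
    }
}

def code_response(question: str, answer: str) -> int:
    """Look up the numeric code for a word response; unrecognized answers
    fall back to the 'not answered' code."""
    codes = codebook[question]
    return codes.get(answer.strip().lower(), codes["not answered"])

print(code_response("q1_agreement", "Agree"))        # -> 2
print(code_response("q1_agreement", "no comment"))   # -> 9
```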
Data Cleaning
- The process always produces errors of copying and transcription
- Cleaning roots out these errors
- It is computer aided (see the sketch below)
- Questions and answers may need to be eliminated if they cannot be cleaned
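A minimal sketch of computer-aided cleaning, assuming a hypothetical set of valid codes; out-of-range values are flagged for rechecking against the original questionnaires rather than silently dropped.

```python
# Minimal sketch (hypothetical codes and records): flag codes that do not
# appear in the codebook so they can be checked against the original forms.
valid_codes = {"q1": {1, 2, 3, 4, 8, 9}, "sex": {1, 2, 9}}  # assumed codebook

records = [
    {"id": "a1f3", "q1": 2, "sex": 1},
    {"id": "b7c9", "q1": 7, "sex": 2},   # 7 is not a legal code for q1
]

for record in records:
    for question, codes in valid_codes.items():
        if record[question] not in codes:
            print(f"Record {record['id']}: {question}={record[question]} "
                  f"is not a valid code; recheck the original questionnaire")
```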
Questionnaire Format
- Closed-ended questions
- Easy, clear explanations of questions that can be taught to the interviewer or respondent
- Understandable flow
- Tiring and attrition issues: put interesting questions first, demographics last
- Matrix questions may confuse respondents
- Ordering issues
Questionnaire Format
- Sequence by topic to avoid confusion between topics
- Random versus purposeful ordering
- Must sometimes balance fatigue against confusion
- The format depends on the type of survey
Modern Developments
- The web
- Cell phone inclusion (an issue highlighted by the 2004 election)
- Random-digit dialing (see the sketch below)
- Laws preventing solicitation
- Public mistrust of interviews
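A minimal sketch of random-digit dialing, assuming US-style ten-digit numbers; randomizing the last four digits within a chosen area code and exchange gives unlisted numbers the same chance of selection as listed ones.

```python
# Minimal sketch (assumed US-style numbering): random-digit dialing draws
# phone numbers at random rather than from a directory.
import random

def random_digit_number(area_code: str, exchange: str) -> str:
    """Generate one candidate phone number by randomizing the last four digits."""
    last_four = f"{random.randint(0, 9999):04d}"
    return f"({area_code}) {exchange}-{last_four}"

random.seed(0)  # reproducible sample for illustration
sample = [random_digit_number("312", "555") for _ in range(5)]
print(sample)
```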