IHTE-1800 Research methods: Surveys and interviews Sari Kujala, Spring 07
Contents Survey - Types - Process Interviews - Types Literature
What is a survey? (Pfleeger & Kitchenham, 2001) A comprehensive system for collecting information to describe, compare or explain knowledge, attitudes and behavior
Types of surveys Supervised - Telephone interviews - Group surveys Unsupervised - Mailed questionnaire - Electronic questionnaire
Survey process (Pfleeger & Kitchenham, 2001) 1. Setting measurable objectives 2. Planning and scheduling the survey 3. Ensuring that appropriate resources are available 4. Designing the survey 5. Preparing the data collection instrument 6. Validating the instrument (piloting) 7. Selecting participants 8. Administering and scoring the instrument 9. Analyzing the data 10. Reporting the results
Setting measurable objectives Preferably research questions or hypotheses Statements of the survey’s expected outcomes - What information will be identified - Target population Definitions of all potentially ambiguous terms
Designing the survey The goal is to provide the most effective means of obtaining the information - No bias - Appropriate (makes sense in the context of the population) - Cost-effective
Descriptive designs Cross sectional - Information at one fixed point in time Cohort - Information about changes in a specific population Case control - Retrospective information about previous circumstances to help explain a current phenomenon
Experimental designs Concurrent control studies (randomized) - Participants are randomly assigned to groups - E.g. is training changing attitudes? Concurrent control studies (non-randomized) - Participants are not randomly assigned to groups Self-control studies - Pre- and post-treatment measures Historical control studies - E.g. comparison with previous surveys Combination of techniques
Sample size Sample size should be big enough - If different groups are compared, each group should have approx. 30 data points in order to allow statistical analyses Start sampling by defining a target population - May be a subset of a larger population; inclusion or exclusion criteria may be used
Sampling methods Probabilistic - Random - Stratified random sample (subgroups) - Systematic sampling (every nth member) - Cluster-based sampling (belonging to a defined group) Non-probabilistic - Convenience (who is available), snowball
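The probabilistic methods above can be sketched in a few lines of Python; the population, the two strata, and the sample size of 10 are made-up illustrations, not part of the slides:

```python
import random

population = [f"member_{i}" for i in range(1, 101)]  # hypothetical population of 100
random.seed(42)  # fixed seed so the sketch is reproducible

# Simple random sample of 10
simple = random.sample(population, 10)

# Systematic sampling: every nth member, starting from a random offset
n = len(population) // 10
start = random.randrange(n)
systematic = population[start::n]

# Stratified random sample: sample proportionally from each subgroup
strata = {"novice": population[:60], "expert": population[60:]}
stratified = []
for name, group in strata.items():
    k = round(10 * len(group) / len(population))  # proportional allocation
    stratified.extend(random.sample(group, k))

print(len(simple), len(systematic), len(stratified))  # 10 10 10
```

A convenience or snowball sample, by contrast, cannot be generated this way: the inclusion probability of each member is unknown.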
Response rate A low response rate can destroy a good survey (reported rates range from 10 % up to 90 %) The reasons for non-response should be known
Improving response rate Over-sampling, reminders, rewards Ensuring that people are able, willing and motivated to answer the questions Respondents should see some clear benefit to answering the questions
Data collection instrument Search the relevant literature - Existing instruments? Construct an instrument Evaluate/pilot the instrument Document the instrument
Constructing an instrument: what to ask? Use the survey objectives Consider the respondents - Questions should be easy and accurate to answer; the events asked about should not lie far in the past - Respondents should have sufficient knowledge and be in a position to answer Remember background or demographic questions to identify the respondent
Question types Open questions Standardized response format - Multiple-choice question - Likert scale statements
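Standardized response formats are usually coded to numbers before analysis. A minimal sketch, assuming a 5-point Likert scale (the labels and coding below are illustrative, not prescribed by the slides):

```python
# Assumed 5-point coding for Likert scale statements
LIKERT = {"strongly disagree": 1, "disagree": 2, "neutral": 3,
          "agree": 4, "strongly agree": 5}

def code_response(answer: str):
    """Return the numeric code, or None for a 'don't know' / missing answer."""
    return LIKERT.get(answer.strip().lower())

responses = ["Agree", "strongly disagree", "don't know"]
coded = [code_response(r) for r in responses]
print(coded)  # [4, 1, None]
```

Keeping "don't know" as missing data (rather than forcing it onto the scale) avoids biasing the means computed later.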
How to ask Don’t ask too many questions - answering becomes too time-consuming, which lowers the response rate - Ask yourself each time: how can you use the answers, and what is your hypothesis? Don’t ask too many open questions - Laborious to answer - Difficult to classify, time-consuming to analyze Standardize responses when appropriate - Offer the possibility to answer “other” Give the respondent enough instructions
Information and instructions for respondents Include a cover letter providing a contact name and information Explain the purpose and relevance of the study Describe who is sponsoring the study and how confidentiality will be preserved Explain how the respondents were chosen and why Explain how to return the questionnaire Provide a realistic estimate of the time to complete the questionnaire
A good question is: Purposeful from the respondents’ point of view Clear Neutral, not leading Concrete Concentrating on essentials Asks only one issue All in all easy to answer
Example: Requirements quality questionnaire (each statement is rated on a Disagree–Agree scale, with a “don’t know” option) - Requirements are completely defined [ ] - The requirements describe a system that meets user needs [ ] - The requirements are based on the information gained from users and customers [ ] - In all likelihood, there are moderately few errors in the requirements [ ]
Piloting Is the most important way of improving the quality of the questionnaire - Use real people belonging to the target group - How are your questions and the words used understood? - Are all the needed response options available? - Iterate
Types of reliability Test-retest reliability Internal consistency - Is a group of items forming a single scale? - Statistical measures Inter-observer reliability when observers are completing a survey instrument - Statistical measures
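Internal consistency is commonly summarized with Cronbach's alpha, which compares the summed item variances to the variance of the total score. A pure-Python sketch, using made-up scores for three Likert items answered by five respondents:

```python
def variance(xs):
    """Sample variance (n - 1 denominator)."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def cronbach_alpha(items):
    """items: one list of scores per item, all over the same respondents."""
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]  # total score per respondent
    item_var = sum(variance(it) for it in items)
    return k / (k - 1) * (1 - item_var / variance(totals))

# Made-up data: three items forming one scale, five respondents
items = [[4, 5, 3, 4, 2],
         [4, 4, 3, 5, 2],
         [5, 4, 2, 4, 3]]
print(round(cronbach_alpha(items), 2))  # 0.86
```

Values around 0.7 or higher are conventionally read as acceptable internal consistency, though the threshold depends on the purpose of the instrument.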
Types of validity Face validity Content validity - A group of reviewers with knowledge of the subject matter and target population members review the survey contents Criterion validity - A measure of how well one instrument compares with another instrument or predictor Construct validity - The extent to which different data collection approaches produce similar results
Analyzing the data Check for incomplete questionnaires Partition the responses according to subgroups Make figures of the results Statistical analysis depends on the scale type of the replies - Frequency, mean, variance etc. - Chi-squared test to measure associations among nominal-scale variables - Analysis of variance, correlations
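A minimal sketch of the chi-squared test of association mentioned above, computing the Pearson statistic by hand for a made-up 2×2 table (e.g. trained vs. untrained respondents against changed vs. unchanged attitudes):

```python
def chi_squared(table):
    """Pearson chi-squared statistic for a contingency table (list of rows)."""
    row_tot = [sum(r) for r in table]
    col_tot = [sum(c) for c in zip(*table)]
    grand = sum(row_tot)
    stat = 0.0
    for i, row in enumerate(table):
        for j, obs in enumerate(row):
            exp = row_tot[i] * col_tot[j] / grand  # expected count if independent
            stat += (obs - exp) ** 2 / exp
    return stat

# Made-up counts: rows = trained / untrained, cols = changed / unchanged
table = [[30, 10],
         [18, 22]]
stat = chi_squared(table)
print(round(stat, 2), stat > 3.84)  # 3.84 ~ critical value at df=1, p=.05
```

In practice a library routine (e.g. `scipy.stats.chi2_contingency`) would also return the p-value directly; the hand computation here just shows what the test measures.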
Interview types Structured interviews - A questionnaire is used Semi-structured or thematic interviews - Pre-defined themes used - Additional questions in order to understand the answers and find new interesting issues Open interviews - More like free discussions Individual or group interviews, focus groups
Literature Bourque, L. and Fielder, E. (1995). How to Conduct Self-Administered and Mail Surveys. Sage Publications. Fink, A. (1995). The Survey Handbook. Sage Publications. Pfleeger and Kitchenham (2001). Principles of survey research, parts 1-5. Software Engineering Notes. Straub, D.W. (1989). Validating instruments in MIS research. MIS Quarterly, 13, 2. Andrews, D., Nonnecke, B., Preece, J. (2003). Electronic survey methodology: A case study in reaching hard-to-involve internet users. International Journal of Human-Computer Interaction, 16, 2. Ropponen, J. and Lyytinen, K. (2000). Components of software development risk: How to address them? A project manager survey. IEEE Transactions on Software Engineering, 26, 2. (good example)