1
Chapter 13 Questionnaires and Instruments
2
Learning Objectives Understand...
The link forged between the management dilemma and the communication instrument by the management-research question hierarchy.
The influence of the communication method on instrument design.
The three general classes of information and what each contributes to the instrument.
3
Learning Objectives Understand . . .
The influence of question content, question wording, response strategy, and preliminary analysis planning on question construction.
Each of the numerous question design issues influencing instrument quality, reliability, and validity.
The sources for measurement questions.
The importance of pretesting questions and instruments.
4
Research Thought Leader
“WAP (mobile browser–based) surveys offer full survey functionality (including multimedia) and can be accessed from any phone with a web browser (which is roughly 90 percent of all mobile devices). As an industry we need to get comfortable with mobile survey formats because there are fundamental differences in survey design and we also need to be focused on building our mobile capabilities as part of our sampling practice.”
Kristin Luck, president, Decipher
This note relates to the effort it takes to develop a good measurement scale, and that the emphasis is always on helping the manager make a better decision—actionable data.
5
Overall Flowchart for Instrument Design
Exhibit 13-1 is a suggested flowchart for instrument design. The procedures followed in developing an instrument vary from study to study, but the flowchart suggests three phases. Each phase is discussed in this chapter.
6
Flowchart for Instrument Design Phase 1
Exhibit 13-2: By this stage in a research project, the process of moving from the general management dilemma to specific measurement questions has traveled through the first three question levels:
Management question – the dilemma, stated in question form, that the manager needs resolved.
Research question(s) – the fact-based translation of the question the researcher must answer to contribute to the solution of the management question.
Investigative questions – specific questions the researcher must answer to provide sufficient detail and coverage of the research question. Within this level, there may be several questions as the researcher moves from the general to the specific.
Measurement questions – questions participants must answer if the researcher is to gather the needed information and resolve the management question.
Once the researcher understands the connection between the investigative questions and the potential measurement questions, a strategy for the survey is the next logical step.
7
Strategic Concerns in Instrument Design
What type of scale is needed?
What communication approach will be used?
Should the questions be structured?
Should the questioning be disguised?
8
Technology Affects Questionnaire Development
Write questionnaires more quickly
Create visually driven instruments
Eliminate manual data entry
Save time in data analysis
9
Disguising Study Objectives
Reluctantly shared, conscious-level information
Willingly shared, conscious-level information (situations where disguise is unnecessary)
Knowable, limited-conscious-level information
Subconscious-level information
10
Factors Affecting Respondent Honesty
Exhibit 13-3 is developed from actual respondent behaviors.
11
Dummy Table: American Eating Habits
Use of Convenience Foods
Age   | Always Use | Use Frequently | Use Sometimes | Rarely Use | Never Use
18-24 |            |                |               |            |
25-34 |            |                |               |            |
35-44 |            |                |               |            |
55-64 |            |                |               |            |
65+   |            |                |               |            |
Exhibit 13-4: Researchers are concerned with adequate coverage of the topic and with securing the information in its most usable form. A good way to test how well the study plan meets those needs is to develop “dummy” tables that display the data one expects to secure. For example, we might be interested to know whether age influences the use of convenience foods. The dummy table would match the age ranges of participants with the degree to which they use convenience foods. The preliminary analysis plan serves as a check on whether the planned measurement questions meet the data needs of the research questions. This also helps the researcher determine the type of scale needed for each question.
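The dummy-table idea above can be sketched in code. The chapter describes dummy tables conceptually; this is only an illustrative sketch in which the mock responses and the nested-dict representation are assumptions, showing how a preliminary analysis plan could be checked against the age-by-usage cross-tabulation before fielding the survey.

```python
from collections import Counter

# Categories taken from the dummy-table example above.
AGE_GROUPS = ["18-24", "25-34", "35-44", "55-64", "65+"]
USE_LEVELS = ["Always", "Frequently", "Sometimes", "Rarely", "Never"]

def dummy_table(responses):
    """Cross-tabulate (age_group, use_level) answer pairs into a nested dict."""
    counts = Counter(responses)
    # Counter returns 0 for unseen pairs, so empty cells stay at zero.
    return {age: {use: counts[(age, use)] for use in USE_LEVELS}
            for age in AGE_GROUPS}

# Mock responses stand in for the data the survey would eventually collect.
mock = [("18-24", "Always"), ("18-24", "Sometimes"), ("65+", "Never")]
table = dummy_table(mock)
print(table["18-24"]["Always"])  # 1
```

Filling the table with even a handful of fabricated rows like this makes it obvious whether the planned age and usage scales actually support the intended analysis.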
12
Flowchart for Instrument Design Phase 2
In Phase 2 (Exhibit 13-5), you generate specific measurement questions considering subject content, the wording of each question, and each response strategy. The order, type, and wording of the measurement questions, the introduction, the instructions, the transitions, and the closure in a quality communication instrument should accomplish the following:
Encourage each participant to provide accurate responses,
Encourage each participant to provide an adequate amount of information,
Discourage each participant from refusing to answer specific questions,
Discourage each participant from early discontinuation of participation, and
Leave the participant with a positive attitude about survey participation.
13
Question Categories and Structure
Questionnaires can range from those that have a great deal of structure to those that are unstructured. They contain three categories of measurement questions:
Administrative questions identify the participant, interviewer, interviewer location, and conditions. These questions are rarely asked of the participant but are necessary for studying patterns within the data and identifying possible error sources.
Classification questions usually cover sociological-demographic variables that allow participants’ answers to be grouped so that patterns are revealed and can be studied. These questions usually appear at the end of a survey.
Target questions address the investigative questions of a specific study. These are grouped by topic in the survey. Target questions may be structured or unstructured.
14
Engagement = Convenience
“Participants are becoming more and more aware of the value of their time. The key to maintaining a quality dialog with them is to make it really convenient for them to engage, whenever and wherever they want.”
Tom Anderson, managing partner, Anderson Analytics
15
Question Content: Should this question be asked?
Is the question of proper scope and coverage?
Can the participant adequately answer this question as asked?
Will the participant willingly answer this question as asked?
16
Question Wording
Survey questions should be revised until they satisfy these criteria:
Is the question stated in terms of a shared vocabulary?
Does the question contain vocabulary with a single meaning?
Does the question contain unsupported or misleading assumptions?
Does the question contain biased wording?
Is the question correctly personalized?
Are adequate alternatives presented within the question?
17
Response Strategy
In choosing response strategies, researchers must consider these factors:
Objectives of the study
Participant’s level of information
Participant’s motivation to share
Ease and clarity with which the participant communicates
Degree to which participants have thought through the topic
18
Free-Response Strategy
Free-response questions, also known as open-ended questions, ask the participant a question; either the interviewer pauses for the answer or the participant records his or her ideas in his or her own words in the space provided on a questionnaire. Survey researchers try to reduce the number of these questions because they are difficult to interpret and costly to analyze.
What factors influenced your enrollment in Metro U? ____________________________________________
19
Dichotomous Response Strategy
Did you attend the “A Day at College” program at Metro U? Yes / No
Dichotomous questions are measurement questions that offer two mutually exclusive and exhaustive alternatives.
20
Multiple Choice Response Strategy
Which one of the following factors was most influential in your decision to attend Metro U?
Good academic standing
Specific program of study desired
Enjoyable campus life
Many friends from home
High quality of faculty
Multiple-choice questions are appropriate when there are more than two alternatives or where we seek gradations of preference, interest, or agreement. A problem can occur with this question type when one or more responses have not been anticipated. A second problem occurs when the list of choices is not exhaustive. Participants may also feel that they have multiple answers while the question allows for only one response; in this case, the response choices are not mutually exclusive. The order in which choices are given can also cause bias. Order bias with non-numeric response categories often leads the participant to choose the first alternative (primacy effect) or the last alternative (recency effect) over the middle ones. Primacy effects dominate in visual surveys, while recency effects dominate in oral surveys. Multiple-choice questions usually generate nominal data.
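One common countermeasure to the order bias described above is to rotate or shuffle response choices across respondents. The sketch below is a hypothetical illustration, not a prescribed method: the option text comes from the Metro U example, while the per-respondent seeding scheme is an assumption made for reproducibility.

```python
import random

# Choices from the Metro U multiple-choice example above.
OPTIONS = ["Good academic standing", "Specific program of study desired",
           "Enjoyable campus life", "Many friends from home",
           "High quality of faculty"]

def presented_order(options, respondent_id):
    """Shuffle deterministically per respondent so each order can be reproduced."""
    rng = random.Random(respondent_id)  # assumed seeding scheme
    shuffled = list(options)
    rng.shuffle(shuffled)
    return shuffled

# Different respondents see the same choices, generally in different orders,
# spreading primacy and recency effects across the option list.
order_a = presented_order(OPTIONS, respondent_id=1)
order_b = presented_order(OPTIONS, respondent_id=2)
```

Seeding by respondent ID keeps the presented order auditable after the fact, which matters when analyzing whether position influenced the answers.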
21
Checklist Response Strategy
Which of the following factors influenced your decision to enroll in Metro U? (Check all that apply.)
Tuition cost
Specific program of study desired
Parents’ preferences
Opinion of brother or sister
Many friends from home attend
High quality of faculty
The checklist question is a question that poses numerous alternatives and encourages multiple unordered responses. Checklists are efficient and provide nominal data.
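A "check all that apply" answer is typically stored as a set of 0/1 indicator variables, one per listed factor, which is what makes the resulting data nominal. This is a hypothetical sketch; the dict representation is an assumption for illustration.

```python
# Factors from the Metro U checklist example above.
FACTORS = ["Tuition cost", "Specific program of study desired",
           "Parents' preferences", "Opinion of brother or sister",
           "Many friends from home attend", "High quality of faculty"]

def encode_checklist(checked):
    """Map a respondent's checked factors to binary (nominal) indicators."""
    checked = set(checked)
    return {factor: int(factor in checked) for factor in FACTORS}

row = encode_checklist(["Tuition cost", "High quality of faculty"])
print(row["Tuition cost"])  # 1
```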
22
Rating Response Strategy
Strongly influential / Somewhat influential / Not at all influential
Good academic reputation
Enjoyable campus life
Many friends
High quality faculty
Semester calendar
23
Ranking
Please rank-order your top three factors from the following list based on their influence in encouraging you to apply to Metro U. Use 1 to indicate the most encouraging factor, 2 the next most encouraging factor, etc.
_____ Opportunity to play collegiate sports
_____ Closeness to home
_____ Enjoyable campus life
_____ Good academic reputation
_____ High quality of faculty
When relative order of the alternatives is important, the ranking question is ideal. The ranking question is a measurement question that asks the participant to compare and order two or more objects or properties using a numeric scale. It is always best to have participants rank only those elements with which they are familiar. For this reason, ranking questions might follow a checklist question, which identifies the objects of familiarity. Avoid asking participants to rank more than seven items. Ranking generates ordinal data.
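A top-three ranking answer like the one above is usable only if the ranks 1 through 3 each appear exactly once. The validation sketch below is hypothetical: the dict-with-None representation for unranked items is an assumption, not a format the chapter prescribes.

```python
def valid_top_n_ranking(ranks, n=3):
    """True if the ranks used are exactly 1..n, each appearing once (ordinal data)."""
    used = sorted(r for r in ranks.values() if r is not None)
    return used == list(range(1, n + 1))

# One respondent's answer to the Metro U ranking question above;
# None marks items the respondent left unranked.
answer = {"Opportunity to play collegiate sports": None,
          "Closeness to home": 2,
          "Enjoyable campus life": 3,
          "Good academic reputation": 1,
          "High quality of faculty": None}
print(valid_top_n_ranking(answer))  # True
```

A check like this catches the common errors of duplicated ranks or skipped ranks before the response enters analysis.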
24
Issues Related to Measurement Questions
Exhibit 13-6 summarizes the text issues related to measurement questions. This is 1 of 3 slides.
25
Issues Related to Measurement Questions (2 of 3)
26
Issues Related to Measurement Questions (3 of 3)
Summary of Scale Types: Rating Scales
Type | Restrictions | Scale Items | Data Type
Simple Category Scale | Needs mutually exclusive choices | One or more | Nominal
Multiple Choice Single-Response Scale | May use exhaustive list or “other” | Many | Nominal
Multiple Choice Multiple-Response Scale (checklist) | Needs exhaustive list or “other” | Many | Nominal
Likert Scale | Needs definitive positive or negative statements with which to agree/disagree | One or more | Ordinal
Likert-type Scale | Needs definitive positive or negative statements with which to agree/disagree | One or more | Ordinal
Exhibit 13-7 summarizes some important considerations in choosing between the various response strategies. While all of the response strategies are available for use in Web questionnaires, there are slightly different layout options for response in Web surveys.
28
Summary of Scale Types: Rating Scales
Type | Restrictions | Scale Items | Data Type
Numerical Scale | Needs concepts with standardized meanings; needs numbers to anchor the end-points of the scale | One or many | Ordinal or Interval
Multiple Rating List Scale | Needs words that are opposites to anchor the end-points on the verbal scale | Up to 10 | Ordinal
Fixed Sum Scale | Participant needs ability to calculate total to some fixed number, often 100 | Two or more | Interval or Ratio
Exhibit 13-7 summarizes some important considerations in choosing between the various response strategies. While all of the response strategies are available for use in Web questionnaires, there are slightly different layout options for response in Web surveys.
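The fixed sum scale's restriction, that the participant's allocations must add to a fixed total, is easy to enforce in code. A minimal hypothetical sketch (the attribute names are invented; web-survey tools typically build in this check):

```python
def fixed_sum_ok(allocations, total=100):
    """True when the participant's point allocations sum to the fixed total."""
    return sum(allocations.values()) == total

# Hypothetical answer: 100 points divided among three attributes.
answer = {"price": 40, "quality": 35, "service": 25}
print(fixed_sum_ok(answer))  # True
```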
29
Summary of Scale Types: Rating Scales
Type | Restrictions | Scale Items | Data Type
Stapel Scale | Needs verbal labels that are operationally defined or standard | One or more | Ordinal or Interval
Graphic Rating Scale | Needs visual images that can be interpreted as positive or negative anchors; score is a measurement of graphical space from one anchor | | Ordinal (Interval, or Ratio)
Exhibit 13-7 summarizes some important considerations in choosing between the various response strategies. While all of the response strategies are available for use in Web questionnaires, there are slightly different layout options for response in Web surveys.
30
Summary of Scale Types: Ranking Scales
Type | Restrictions | Scale Items | Data Type
Paired Comparison Scale | Number is controlled by participant’s stamina and interest | Up to 10 | Ordinal
Forced Ranking Scale | Needs mutually exclusive choices | | Ordinal or Interval
Comparative Scale | Can use verbal or graphical scale | |
Exhibit 13-8 summarizes some important considerations in choosing between the various response strategies. While all of the response strategies are available for use in Web questionnaires, there are slightly different layout options for response in Web surveys. Exhibit 13-7, starting on the next slide, illustrates these layout options.
31
Internet Survey Scale Options
Exhibit 13-7 (1 of 3)
32
Internet Survey Scale Options
Exhibit 13-7 (2 of 3)
33
Internet Survey Scale Options
Exhibit 13-7 (3 of 3)
34
Sources of Questions
Handbook of Marketing Scales
The Gallup Poll Cumulative Index
Measures of Personality and Social-Psychological Attitudes
Measures of Political Attitudes
Index to International Public Opinion
Sourcebook of Harris National Surveys
Marketing Scales Handbook
American Social Attitudes Data Sourcebook
Exhibit 13-9 provides sources of questions from books and web sites. This slide highlights many of the books listed in the Exhibit.
35
Flowchart for Instrument Design: Phase 3
As depicted in Exhibit 13-10, instrument design is a multistep process:
Develop the participant-screening process along with the introduction. A screen question qualifies the participant’s knowledge about the target questions of interest or the experience necessary to participate.
Arrange the measurement question sequence: identify groups of target questions by topic; establish a logical sequence for the question groups and questions within groups; develop transitions between these question groups.
Prepare and insert instructions, including termination instructions, skip directions, and probes.
Create and insert a conclusion, including a survey disposition statement.
Pretest specific questions and the instrument as a whole.
36
Guidelines for Question Sequencing
Interesting topics early
Simple topics early
Sensitive questions later
Classification questions last
Transitions between topics
Reference changes limited
The question process must quickly awaken interest and motivate the participant to participate in the interview. More interesting topical target questions should come early. Classification questions that are not used as filters or screens should come at the end of the survey. The participant should not be confronted by early requests for information that might be considered personal or ego-threatening. Buffer questions are neutral measurement questions designed to establish rapport with the participant; these can be used prior to sensitive questions. The questioning process should begin with simple items and then move to the more complex, as well as move from general items to the more specific. Place taxing and challenging questions later in the questioning process. Changes in the frame of reference should be small and should be clearly pointed out. Use transition statements between different topics of the target question set. An example of a transition is provided in the Exhibit.
37
Illustrating the Funnel Approach
How do you think this country is getting along in its relations with other countries? How do you think we are doing in our relations with Iran? Do you think we ought to be dealing with Iran differently than we are now? (If yes) What should we be doing differently? Some people say we should get tougher with Iran and others think we are too tough as it is; how do you feel about it? The procedure of moving from general to more specific questions is sometimes called the funnel approach. The objectives of this procedure are to learn the participant’s frame of reference and to extract a full range of desired information while limiting the distortion effect of earlier questions on later ones.
38
PicProfile: Branching Question
Sometimes the content of one question assumes that other questions have been asked and answered; this is a branched question. In web surveys, branching allows respondents to avoid “skip patterns”: based on their answers, respondents are branched to the appropriate section of the survey. Web surveys also allow for piping. Piping takes the answer from one question and uses it in a later question. For instance, if a respondent named Diet Coke as his or her favorite beverage, a later question might ask, “Which is your favorite characteristic of Diet Coke?”
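The branching and piping described above can be sketched as a small routing function. This is a hypothetical illustration only: the question ID "favorite_beverage" and the section name "closing_section" are invented, and real survey platforms implement this declaratively rather than in code.

```python
def next_question(answers):
    """Branch: only respondents who named a favorite beverage get the follow-up."""
    beverage = answers.get("favorite_beverage")
    if beverage is None:
        # Skip logic: respondents without a qualifying answer jump past the follow-up.
        return "closing_section"
    # Piping: the earlier answer is inserted into the follow-up question's wording.
    return f"Which is your favorite characteristic of {beverage}?"

print(next_question({"favorite_beverage": "Diet Coke"}))
# Which is your favorite characteristic of Diet Coke?
```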
39
Components of Questionnaires
The Exhibit illustrates the components of a communication instrument. This is the first of three slides. Instructions to the interviewer or participant attempt to ensure that all participants are treated equally. Two principles form the foundation for good instructions: clarity and courtesy. Instruction language needs to be unfailingly simple and polite. Instruction topics include those for 1) terminating an unqualified participant, 2) terminating a discontinued interview, 3) moving between questions on an instrument, and 4) disposing of a completed questionnaire. The role of the conclusion is to leave the participant with the impression that his or her involvement has been valuable.
40
Components of Questionnaires
This is slide 2 of 3.
41
Components of Questionnaires
This is slide 3 of 3.
42
MindWriter Survey
Now is a great time to evaluate the MindWriter instrument that has been discussed in several vignettes and Close-ups. It is contained in this chapter’s Close-Up. The following two slides focus on the two parts of the questionnaire. You may hide these slides if you do not wish to examine this instrument more closely.
43
MindWriter Survey
Use this slide if you want to examine the scales more closely (Close-Up, Exhibit 13-13); otherwise, hide this slide.
44
MindWriter Survey
Use this slide if you want to examine the scales more closely (Close-Up, Exhibit 13-13); otherwise, hide this slide.
45
Overcoming Instrument Problems
Build rapport
Redesign the question process
Explore alternatives
Use other methods
Pretest
There is no substitute for a thorough understanding of question wording, question content, and question sequencing issues. However, the researcher can do several things to help improve survey results; these are listed on the slide. Most information can be secured by direct, undisguised questioning if rapport has been developed. The assurance of confidentiality can also increase participant motivation. You can redesign the questioning process to improve the quality of answers by modifying the administrative process and the response strategy. When drafting the original question, try developing positive, negative, and neutral versions of each type of question. Minimize nonresponses to particular questions by recognizing the sensitivity of certain topics. The final step toward improving survey results is pretesting, the assessment of questions and instruments before the start of a study. Pretesting can allow one to 1) discover ways to increase participant interest, 2) increase the likelihood that participants will remain engaged, 3) discover question content, wording, and sequencing problems, 4) discover target question groups where researcher training is needed, and 5) explore ways to improve the overall quality of survey data. Urge your students to review Appendix 13a, especially Exhibit 13a-5, Restructuring Questions, for some insight into overcoming problems.
46
Key Terms
Administrative question
Branched question
Buffer question
Checklist
Classification question
Dichotomous question
Disguised question
Double-barreled question
Free-response question
Interview schedule
Leading question
Multiple-choice question
47
Key Terms
Pretesting
Primacy effect
Ranking question
Rating question
Recency effect
Screen question
Structured response
Target question (structured, unstructured)
Unstructured response
48
Chapter 13 Additional Discussion Opportunities
49
Verint: Enterprise Feedback Management
“[Systems] lacking the capability to analyze captured data in a holistic manner render valuable information useless because it’s hidden and inaccessible, resulting in isolated, cumbersome decision-making.”
50
Snapshot: Mobile Questionnaires
10 or fewer questions
Simple question modes
Minimize scrolling
Minimize non-essential content
Minimize distraction
51
PicProfile: Travel Issues
You could have students work backward from this data presentation, asking them to create the questions that generated this information.
52
PicProfile: Invoke Engage
Real-time session
200 prescreened participants
Text, visual, audio presentations
Quantitative measures
Qualitative measures
This PicProfile relates to using moderators for discussions with large groups (supergroups).
53
PicProfile: Survey Length
Insight Express shares insights on how long a web survey can run before participants abandon it.
54
PicProfile: Office Romance
Spherion Workplace Snapshot
Harris Interactive QuickQuery
1,588 working adults
Men and women consider dating colleagues
Few companies have a workplace romance policy
55
Research Thought Leader
“Research that asks consumers what they did and why is incredibly helpful. Research that asks consumers what they are going to do can often be taken with a grain of salt.”
Al Ries, author, co-founder, and chairman, Ries & Ries
56
PulsePoint: Research Revelation
60: the percentage of businesses hit annually by cybercrime. See the text’s Instructor’s Manual (downloadable from the text website) for ideas for using this research-generated statistic.
57
Questionnaires and Instruments
Chapter 13 Questionnaires and Instruments
58
Photo Attributions
Slide | Source
18 | Courtesy of Wittenberg University
19, 20, 21, 23, 36 | Photographer's Choice/SuperStock
45 | Ravi Tahilramani/Getty Images
49 | Courtesy of Verint
52 | Courtesy of Invoke
54 | Digital Vision