Basics of Survey & Scale Design
Chan Kulatunga-Moruzi, PhD
Department of Family Medicine, McMaster University
Agenda
Presentation: Overview of scales and surveys
– Survey and scale: similarities
– Survey and scale: differences
– Guidelines for scale and survey construction
Group work: Identify common mistakes in survey questions
Surveys & Scales: Similarities
Tools of research
Usually ask a series of questions
Gather data pertaining to a central construct
May contain sub-constructs
– Survey example: construct = perception of PAs
– Scale example: construct = professional burnout
Surveys & Scales: Similarities
Often use rating scales (Likert-type, semantic differential)
Often self-administered
Often based on self-report
Similar issues/problems
– Social desirability bias can jeopardize validity
Scale: Description
Also known as an index or inventory
Responses: sometimes categorical, more often a rating scale
Combines an individual's responses into one meaningful number (interval level)
Examples:
– Eating Disorders Inventory
– Quality of Life Index
– Minnesota Multiphasic Personality Inventory
– Suicidal Ideation Scale
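To make the scoring step concrete, here is a minimal sketch of how item responses are typically combined into a single scale score. The item names, the 1-5 response range, and the reverse-scored items are illustrative assumptions, not taken from any of the inventories above.

```python
# Minimal scoring sketch: sum 5-point Likert items into one scale score.
# Item names and the reverse-scored set are illustrative only.

LIKERT_MAX = 5                 # items scored 1..5
REVERSE_ITEMS = {"q2", "q5"}   # hypothetical negatively worded items

def score_scale(responses: dict) -> int:
    """Return a single total score from one respondent's item responses."""
    total = 0
    for item, value in responses.items():
        if not 1 <= value <= LIKERT_MAX:
            raise ValueError(f"{item}: response {value} is outside 1..{LIKERT_MAX}")
        # Reverse-code negatively worded items so a higher score always means
        # more of the construct being measured.
        total += (LIKERT_MAX + 1 - value) if item in REVERSE_ITEMS else value
    return total

print(score_scale({"q1": 4, "q2": 2, "q3": 5, "q4": 3, "q5": 1}))  # -> 21
```

Published scales usually prescribe their own scoring rules (subscales, handling of missing items), so consult the scale's manual rather than improvising.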
Scale: Function
Used to describe a population/construct
Overall scores or sub-scores are used to:
– Make inferences
– Identify, describe, and compare
– Make decisions (e.g. treatment)
– Guide further research
Scale: Construction
Requires knowledge of the construct
– e.g. depression: symptoms, DSM and ICD-9 criteria, differentials
Requires knowledge of psychometrics
– Reliability: test-retest, internal consistency, item discrimination
– Validity: construct, external (concurrent/predictive)
– Reliability sets the upper limit of validity
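Internal consistency, one of the reliability checks listed above, is commonly estimated with Cronbach's alpha. A minimal sketch, assuming complete data and equally weighted items (the toy responses are made up):

```python
# Cronbach's alpha: alpha = k/(k-1) * (1 - sum(item variances) / variance of totals).
# rows = respondents, columns = items; values are item scores.
from statistics import pvariance

def cronbach_alpha(rows: list) -> float:
    k = len(rows[0])                                  # number of items
    items = [[r[j] for r in rows] for j in range(k)]  # transpose to per-item columns
    item_var = sum(pvariance(col) for col in items)   # sum of item variances
    total_var = pvariance([sum(r) for r in rows])     # variance of total scores
    return (k / (k - 1)) * (1 - item_var / total_var)

toy_data = [
    [4, 5, 4, 4],
    [2, 3, 2, 3],
    [5, 5, 4, 5],
    [3, 3, 3, 2],
]
print(round(cronbach_alpha(toy_data), 2))  # -> 0.95 for this toy data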
Scale: Construction
Search the literature for existing measurement scale(s):
– Use a previously validated scale, or
– Amend a previously validated scale to suit your needs
Survey: Description
Response format: a mixture is preferred
– Rating scale (Likert/semantic differential)
– Multiple choice (categorical)
– Rank order
– Open-ended
An individual's responses are not combined into one meaningful number
Survey: Function
Often used simply to describe a population
Used to inform policy/administration
Used for program evaluation
Individual questions may be used to make inferences and compare cohorts/populations
Survey: Construction
Requires some knowledge of the construct
May be exploratory, to learn about the construct
Reliability & validity are assumed:
– by securing a representative sample
– by asking well-written questions
– by using well-constructed response options
– by sound analyses
Survey & Scale Development
Start with a broad general topic, then narrow the focus:
– Identify research question(s)
– Operationalize/define concepts
Objective: What is it that you want to know?
– Can you state your objective clearly and succinctly?
– What information is necessary to meet the objective?
Start with the end in mind
Survey & Scale Development
Each question should address a research question
Each question should be relevant to the objectives (respondents' time is limited and survey fatigue is real)
Anticipate the results you might receive and think about how you will analyze the data:
– Helps you construct better questions
– Helps you choose the best question formats
Survey & Scale Development
Keep your respondents in mind:
– Who will complete your survey? Is the sample representative?
– Are respondents able to understand the questions?
– Are respondents able to answer the questions?
– How can you make the survey easy to complete?
– Are the questions relevant to all respondents?
Question Design: “BOSS”
Be BRIEF
– Keep questions short and to the point
– Avoid long lists of response alternatives to choose from or to rank order
– Take time to edit for meaning and visual clutter
Question Design: “BOSS”
Be OBJECTIVE
– Ensure questions are neutral
– Avoid leading questions
– Avoid built-in assumptions
– Avoid loaded questions
– Be cognizant of the possible impact of word choice and question phrasing/framing
Question Design: “BOSS”
Be SIMPLE
– Use simple language
– Avoid jargon and technical terminology
– Avoid double-barreled questions
Question Design: “BOSS”
Be SPECIFIC
– Avoid broad questions; they may be interpreted differently by respondents
– You may need to define/specify what you mean
Group Work: 4 Cases
Identify any problems you see with the item
Re-write the item to address the problems, paying attention to the stem & response options
Is there a better way to ask the question to meet the objectives of the research?
Case 1: Age
The stem provides no context for the question
Alternative wordings:
a. To which age category do you belong? (nominal level)
b. How old are you? / What is your date of birth? (interval level)
Trade-offs:
a. Easy to fill in and can increase the response rate, since exact age is a personal question
b. Enables better analysis, with the option to group responses later
Case 1: Age
Problems with the response options:
– Inconsistent formatting (words vs. hyphens)
– Not exhaustive (no option for older/younger students)
– Not mutually exclusive (16 is included in two options)
– Intervals not equal (3 vs. 4 years)
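One common way to avoid these problems is to collect exact age and derive categories afterwards, keeping them mutually exclusive, exhaustive, and equal in width. A minimal sketch; the cut-points and labels below are illustrative only, not from the case materials:

```python
# Minimal sketch: collect exact age, then bin it into categories afterwards.
# Cut-points and labels are illustrative only.

BINS = [(16, 20), (21, 25), (26, 30), (31, 35)]   # equal widths, no overlap

def age_category(age: int) -> str:
    """Map an exact age onto one and only one category."""
    for low, high in BINS:
        if low <= age <= high:
            return f"{low}-{high}"
    # Catch-all options keep the set exhaustive for respondents outside the bins.
    return "under 16" if age < BINS[0][0] else f"{BINS[-1][1] + 1} or older"

for a in (15, 16, 21, 40):
    print(a, "->", age_category(a))
```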
Case 2: Communication Skills
Problems with the question:
– Language: vague, wordy, jargon/too advanced (“metamorphosized over the duration of…”)
– Leading question
– Researcher assumptions: expects students to remember and accurately report back from the beginning
Case 2: Communication Skills
How might the researcher better meet his objectives?
– Have students rate their own communication skills after each patient encounter throughout the year
– Have SPs (standardized patients) rate students' communication skills after each patient encounter throughout the year
– Videotape students throughout the year and ask a blinded expert to rate their communication skills
Case 3: Engagement & Learning Outcomes
Vague stem: “Which of these activities do you engage in?” What do we mean by “engage in”?
Dichotomous response options (yes/no) reduce variability, reliability, and validity
Scaled responses (5-7 points) increase variability, reliability, and validity
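As a toy illustration of the variability point (the responses below are invented), collapsing a scaled item into yes/no makes respondents with quite different levels of engagement indistinguishable:

```python
# Toy illustration: forcing a 5-point item into yes/no discards information.
five_point = [1, 2, 3, 3, 4, 5, 5, 2, 4, 3]              # hypothetical scaled responses
yes_no = ["yes" if r >= 3 else "no" for r in five_point]  # same people, dichotomized

print(sorted(set(five_point)))   # [1, 2, 3, 4, 5] -- five distinguishable levels
print(sorted(set(yes_no)))       # ['no', 'yes']   -- only two
```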
Case 3: Engagement & Learning Outcomes
– Inconsistent pronouns (you/I)
– Double-barreled questions (class & office hours)
– Improper punctuation (?)
Case 4: Diversity & Barriers to Higher Ed
– Loaded question
– Researcher's assumptions
– Leading question
– Double-barreled question
– Response options (odd vs. even number)