
1 1 Class 9 Analyzing Pretest Data, Modifying Measures, Keeping Track of Measures, Creating Scale Scores November 15, 2007 Anita L. Stewart Institute for Health & Aging University of California, San Francisco

2 2 Overview of Class 9 u Analyzing pretest data u Modifying/adapting measures u Keeping track of your study measures u Creating and testing scales in your sample

3 3 Summarize Data on Pretest Interviews u Summarize problems and nature of problems for each item u Determine how important problems are u Results become basis for possible revisions/adaptations

4 4 Methods of Analysis u Optimal: transcripts of all pretest interviews u For each item - summarize all problems –During standard administration –Responses to specific probes u Types of problems –Interviewer problems –Respondent problems

5 5 Methods of Analysis u Analyze dialogue (narrative) for clues to solve problems –During standard administration –Responses to specific probes

6 6 Behavioral Coding u Complements cognitive interviews u Systematic approach to identifying problems with items –“interviewer” and “respondent” problems

7 7 Examples of Interviewer “Behaviors” Indicating Problem Items u Question misread or altered –Slight change – meaning not affected –Major change – alters meaning u Question skipped

8 8 Examples of Respondent Behaviors Indicating Problem Items u Asks for clarification or repeat of question u Does not understand question u Does not know the answer u Qualified answer (e.g., it depends) u Indicates answer falls between existing response choices u Refusal

9 9 Summarize Behavioral Coding For Each Item u Proportion of interviews with each problematic behavior u For standard administration –# of occurrences of each problem divided by N, e.g., 7/48 respondents requested clarification
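A minimal SAS sketch of this tabulation, assuming a hypothetical dataset precodes with one record per interview and 0/1 flags for each coded behavior (the names clarify and misread are made up for illustration):

  /* percent of interviews in which each problem behavior occurred (flag = 1) */
  proc freq data=precodes;
    tables clarify misread / nocum;
  run;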

10 10 Behavioral Coding Summary Sheet: Standard Administration (N=20)
Item # | Interviewer: difficulty reading | Subject: asks to repeat Q | Subject: asks for clarification
1 | 2/20 | 0 | 1/20
2 | 0 | 0 | 0
3 | 1/20 | 3/20 | 2/20
4 | 0 | 1/20 | 0

11 11 Additional “Behavioral Codes” Based on Probes u Respondents may appear to answer question appropriately u Additional problems identified with probes

12 12 Examples of Behavioral Codes Based on Probes u Probe on meaning –Response indicates lack of understanding u Probe on use of response options –Response indicates options are problematic

13 13 Behavioral Coding of Probe Results “I asked you how often doctors asked you about your health beliefs. What does the term ‘health beliefs’ mean to you?” Behavioral coding: # of times the response indicated the item was not understood as intended –e.g., 2/15 respondents did not understand the meaning, based on their responses to the probe

14 14 Behavioral Coding Summary Sheet: Standard Administration (N=20) + Probes
Item # | Probes (n) | Meaning unclear | Interviewer: difficulty reading | Subject: asks to repeat Q | Subject: asks for clarification
1 | 10 | 2/10 | 2/20 | 0 | 1/20
2 | 0 | 0 | 0 | 0 | 0
3 | 15 | 4/15 | 1/20 | 3/20 | 2/20
4 | 15 | 0 | 0 | 1/20 | 0

15 15 Interpret Results u Determine if problem is common –Items with only a few problems may be fine –Items are questionable when »several types of problems were found »several subjects experienced the same problem u Another approach: –Problem identified in >15% of interviews as criterion for further exploration

16 16 Interpret Results (cont.) u Determine if common problems with an item are serious –Gross misunderstanding of the question –Yields completely erroneous answer –Couldn’t answer the question at all u Some less serious “common” problems can be addressed by improved instructions or a slight modification

17 17 Behavioral Coding: Identifies Problem Items u Solution not always obvious u How to determine ways to modify the items

18 18 Content Analysis of Entire Interview u Use qualitative analysis software (e.g., NVivo) u Review all dialogue that ensued during administration of structured items and open-ended probes –can reveal source of problems –can help in deciding whether to keep, modify, or drop items

19 19 Results: Probing Meaning and Cultural Appropriateness I asked you how often doctors asked you about your health beliefs. What does the term ‘health beliefs’ mean to you? “…I don’t want medicine” “…How I feel, if I was exercising…” “…Like religion? Not believing in going to doctors?”

20 20 Results: Probing Meaning and Cultural Appropriateness I asked you how often doctors asked you about your health beliefs. What does the term ‘health beliefs’ mean to you? “…I don’t want medicine” “…How I feel, if I was exercising…” “…Like religion? Not believing in going to doctors?” u We changed the question to “personal beliefs about your health”

21 21 Results: Probing the Meaning of a Phrase What does the phrase “office staff” mean to you? “the receptionist and the nurses” “nurses and appointment people” “the person who takes your blood pressure and the clerk in the front office”

22 22 Modification: Probing the Meaning of a Phrase What does the phrase “office staff” mean to you? “the receptionist and the nurses” “nurses and appointment people” “the person who takes your blood pressure and the clerk in the front office” u We changed the question to “receptionist and appointment staff”

23 23 Other Examples u On about how many of the past 7 days did you eat foods that are high in fiber, like whole grains, raw fruits, and raw vegetables? Probe: what does the term “high fiber” mean to you?

24 24 Other Examples u On about how many of the past 7 days did you eat foods that are high in fiber, like whole grains, raw fruits, and raw vegetables? u Behavioral coding of item –Over half of respondents exhibited a problem u Review answers to probe –Over ¼ did not understand the term Blixt S et al., Proceedings of the Section on Survey Research Methods, American Statistical Association, 1993:1442.

25 25 Other Examples (2) u I seem to get sick a little easier than other people (definitely true, mostly true, mostly false, definitely false) u Behavioral coding of item –Very few problems Blixt S et al., Proceedings of the Section on Survey Research Methods, American Statistical Association, 1993:1442.

26 26 Other Examples (2) u I seem to get sick a little easier than other people (definitely true, mostly true, mostly false, definitely false) u Review answers to probe –Almost 3/4 had comprehension problems –Most problems were around the term “mostly” (either it’s true or it’s not)

27 27 Exploring Differences by Diverse Groups u Back to issues of “equivalence” of meaning across groups u All cognitive interview analyses can be done separately by group

28 28 Results: Use of Response Scale u Do diverse groups use the response scale in similar ways? u Re questions about cultural competence of providers –Interviewers reported that Asian respondents who were completely satisfied did not like to use the highest score on the rating scale CPEHN Report, 2001

29 29 Results: Probe on Difficulty: CES-D Item “During the past week, how often have you felt that you could not shake off the blues, even with help from family and friends” u Probe: Do you feel this is a question that people would or would not have difficulty understanding? –Latinos more likely than other groups to report people would have difficulty TP Johnson, Health Survey Research Methods, 1996

30 30 Use of Response Scale (Not in Pretest) u In an exercise class of Samoans, instructor asked them to rate the difficulty of the exercise he just did on a 1-10 scale u They did not understand what he meant by a 1-10 scale –“Western” metric?

31 31 Overview of Class 9 u Analyzing pretest data u Modifying/adapting measures u Keeping track of your study measures u Creating and testing scales in your sample

32 32 Now What! u Issues in adapting measures based on pretest results

33 33 Switzer et al. reading u From class 3 section of class binder u pp. 405–406 – modifying measures

34 34 Criteria for Whether or Not to Modify Measure u Contact author –May be open to modifications or to working with you u Be sure your opinion is based on extensive pretests with consistent problems –Don’t rely on a few comments in a small pretest u Work with a measurement specialist to assure that proposed modifications are likely to solve the problem

35 35 Tradeoffs of Using Adapted Measures Advantages u Improve internal validity Disadvantages u Lose external validity u Know less about modified measure u Need to defend new measure

36 36 Strategies for Modifying u Retain original intact items (if feasible) u Add modified items –New items –Slight modifications u If modifications are extensive –Pretest your new items

37 37 Modifying response categories u If response choices are too few and/or coarse, can improve without compromising too much –Try adding levels within existing response scale

38 38 One Modification: Too Many Response Choices
SF-36 version 1: u 1 - All of the time u 2 - Most of the time u 3 - A good bit of the time u 4 - Some of the time u 5 - A little of the time u 6 - None of the time
SF-36 version 2: u 1 - All of the time u 2 - Most of the time u 3 - Some of the time u 4 - A little of the time u 5 - None of the time

39 39 Modification of Health Perceptions Response Choices for Thai Translation
Usual responses: u 1 - Definitely true u 2 - Mostly true u 3 - Don’t know u 4 - Mostly false u 5 - Definitely false
Modified: u 1 - Not at all true u 2 - A little true u 3 - Somewhat true u 4 - Mostly true u 5 - Definitely true
e.g., “My health is excellent,” “I expect my health to get worse”

40 40 Modifying Item Stems u If item wording will not be clear to your population –Can add parenthetical phrases u Have you ever been told by a doctor that you have diabetes (high blood sugar)?

41 41 Writing New Items u One approach if you find serious problems with a standard measure –Write new items that you think will be better »Same format u Always include entire original measure (if feasible) –New items are “extra”

42 42 Strategy for Modified Measures u Test measure in original and adapted form u Choose measure that performs the best

43 43 Analyzing New Measure (Scale) u Factor analysis –All original items –Original plus new items replacing original u Correlations with other variables –Does the new measure detect stronger associations? u Outcome measure –Does the new measure detect more change over time?
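A hedged SAS sketch of the factor-analysis comparison described above; the dataset and item names (pretest, orig1-orig6, new3, new5) are hypothetical:

  /* factor structure of the original items */
  proc factor data=pretest method=principal rotate=varimax;
    var orig1-orig6;
  run;

  /* same analysis with the new items substituted for the problem items */
  proc factor data=pretest method=principal rotate=varimax;
    var orig1 orig2 new3 orig4 new5 orig6;
  run;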

44 44 Overview of Class 9 u Analyzing pretest data u Modifying/adapting measures u Keeping track of your study measures u Creating and testing scales in your sample

45 45 Questionnaire Guides u Organizing your survey data and measures –Way to keep track of measurement decisions u Documents sources of measures before you forget –Any modifications

46 46 See “Sample Guide to Measures Used in Questionnaire/Survey” Handout u Type of variable u Concept u Measure u Data source u Number of items/survey question numbers u Number of scores or scales for each measure u References

47 47 Codebook: See “Sample Questionnaire Guide: Summary of Variables..” Handout u Develop codebook of scoring rules u Several purposes –Variable list –Meaning of scores –Special coding –How you want missing data handled

48 48 Item Naming Conventions u Optimal coding is to assign raw items their questionnaire number –Can always link back to questionnaire easily u Some people assign a variable name to the questionnaire item –This will drive you crazy

49 49 Variable Naming Conventions u Assigning variable names is an important step –make them as meaningful as possible –plan them for all questionnaires at the beginning u For study with more than one source of data, a suffix can indicate which point in time and which questionnaire –B for baseline, 6 for 6-month, Y for one year –M for medical history, L for lab tests

50 50 Variable Naming Conventions (cont) u Medical History Questionnaire: HYPERTMB = baseline, HYPERTM6 = 6 months

51 51 Variable Naming Conventions (cont) u A prefix can help sort variable groupings alphabetically –e.g., S for symptoms SPAINB, SFATIGB, SSOBB

52 52 Overview of Class 9 u Analyzing pretest data u Modifying/adapting measures u Keeping track of your study measures u Creating and testing scales in your sample

53 53 On to Your Field Test or Study u What to do once you have your baseline data u How to create summated scale scores

54 54 Preparing Surveys for Data Entry: 4 Steps u Review surveys for data quality u Reclaim missing and ambiguous data u Address ambiguities in the questionnaire prior to data entry u Code open-ended items

55 55 Review Surveys for Data Quality u Examine each survey in detail as soon as it is returned, and mark any.. –Missing data –Inconsistent or ambiguous answers –Skip patterns that were not followed

56 56 Reclaim Missing and Ambiguous Data u Go over problems with respondent –If survey is returned in person, review it on the spot –If mailed, call respondent ASAP and go over missing and ambiguous answers –If you cannot reach the respondent by telephone, make a copy for your files and mail the survey back with a request to clarify missing data

57 57 Address Ambiguities in the Questionnaire Prior to Data Entry u When two choices are circled for one question, randomly choose one (flip a coin) u Clarify entries that might not be clear to data entry person

58 58 Code Open-Ended Items u Open-ended responses have no numeric code –e.g., name of physician, reason for visiting physician u Goal of coding open-ended items –create meaningful categories from variety of responses –minimize number of categories for better interpretability –Assign a numeric score for data entry

59 59 Example of Open-Ended Responses 1.What things do you think are important for doctors at this clinic to do to give you high quality care? u Listen to your patients more often u Pay more attention to the patient u Not to wait so long u Be more caring toward the patient u Not to have so many people at one time u Spend more time with the patients u Be more understanding

60 60 Process of Coding Open-Ended Data u Develop classification scheme –Review responses from 25 or more questionnaires –Begin a classification scheme –Assign unique numeric codes to each category –Maintain a list of codes and the verbatim answers for each –Add new codes as new responses are identified u If a response cannot be classified, assign a unique code and address it later

61 61 Example of Open-Ended Codes Communication = 1 u Listen to your patients more often = 1 u Pay more attention to the patient = 1 Access to care = 2 u Not to wait so long = 2 u Not to have so many people at one time = 2 Allow more time = 3 u Spend more time with the patients = 3 Emotional Support = 4 u Be more understanding = 4 u Be more caring toward the patient = 4
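One way to carry these codes and their labels into the analysis file is a SAS format; a minimal sketch assuming the coded open-ended variable is named qualcare in a dataset survey (both names hypothetical):

  proc format;
    value qualcare 1 = 'Communication'
                   2 = 'Access to care'
                   3 = 'Allow more time'
                   4 = 'Emotional support';
  run;

  /* frequencies of the coded categories, displayed with their labels */
  proc freq data=survey;
    tables qualcare;
    format qualcare qualcare.;
  run;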

62 62 Verify Assigned Codes u Ideally, have a second person independently classify each response according to final codes u Investigator can review a small subset of questionnaires to assure that coding assignment criteria are clear and are being followed

63 63 Reliability of Open-Ended Codes u Depends on quality of question, of codes assigned, and the training and supervision of coders u Initial coder and second coder should be concordant in over 90% of cases

64 64 Data Entry u Set up file u Double entry of about 10% of surveys –SAS or SPSS will compare the two entries for accuracy »Acceptable: 0-5% error »If 6% or greater – consider re-entering data
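A minimal sketch of the double-entry comparison in SAS, assuming the two entry passes are stored in hypothetical datasets entry1 and entry2, each sorted by a respondent ID respid:

  /* report any values that disagree between the two entry passes */
  proc compare base=entry1 compare=entry2;
    id respid;
  run;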

65 65 Print Frequencies of Each Item and Review: Range Checks u Verify that responses for each item are within acceptable range –Out of range values can be checked on original questionnaire »corrected or considered “missing” –Sometimes out of range values mean that an item has been entered in the wrong column »a check on data entry quality
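A sketch of the range check in SAS, assuming raw items keep their questionnaire numbers as names (q1-q40 is hypothetical):

  /* frequencies for every item, including missing, to flag out-of-range values */
  proc freq data=survey;
    tables q1-q40 / missing;
  run;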

66 66 Print Frequencies of Each Item and Review: Consistency Checking u Determine that skip patterns were followed u The number of responses within a skip pattern needs to equal the number who answered the “skip in” question appropriately

67 67 Print Frequencies of Each Item and Review: Consistency Checking (cont.) 1. Did your doctor prescribe any medications? (yes, no) 1a. If yes, did your doctor explain the side effects of the medication? u If 75 respondents (of 90) said yes to question 1, expect 75 responses to question 1a. –Often you will find that more people (e.g., 80) answered the second question than were supposed to
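A sketch of this skip-pattern check in SAS, with hypothetical item names q1 (filter) and q1a (follow-up):

  /* cross-tabulate the filter item against the follow-up;
     q1a responses from people who answered "no" to q1 are violations */
  proc freq data=survey;
    tables q1*q1a / missing;
  run;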

68 68 Print Frequencies of Each Item and Review: Consistency Checking (cont.) u Go back to the questionnaires of those with problems –check whether the initial “filter” item was incorrectly answered or whether the respondent inadvertently answered the subset –sometimes you won’t know which was correct u Hopefully this was caught during the initial review of the questionnaire and corrected by asking the respondent

69 69 Deriving Scale Scores u Create scores with computer algorithms in SAS, SPSS, or other program u Review scores to detect programming errors u Revise computer algorithms as needed u Review final scores

70 70 Creating Likert Scale Scores u Translate codebook scoring rules into program code (SAS, SPSS): –Reverse all items as specified –Apply scoring rules –Apply missing data rules u Sample for SAS (see handout)
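The SAS handout itself is not reproduced here; a minimal sketch of these steps under assumed names (dataset survey, items pain1-pain5, with pain3 reverse-scored and an "at least 3 of 5 items answered" missing-data rule):

  data scored;
    set survey;
    /* reverse-score a 1-5 item as specified in the codebook */
    pain3r = 6 - pain3;
    /* scale score = mean of available items; require at least 3 of 5 answered */
    if n(of pain1 pain2 pain3r pain4 pain5) >= 3 then
      painscal = mean(of pain1 pain2 pain3r pain4 pain5);
    else painscal = .;
  run;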

71 71 Testing Scaling Properties and Reliability in Your Sample for Multi-Item Scales u Obtain item-scale correlations –Part of internal consistency reliability program u Calculate reliability in your sample (regardless of known reliability in other studies) –internal-consistency for multi-item scales –test-retest if you obtained it

72 72 SAS – Chapter 3: Assessing Reliability with Coefficient Alpha u Review statements and output u How to test your scales for internal consistency and appropriate item-scale correlations

73 73 SAS – Chapter 3: Assessing Scale Reliability with Coefficient Alpha u PROC CORR –DATA=data-set-name –ALPHA –NOMISS –VAR (list of variables) u Output: –Coefficient alpha –Item correlations –Item-scale correlations corrected for overlap
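Assembled into a runnable sketch (the dataset scored and items pain1-pain5 are the same hypothetical names used in the scoring example above):

  /* coefficient alpha, inter-item correlations, and item-scale correlations
     corrected for overlap */
  proc corr data=scored alpha nomiss;
    var pain1 pain2 pain3r pain4 pain5;
  run;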

74 74 Testing Reliability in Stata u www.stata.com/help.cgi?alpha u alpha varlist [if] [in] [, options] u SEE HANDOUT

75 75 What if Reliability is Too Low? u Have to decide if you need to modify a scale u New scales under development –Modify using item-scale criteria u Standard scales – cannot change –Simply report problems as caveats in your analyses u If problem is substantial –Can create a modified scale and report results using standard and modified scale

76 76 Homework for Class 10 u Summarize briefly your pretest results u Indicate whether the measure appears to be appropriate for the people in your pretest –No inferences to broader sample needed.

