2004 Public Health Training and Information Network (PHTIN) Series.


1 2004 Public Health Training and Information Network (PHTIN) Series

2 Site Sign-in Sheet Please mail or fax your site’s sign-in sheet to: Linda White, NC Office of Public Health Preparedness and Response, Cooper Building, 1902 Mail Service Center, Raleigh, NC 27699. FAX: (919) 715-2246

3 Outbreak Investigation Methods From Mystery to Mastery

4

5 2004 PHTIN Training Development Team: Pia MacDonald, PhD, MPH – Director, NCCPHP; Jennifer Horney, MPH – Director, Training and Education, NCCPHP; Anjum Hajat, MPH – Epidemiologist, NCCPHP; Penny Padgett, PhD, MPH – Epidemiologist and Surveillance Officer, NCCPHP; Amy Nelson, PhD – Consultant; Sarah Pfau, MPH – Consultant; Amy Sayle, PhD, MPH – Consultant; Michelle Torok, MPH – Doctoral Candidate; Drew Voetsch, MPH – Doctoral Candidate; Aaron Wendelboe, MSPH – Doctoral Student

6 Future PHTIN Sessions September 14th – “Designing Questionnaires”; October 12th – “Analyzing Data”; December 14th – “Risk Communication”. Each session will be on a Tuesday from 10:00 am - 12:00 pm (with time for discussion)

7 Session I – VI Slides After the airing of each session, NCCPHP will post PHTIN Outbreak Investigation Methods series slides on the following two web sites: NCCPHP Training web site: http://www.sph.unc.edu/nccphp/training/index.html North Carolina Division of Public Health, Office of Public Health Preparedness and Response http://www.epi.state.nc.us/epi/phpr/

8 Session III “Interviewing Techniques”

9 Today’s Presenters Anjum Hajat, MPH Epidemiologist, NC Center for Public Health Preparedness Martha Salyers, MD, MPH Team Leader, Public Health Regional Surveillance Team 6, Buncombe County Health Center Sarah Pfau, MPH Moderator

10 “Interviewing Techniques” Learning Objectives Upon completion of this session, you will: Recognize the interrelatedness of interview techniques and questionnaire design Understand key survey research terms Understand the advantages and disadvantages of face-to-face and telephone interview methods

11 Learning Objectives (cont’d.) Understand the advantages and disadvantages of mail and Web-based survey implementation Know what to cover in interviewer training Recognize good interview techniques Understand confidentiality concerns from the perspectives of both the respondent and the outbreak investigation

12 Basic Steps of an Outbreak Investigation 1. Verify the diagnosis and confirm the outbreak 2. Define a case and conduct case finding 3. Tabulate and orient data: time, place, person 4. Take immediate control measures 5. Formulate and test hypotheses 6. Plan and execute additional studies 7. Implement and evaluate control measures 8. Communicate findings

13 Interviewing Techniques Introduction

14 The role of interviews in outbreak investigations Types of interviewing methods Interrelatedness of interview method and questionnaire design Key survey research concepts –Sampling –Response rates

15 Role of Interviews in Outbreak Investigations Primary purpose: data collection Case identification Risk factor identification Hypothesis generation

16 Interviewing Methods 1. Interviewer Administered: face-to-face; telephone 2. Self Administered: mail-out; email; Web-based 3. Combination of 1 and 2

17 Questionnaire Design The choice of interview method is influenced by: the length and format of the questionnaire; the question types used in the survey; and cost considerations for survey implementation

18 Questionnaire Design September 14th PHTIN Session: “Designing Questionnaires”

19 Sampling

20 Sampling is the systematic selection of a portion of the larger source population. A sample should be representative of the larger source population.

21 Sampling Source population: students (12,000) → sampled population (150 students)

22 Sampling Why Sample? Because it is more efficient – saves time and money!

23 Sampling Sample Size Is the purpose of the study to determine the source of the outbreak? – A small number of cases and controls can reveal risk factors for infection. Is the purpose of the study to determine the number of persons who become sick over a specific period of time (attack rate)? – A cohort study would require a larger sample.
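The attack-rate idea from slide 23 can be sketched in a few lines of Python. This is a minimal illustration only; the cohort numbers below are hypothetical, not from the session:

```python
def attack_rate(ill: int, at_risk: int) -> float:
    """Proportion of the at-risk cohort that became ill during the period."""
    return ill / at_risk

# Hypothetical cohort study: 300 banquet attendees, 45 became ill
print(f"Attack rate: {attack_rate(45, 300):.1%}")
```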

24 Sampling Types of Sampling Simple Random Sample (SRS): randomly select persons to participate in the study. There are many variations of SRS. Convenience Sample: choose those individuals who are easily accessible.
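The contrast on slide 24 can be sketched in Python. The student IDs below are invented for illustration, matching the 12,000-student source population and 150-student sample from slide 21:

```python
import random

# Hypothetical source population: 12,000 student IDs
source_population = [f"student_{i:05d}" for i in range(12_000)]

# Simple random sample (SRS): every student has an equal chance of selection
srs = random.sample(source_population, k=150)

# Convenience sample: e.g., the first 150 students who walk past a sign-up
# table -- easy to collect, but may not represent the source population
convenience = source_population[:150]
```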

25 Sampling Problems with Convenience Sampling Based on subjective judgment Cases may or may not be representative of the total population May lead to biased results

26 Sampling Additional Resources: http://www.sph.unc.edu/nccphp/training/all_trainings/at_sampl.htm 1.“Sampling Case Studies” 2.“Survey Sampling: Precision, Sample Size, and Conducting a Survey” 3.“Survey Sampling Terminology and Methods”

27 Response Rates

28 Response rates measure the percentage of your sample that has participated in your survey. Example: Using the campus directory, you email a survey to a random sample of 100 freshmen. 40 of those students complete the survey and return it electronically. Your response rate is 40%.
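Slide 28's calculation, as a short sketch (the 40-of-100 figures are the slide's own example):

```python
def response_rate(completed: int, sampled: int) -> float:
    """Percentage of the sample that completed and returned the survey."""
    return 100.0 * completed / sampled

# Slide 28's example: 100 freshmen emailed, 40 surveys returned
print(f"Response rate: {response_rate(40, 100):.0f}%")
```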

29 Response Rates High response rates help ensure that survey data are representative of the source population and that results will be valid.

30 Response Rates Types of Non-response Non-contact: No one at home Refusal to participate Inability to participate (due to language barrier or physical or mental condition)

31 Response Rates What is an average response rate?

32 Response Rates Determining Response Rates Refer to the American Association for Public Opinion Research website: www.aapor.org – Link to the document titled “Standard Definitions” from the home page.

33 Interviewer Administered Data Collection Considerations

34 Interviewer Administered Data Collection Advantages and disadvantages of face-to-face interviews Advantages and disadvantages of telephone interviews Computer Assisted Interviews – PHRST Region 5 PDA initiative

35 Interviewing Methods 1. Interviewer Administered: face-to-face; telephone

36 Face-to-Face Interview Advantages: Higher response rate Longer survey instrument Can have more complex skip patterns More accurate recording of responses –Less item non-response Appropriate for hard to reach populations (e.g., illiterate, institutionalized)

37 Face-to-Face Interview Disadvantages: Costly Potential for interviewer error Less anonymous than self-administered –Participants less inclined to be honest

38 Telephone Interview Advantages: Less costly than face-to-face Higher response rates than mailed Quicker access to participants Supervision of interviewers feasible Can collect more sensitive information Survey design can be more efficient

39 Telephone Interview Disadvantages: Lower response rates than face-to-face Shorter questionnaires used Unable to capture important visual information (e.g., rash, working conditions) Under-coverage (e.g., population without phones)

40 Percentage of Households with No Telephone Service by County, NC Data source: 2000 U.S. Census

41 Computer Assisted Interviewing (CAI) CATI – Computer Assisted Telephone Interviewing CAPI – Computer Assisted Personal Interviewing ACASI – Audio Computer Assisted Self-Interviewing

42 CAPI Example: PHRST Region 5 In PHRST Region 5, NC public health professionals are training to use PDAs* for outbreak investigation and rapid needs assessment face-to-face interviews. * PDA: Personal Digital Assistant, also sometimes called a hand-held computer, palmtop, or pocket computer. To learn more about this technology initiative, please contact Steve Ramsey at sramsey@co.guilford.nc.us

43 CAPI Example: PHRST Region 5

44 Self Administered Data Collection Considerations

45 Self-administered Data Collection Advantages and disadvantages of mailed questionnaires Advantages and disadvantages of Web-based questionnaires

46 Interviewing Methods 2. Self Administered: mail-out; email / Web-based

47 Mailed Questionnaire Advantages: More anonymous May collect more honest responses No interviewer error Less expensive Respondent has more time to think about question

48 Mailed Questionnaire Disadvantages: Questionnaire must be simple Higher item non-response Lower response rate Data collection takes more time Sample population must be literate Coverage / frame deficiencies

49 Web-based Questionnaire Advantages: Among some populations, most people may have access to the Web / email Inexpensive and fast No data entry required –Improves data quality Many vendors send data in a variety of formats

50 Web-based Questionnaire For a list of vendors that provide Web-based survey tools, please visit: http://www.surveymonkey.com/Pricing.asp

51 Web-based Questionnaire Example: Dartmouth College: 698 (13.8%) of 5060 students had conjunctivitis in spring 2002. To identify risk factors: – a Web-based questionnaire was set up – e-mail sent to 3682 undergraduates – no data entry, allowing rapid analysis – 1832 responded (50% response rate) Source: An outbreak of conjunctivitis due to atypical Streptococcus pneumoniae. N Engl J Med. 2003;348(12):1112-21.

52 Web-based Questionnaire Disadvantages: Mandatory access to and experience with Internet Potential connection speed and hardware / software capacity limitations Potential for multiple responses from one individual Potential for responses from non-sampled respondents Need email address list to contact sample

53 Question and Answer Opportunity

54 Standardizing Interviews

55 The goal of standardization is to minimize error, thereby yielding better data quality. Interviewer error is minimized by making surveys more standard, or consistent.

56 Error Interviewer Error Definition: Deviation from expected answer due to the effects of interviewers.

57 Interviewer Error Example: Gonorrhea outbreak Bias: interviewers are not told to probe on the sexual history section. Variance: a male interviewer may elicit different responses from a female respondent than a female interviewer would.

58 Error Additional Resource Schwarz, N., Groves, R., and Schuman, H. (1998). “Survey Methods.” Chapter 4 in Gilbert, D. et al. (Eds.), The Handbook of Social Psychology. Boston: McGraw-Hill; pp. 143–179.

59 Standardizing Interviews Contributing Factors: 1.Question wording 2.Interviewer selection 3.Interviewer training 4.Interviewing procedures 5.Supervising interviewers

60 1. Question Wording

61 Question Wording Criteria for Standardized Interview Questions Must be fully scripted Must mean the same thing to every respondent More discussion to follow in the September 14th PHTIN session, “Designing Questionnaires”

62 2. Interviewer Selection

63 Interviewer Selection Criteria for Telephone Interviewer Selection Ability to read questions fluently Clear and pleasant telephone voice Responds quickly to respondent’s questions Reliability

64 Interviewer Selection Criteria for Face-to-Face Interviewer Selection Logistical skills (reading maps) Good interpersonal skills Independent workers Reliability In certain circumstances, parallel demographic characteristics among interviewers and interviewees

65 3. Interviewer Training

66 Interviewer Training Training is NOT optional! Trainings must be interactive Interviewers must practice reading questions out loud Provide support documentation (manual)

67 Interviewer Training What to cover Purpose of survey Respondent selection process Administering questionnaire Logistics Answering respondents’ questions Tracking calls / completed surveys Confidentiality

68 Interviewer Training Respondent Selection Process Provide proxy respondent rules for adults and children because proxy response impacts: –Data quality –Sampling

69 Interviewer Training Questionnaire Administration To establish legitimacy of the survey upon first contact, tell the respondent: Who is calling What is requested Why respondent should cooperate How respondent was chosen

70 Interviewer Training Logistics Face-to-face: reading maps; getting to respondents’ homes; reimbursement; dress code; scheduling callbacks. Telephone: operation of equipment; operation of CATI software (if applicable).

71 Interviewer Training Other Considerations Record some resolution to each question –Are missing responses due to skip patterns or errors? Review interview after completion –Missing responses –Illegible responses

72 Interviewer Training Interviewer Manual An interviewer manual serves as a reference to interviewers during interviews and as survey documentation.

73 Interviewer Training Suggested Interviewer Manual Contents Background information Fieldwork Interviewing techniques Survey instrument terms and definitions

74 5 minute break

75 Interviewer Training Program Example Behavioral Risk Factor Surveillance System (BRFSS)

76 BRFSS Interviewer Training On-line training covers: Why BRFSS data are important, how data are used Interviewer responsibilities Nuts and bolts of the interviewing process Interviewing techniques

77 BRFSS Interviewer Training On-line interviewer training available at: http://apps.nccd.cdc.gov/BRFSS_Training_Int/overview.asp General information about BRFSS: http://www.cdc.gov/brfss/

78 4. Interviewing Procedures

79 Interviewing Procedures Rules Read questions exactly as worded Probe inadequate answers, if necessary Record answers without interviewer discretion Maintain rapport with respondents Maintain an even pace

80 Interviewing Procedures Read questions exactly Read entire question before accepting an answer Clarify questions if necessary

81 Interviewing Procedures Read questions exactly Use only standard definitions / clarification provided Use the phrase: “Whatever x means to you”, OR “Whatever you think of as x.” When asked to repeat only one of several response options, repeat ALL options given for a question

82 Interviewing Procedures Probe A probe is a standardized way to obtain additional information from a respondent. Use probes when a respondent’s answer is unclear or irrelevant.

83 Probe Examples of responses requiring a probe: Interviewer: "In the past two weeks, have you been swimming in a public pool?” Irrelevant Response: “I swam in a lake at a national park last month." Unclear Response: “I stayed in a hotel with a pool when I was on vacation last week."

84 Interviewing Procedures Standard Probe Examples Repeat the question Retrieve receipts / calendars What do you mean? How do you mean? If respondent has narrowed down answer: –Which would be closer? –If you had to choose, which would you pick?

85 Interviewing Procedures Recording Answers Do not direct respondent toward an answer (leading) Do not assume that an “answer” received in passing is correct Do not skip questions, even if “answer” was given earlier Do not remind respondent of earlier remark if answer differs from what you expect

86 Probing versus Leading Example: Interviewer: In the last 7 days, how many times did you eat prepared food at the dorm cafeteria? Would you say: a. None b. Once c. Twice d. 3 times e. More than 3 times Respondent: “Oh, gee, I didn’t go very often... maybe a few times.”

87 Probing versus Leading Example: Interviewer Probe (correct): “Which would be closer: none, once, twice, 3 times, or more than 3 times?” Interviewer Leading (incorrect): a. “So, would you say twice, or three times?” b. “Do you mean twice, or three times?”

88 Interviewing Procedures Maintain Rapport An interviewer should be: Nonjudgmental Noncommittal Objective

89 Maintain Rapport “Any line can be said a thousand ways.” - BRFSS interviewer training Interviewers can put respondents at ease by doing the following: Read the questions in a friendly, natural manner Speak at a moderate rate of speed Sound interested Strive for a low-pitched voice

90 Feedback Helps Maintain Rapport Feedback is a statement or action that indicates to the respondent that he or she is doing a good job. – Give feedback only for acceptable performance, not “good” content. – Give short feedback phrases for short responses, longer feedback for longer responses. – Specific study information and interviewer task-related comments can serve as feedback. – Telephone interviewers should give feedback for acceptable respondent performance 30-50% of the time.

91 Feedback Examples “I see…” “Uh-huh” “Thank you / Thanks” “That is useful / helpful information” “I see, that is helpful to know” “That is useful for our research” “Let me get that down” “I want to make sure I have that right (REPEAT ANSWER)” “We have touched on this before, but I need to ask every question in the order that it appears in the questionnaire”

92 Interviewing Procedures Maintain Even Pace Pace refers to the rate of progression of the interview. Pace can vary by question type. Let the respondent set the pace.

93 Question and Answer Opportunity

94 Activity: Correct Interview Procedures Probing vs. Leading vs. Feedback Completion time: 5 minutes

95 Activity Interviewer: “Are you still experiencing diarrhea?” Respondent 1: “I’m not sure” Respondent 2: “I definitely had diarrhea last Tuesday” Respondent 3: “Yes” Activity Instructions: How should the interviewer respond to these 3 answers? Provide an example of a clarification, probe, or feedback that the interviewer could use. Try to think of one correct use of each technique.

96 Activity Suggested Answer Respondent 1: “I’m not sure” Try a clarification: “For the purposes of this survey, we consider diarrhea to be 3 or more loose bowel movements in a 24 hour period.”

97 Activity Suggested Answer Respondent 2: “I definitely had diarrhea last Tuesday” Try a Probe: “OK, but are you still experiencing diarrhea?”

98 Activity Suggested Answer Respondent 3: “Yes” Good Feedback: “I see” Bad Feedback: “Are you sure?” (leading)

99 5. Supervising Interviewers

100 Supervising Interviewers Monitoring, evaluation, and feedback given to interviewers should focus on the way interviewers handle the question-answer process.

101 Other Supervision Tasks Scheduling interviewers –Number of interviewers needed –Time calls / visits will be made Setting up interview space Tracking who has been called and who has not Reviewing data from completed interviews

102 Confidentiality

103 Human Subjects & Informed Consent Outbreak investigations are considered a public health emergency, with the purpose of identifying and controlling a health problem. Neither informed consent nor Institutional Review Board (IRB) clearance is required.

104 Confidentiality Human Subjects & Informed Consent If further analysis of outbreak investigation data is conducted for the purpose of research, IRB approval should be obtained.

105 Confidentiality Respondent Perspective Opening statement of every interview should indicate that all information collected will be kept confidential.

106 Confidentiality Outbreak Investigation Perspective Do not discuss details about the outbreak Provide only a brief description of the purpose of the survey at first contact

107 Confidentiality Example: Violation of respondent’s confidentiality from BRFSS training http://apps.nccd.cdc.gov/BRFSS_Training_Int/confidential_2.asp

108 5 minute break

109 Guest Expert Lecturer Martha Salyers, MD, MPH Team Leader, PHRST 6 Buncombe County Health Center

110 North Carolina Hurricane Isabel Rapid Needs Assessment September 2003

111 Rapid Needs Assessment

112 Background September 20, 2003: Hurricane Isabel en route to NC coast (Beaufort County) Access to Outer Banks not possible because of travel restrictions NC Emergency Management positioned regionally for response

113 Rapid Needs Assessment Background RNA process had been used in other disasters, e.g., Ankara earthquake 1999 Decision to perform RNA made as Isabel approached the NC coast 5 PHRSTs called to Raleigh; 2 PHRSTs in affected regions stayed to serve their area

114 Rapid Needs Assessment Background Thirty census clusters selected for a survey sample across 14 counties Ten assessment teams composed of Public Health Regional Surveillance Team (PHRST) staff, UNC Chapel Hill School of Public Health students, and state agency volunteers deployed to a “forward base” in Greenville

115 Rapid Needs Assessment Purpose was to collect data about: External or flood damage to homes Access to household utilities Incidence of hurricane-related illness and injury Access to food and water Access to medical care or medication Immediate needs

116 Needs Assessment Survey Instrument

117 Survey Instrument One-page survey instrument 24 questionnaire items 33 data fields Accompanied by a one-page “explanatory notes” form for interviewers

118 Survey Instrument

119 Interviewer Training

120 Cooper Building—Public Health response base; houses Public Health Command Center Approximately 2 hours / day over 2 days—repeated on the second day for new interview team members Conducted by out-posted CDC staff

121 Interviewer Training Each item on the “explanatory notes” form covered Detailed discussion of questionnaire items Questionnaire produced in Spanish

122 Interview Process

123 Overview Assessment teams deployed in official vehicles to selected census areas From starting point, moved sequentially along roadways to collect data from seven households per cluster Data collection was paper-based Total of 210 interviews completed

124 Interview Process Interview Teams Composed of two interviewers Generally multi-disciplinary teams A radio assigned to each team

125 Interview Process Interview Teams Various approaches: Took turns interviewing –One person collated and numbered while the other interviewed Had a consistent interviewer

126 Interview Process Interview Teams Tools: Identifying clothing ID tag Writing implement Forms Clipboard Educational materials

127 Interview Process Challenges Initial selection of census tracts –In areas with relatively minor impact Differential effects –By neighborhood, by home Communication –Radios not useful; cell phones unreliable Reading maps Gasoline supply

128 Interview Process Challenges Logistics at forward base Collating and analyzing data Accountability / safety Wanting to help

129 Lessons Learned

130 Use existing materials wherever possible and tailor to your purposes Prepare in advance as much as possible Keep questionnaire brief; only ask as much as you have to know

131 Lessons Learned Train interviewers consistently Use a software program that allows for data merging Leverage partnerships innovatively

132 Lessons Learned Be flexible Be consistent—don’t be tempted to change the way you ask questions or record answers Back up your data on paper

133 Lessons Learned Don’t take unnecessary chances Assure consistent communication from field to base Working in teams is highly effective

134 Lessons Learned In a disaster, use an incident management structure such as ICS (Incident Command System) to organize response effectively –Manageable span of control –Accountability –Division of responsibility –Clear reporting relationships –Safety first

135 Lessons Learned You will be overwhelmed, so prepare beforehand to meet the challenge –Preposition resources –Prepare templates, collect exemplar documents, have expert consultants on tap –Exercise your staff –Have backups –Think innovatively about team composition

136 The Isabel Team

137 Session Summary

138 The primary purpose of interviews in outbreak investigations is to collect data for case identification, risk factor identification, or hypothesis generation. Interview methods can be interviewer administered (face-to-face or telephone) or self administered (mailed, emailed, or Web-based). There are advantages and disadvantages to employing either method.

139 Session Summary Questionnaire design and interview methods are interrelated in the overall process of an outbreak investigation. Sampling is the systematic selection of a representative portion of the larger source population to be interviewed. If the purpose of your study is to determine the point source of infection, you may be able to interview a smaller sample; if the purpose of your study is to calculate an attack rate, you may need to interview a larger sample.

140 Session Summary Survey response rates measure the percentage of your sample that has participated in your survey. Average response rates vary from as little as 56% for mailed surveys to 75% for face-to-face surveys. Non-response to surveys can be a result of no one being home, refusal to participate, or individual inability to participate (e.g., because of a language barrier or physical or mental condition).

141 Session Summary Survey data collection error is a result of both bias and variance in the interview process. Interviewer error can be prevented with adequate interviewer training and the standardization of survey instruments.

142 Session Summary Sound interviewing procedures include: reading questions exactly as they are worded; probing inadequate answers; recording answers without interviewer discretion; and maintaining rapport with respondents. Communicate established proxy respondent rules to interviewers prior to survey implementation to avoid altering the sampling method or compromising data quality.

143 Session Summary Develop and distribute an interviewer manual to provide interviewer support. Such documentation reduces error and enhances the quality of data collected. While you will be exempt from obtaining Institutional Review Board clearance and informed consent from interviewees during an outbreak investigation, you should not overlook confidentiality issues from both the respondent and outbreak investigation perspectives.

144 References and Resources 1. American Statistical Association (1997). What Is a Survey? More About Mail Surveys. Alexandria, VA: Section on Survey Research Methods, American Statistical Association. 2. American Statistical Association (1997). What Is a Survey? How to Collect Survey Data. Alexandria, VA: Section on Survey Research Methods, American Statistical Association. 3. Fowler, F. and Mangione, T. (1990). Standardizing Survey Interviewing. Newbury Park: Sage Publications.

145 References and Resources 4. Gregg, M. (Ed.). (1996). Field Epidemiology. Oxford University Press. 5. Last, J.M. (2001). A Dictionary of Epidemiology: 4th Edition. New York: Oxford University Press. 6. Levy, P. and Lemeshow, S. (1991). Sampling of Populations. John Wiley & Sons. 7. Salant, P. and Dillman, D. (1994). How to Conduct Your Own Survey. John Wiley & Sons.

146 References and Resources 8. Stehr-Green, J.K. (2002). Gastroenteritis at a University in Texas: Case Study Instructor’s Guide. Atlanta, GA: U.S. Department of Health and Human Services, Public Health Service, Centers for Disease Control and Prevention. 9. Wiggins, B. and Deeb-Sossa, N. (2000). Conducting Telephone Surveys. Chapel Hill, NC: Odum Institute for Research in Social Science.

147 Slides from today’s session Following this program, please visit one of the web sites below to access and download a copy of today’s slides: NCCPHP Training web site: http://www.sph.unc.edu/nccphp/training/index.html North Carolina Division of Public Health, Office of Public Health Preparedness and Response http://www.epi.state.nc.us/epi/phpr/

148 Next Session September 14th 10:00 a.m. - Noon Topic: “Designing Questionnaires”

149 Site Sign-in Sheet Please mail or fax your site’s sign-in sheet to: Linda White, NC Office of Public Health Preparedness and Response, Cooper Building, 1902 Mail Service Center, Raleigh, NC 27699. FAX: (919) 715-2246

