1 The Unrecognized Interviewer: Studying Respondent Behavior in an Establishment Survey of U.S. Academic Institutions
Presented at the Third International Conference on Establishment Surveys (ICES-III), June 20, 2007
Scott D. Crawford, Survey Sciences Group, LLC
Emilda B. Rivers, National Science Foundation
2 Our Establishment Survey
Survey of Graduate Students and Postdoctorates in Science and Engineering (GSS)
- Surveys U.S. academic institutions granting graduate degrees in science, engineering (S&E), and selected health-related fields
- Jointly sponsored by NSF, the National Institutes of Health, and the Department of Energy
3 Brief History of the GSS
- 1966–1971: part of the NSF grant application package
- 1972: became a full-scale survey with annual administration
- 1973: expanded to all graduate medical schools
- 1998: Web version added
4 What is collected in the GSS?
The GSS produces national estimates for S&E and selected health-related fields on:
- Fall graduate enrollment counts
  - by demographic categories
  - by main mechanisms and sources of financial support
- Postdoctoral (postdoc) appointment counts
  - by demographic categories
  - by main sources of support
  - by first professional degrees in medical and related fields
- Non-faculty research staff counts
  - by demographic categories
  - by first professional degrees in medical and related fields
5 How are the GSS data collected?
- NSF: sends survey materials to the Institutional Contact
- Institutional Contact: confirms contact information; identifies new or defunct departments/programs/centers with grad students or postdocs
- Departmental Contact: provides enrollment data for eligible departments/programs/centers
6 What potential errors exist in this effort?
- Errors of nonobservation: sampling, coverage, nonresponse
- Observational errors: respondent, instrument, mode
Very little is known!
7 The GSS Respondent: Very Little Is Known
- Are they knowledgeable about both postdocs and graduate students? If so, are they the best person to respond to these requests?
- What resources do the respondents rely upon to provide a response? Do they have direct access to the source of these data? How are the data organized?
- How do respondents translate definitional issues in their response?
8 The Establishment Response Process
(Diagram) The establishment (responder identification process, records) and the individual responder (personal motivations, resources available, establishment knowledge) together produce the response to the establishment survey.
9 The Establishment Response Process
(Diagram) The same model as on the previous slide, with the Response Behavior Survey added to it.
10 The Response Behavior Survey (RBS)
Unique characteristics of the GSS make this approach feasible:
- High response rates to the GSS demonstrate strong motivation to complete
- The institutional nature of the data collection: many respondents have responding as part of their job responsibilities
- A survey design unchanged for years may have created a culture in which it is easy for respondents to critique
- Comfort with the web in the GSS allows for a rapid (web) follow-up
11 RBS Sample Frame
Frame: the GSS department contact person list
Things considered:
- Consistency with the GSS design
- Large enough to identify trends and comparison cells in specific response characteristics (quantitative baseline study)
- Small enough to minimize burden on respondents in an ongoing study
- Some GSS departments do not have postdocs
- Given the respondent-department link, some respondents had multiple departments
12 RBS Questionnaire Content
- Respondent characteristics
- Involvement in the GSS
- Institutional characteristics
- Identifying resources for the GSS response
  - Data sources
  - Barriers to access of data sources
- Assessment of data quality
- Survey topic definitions – their use and understanding
13 GSS / RBS Survey Flow
14 RBS Contact Strategy
Timing and mode of each contact effort for the early and late samples. Approximately 45 days of data collection were required, most of it during the primary study (GSS) data collection period.

Contact | Mode | Days after previous contact
1st contact | Heads-up email from NSF | NA
2nd contact | Invitation letter mailed | 14 days
3rd contact | Invitation email sent | 7 days
4th contact | 1st email reminder sent | 6 days
5th contact | Letter reminder mailed | 3 days
6th contact | 2nd email reminder sent | 5-7 days
7th contact | 3rd email reminder sent | 4 days
8th contact | Telephone calls | All conducted in late Sept. and early Oct.
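As a quick check that the schedule above is consistent with the reported field period, the sketch below sums the between-contact intervals. It is an illustration only: the 5-7 day gap is approximated by its midpoint, and the concluding telephone calls are treated as extending the window to roughly 45 days.

```python
# Intervals (in days) between successive RBS contacts, taken from the
# contact strategy table above. The heads-up email is the starting point.
intervals = {
    "invitation letter mailed": 14,
    "invitation email sent": 7,
    "1st email reminder sent": 6,
    "letter reminder mailed": 3,
    "2nd email reminder sent": 6,  # midpoint of the reported 5-7 days
    "3rd email reminder sent": 4,
}

elapsed = sum(intervals.values())
print(f"Days from heads-up email to 3rd email reminder: {elapsed}")
# About 40 days; with the telephone calls in late September and early
# October, the field period reaches the roughly 45 days reported above.
```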
15 RBS Response & Completion Rates

Contact | Mode | Cumulative response rate (RR2) | Completion rate
1st contact | Heads-up email from NSF | 0.0% | NA
2nd contact | Invitation letter | 1.8% | NA
3rd contact | Invitation email | 24.9% | NA
4th contact | 1st email reminder | 38.8% | NA
5th contact | Letter reminder | 45.9% | NA
6th contact | 2nd email reminder | 54.6% | NA
7th contact | 3rd email reminder | 59.4% | NA
8th contact | Telephone calls | 72.1% | NA
OVERALL | | 72.1% | 84.5%
EARLY SAMPLE | | 74.7% | 85.3%
LATE SAMPLE | | 61.6% | 80.7%
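The slides do not define RR2. Assuming it refers to the AAPOR Response Rate 2, which counts partial responses as respondents and keeps cases of unknown eligibility in the denominator, the rate is computed as:

\[
\mathrm{RR2} = \frac{I + P}{(I + P) + (R + NC + O) + (UH + UO)}
\]

where I is the number of complete responses, P partial responses, R refusals and break-offs, NC non-contacts, O other eligible non-respondents, and UH and UO cases of unknown eligibility.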
16 The "It's Not Me" Phenomenon
Right from the start, the RBS uncovered something amiss in the response behavior of the departmental person "of record": 6.5% of the sample reported that they were not involved in the 2005 GSS.
Potential causes (unknown):
- Sample frame problem – ineligible department
- Institutional responder using departmental responder information for something other than originally intended – potentially indicates a need for a 2nd-layer RBS
17 Who are our GSS respondents? Demographics
Gender
- 26.2% are male
Education
- 29.0% have less than a BA/BS degree
- 25.4% have only a BA/BS
Position (top 4 categories)
- 27.8% administrative
- 24.2% administrative support
- 10.9% department chair
- 9.5% faculty
18 Who are our GSS respondents? Involvement in maintaining records
- 56.6% maintain graduate school records
- 18.5% maintain postdoc records
- The percentage maintaining postdoc records increases to 37.8% among respondents in postdoc-only departments
19 Who are our GSS respondents?
They are professional survey respondents:
- Have been involved in the GSS for a mean of 6.5 years
- 29.5% respond to other NSF surveys; 52.3% respond to other surveys in general
- However, only 7.1% have been involved with IPEDS
They may not be the most knowledgeable:
- Only 58.9% believe they are the most knowledgeable person to answer questions about postdocs
- 19.9% are users of GSS data
20 What do respondent characteristics tell us? Education
- While the survey is meant to collect data on grad students and postdocs, the responder is not likely to be of the same educational status
- Assumptions about definitions may be incorrect
- Interpretations that are affecting counts may not be appropriate
21 What do respondent characteristics tell us? Professional respondents
- These are likely to be people whose job responsibilities include completing surveys
- With 52.3% of GSS responders also responding to other general surveys, the potential exists for a more streamlined format that relies on these professionals' experience and resources
22 What do respondent characteristics tell us? Most knowledgeable
With nearly half not being the most knowledgeable respondent, this raises questions about potential problems in:
- The departmental respondent selection process
- How the survey is structured
23 What did the GSS respondents report on their own?
When asked about reporting all of the data for specific data types:
- The majority reported all of the data on graduate students
- However, most were not able to report all of the data on postdocs
  - Relied on assistance and other resources to report
  - They reported that they had less knowledge of the computer systems that manage such data
24 RBS Measures of Data Quality
- Actual measures (break-off rates, item missing data in the GSS) were not sufficient for use as measures of data quality (few break-offs and little item missing data).
- A self-reported measure of quality was used instead: respondents provided their own assessment of the quality of the data they were providing.
- Some internal validity was found, as our expectations about correlations with self-assessed data quality were borne out in further analyses.
25 Response Behavior Effects on Data Quality
Factors examined for their impact on the perception of data quality (positive, no impact, or negative):
- When there is a person who always responds to requests for student data (postdoc counts and financial support data only)
- Whether the respondent is usually selected by position or on an individual basis
- Whether the respondent is one of the people always responsible for responding to student data requests or the GSS
- Whether respondents have been involved in providing data for other surveys
26 Response Behavior Effects on Data Quality
Impact on the perception of data quality:

Positive | No impact | Negative
The data source is easily accessible by respondents | How the primary data source is maintained or stored | GSS respondents do not know whether the department has an official postdoc definition
The data source is considered accurate by the respondent | The format of the primary data source (aggregate or individual) | Respondents consider their data source "complex"
The respondent is familiar with how the data is entered in the primary data source | |
27 Response Behavior Effects on Data Quality
- It is clear that the current GSS departmental contact is, in most cases, not the right person to provide postdoc data.
- Future evaluations of postdoc data collection should focus on developing protocols for identifying the correct person.
  - Postdoc offices appear to be a key starting point for this exploration.
  - Knowledge of postdoc policies, as well as knowledge and accessibility of data sources, should be explored as a way to quickly identify the correct postdoc reporting individual.
28 Instrument Interaction in the Response Behavior Context
- A little more than half of GSS respondents indicated that they had read the instructions defining the criteria for what constitutes a postdoc.
- Their perceived data quality was much higher than that of those who had not read the criteria.
29 Instrument Interaction in the Response Behavior Context
- Fewer than half of departments' postdoc definitions are consistent with the GSS definition.
- However, there is no significant difference in self-assessed data quality between departments whose definition is consistent with the GSS definition and those whose definition is not.
- While departments/institutions have different definitions of postdocs, respondents who read the GSS definition were able to provide good-quality data (perceived). Quality problems emerged when the definition was not used.
30 Instrument Interaction in the Response Behavior Context
RECOMMENDATIONS
- Definitions and their usability must be improved.
- Efforts to improve postdoc data should examine why foreign counts are the easiest data element to provide – something about those data makes them easier to report.
31 The Practice of Responding
- Does time of response (early, middle or late) correlate with perceived postdoc data quality?
- What specific recommendations does the respondent have about improving data quality?
32 9 out of 10 Reported That the Current Data Collection Timing Was Adequate
Those who did not indicated which months would be better (chart not reproduced here).
33 How did timing of response play a role in data quality?
- Middle responders provided lower perceived-quality postdoc data than early and late responders.
- There was no difference in self-assessed data quality between early responders and late responders.
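The slides do not say how the early/middle/late comparison was made. Purely as an illustration of one conventional way to compare an ordinal self-assessed quality rating across the three timing groups, the sketch below applies a Kruskal-Wallis test to made-up ratings; the data, the 1-5 scale, and the choice of test are assumptions, not the authors' method.

```python
# Illustrative only: compare self-assessed postdoc data quality (a hypothetical
# 1-5 rating) across response-timing groups. The ratings below are invented.
from scipy.stats import kruskal

quality_by_timing = {
    "early":  [4, 5, 4, 4, 3, 5, 4],
    "middle": [3, 2, 3, 4, 2, 3, 3],
    "late":   [4, 4, 5, 3, 4, 4, 5],
}

# Kruskal-Wallis is a common choice for an ordinal outcome across 3+ groups.
stat, p = kruskal(*quality_by_timing.values())
print(f"Kruskal-Wallis H = {stat:.2f}, p = {p:.3f}")

for group, ratings in quality_by_timing.items():
    print(f"{group:>6}: mean rating = {sum(ratings) / len(ratings):.2f}")
```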
34 Postdoc Counts Varied by Response Timing to the GSS
Does lower perceived data quality (middle responders) translate into meaningful or significant survey response differences?
35 Responses: The Web Mode
- Most GSS respondents used the web survey as their primary mode of providing data.
- The majority of respondents reported a preference for the web mode.
36 Responses: The Paper Mode
- The paper survey was used more as a worksheet to collect responses prior to submission.
- Approximately one-third of respondents rely on the paper survey as a worksheet before completing the web version later.
37 Top Responses for Improving Response Time & Data Quality
Regarding GSS submission
- Providing more time
  – Earlier distribution of the GSS
  – Extending the deadline
- Improving data contacts (frequency and the right person)
Regarding their own institutions
- Improving their databases
- Improving support, reducing workload, expanding personnel
Regarding the format of the GSS
- Make the format more user-friendly, simplify the form
- Clarify wording and definitions
- Designate a contact person for support
38 Study Recommendations
- Consider additional efforts to support institutions and responders in their process of providing GSS and postdoc data.
- Given differences in response patterns uncovered in the RBS, a more extensive nonresponse analysis of GSS data is recommended.
- Research the design of paper worksheet versions of the GSS rather than a form intended to be submitted.
39 Is the RBS an effective tool? Potential strengths
- Assisted in the identification of potential frame issues ("it's not me")
- Effectively described the demographic characteristics of the individual responder
- Provided some understanding of how the respondent-establishment relationship can affect data quality
- Identified at least one area (timing of the GSS) where conflicting results show that the answer for a redesign is not so simple
40 Is the RBS an effective tool? Potential weaknesses
- Did not capture much information about other "people" who were involved in the response process
- The self-reported measure of quality has little hard evidence behind it so far – it should be validated
41 Is the RBS an effective tool?
Overall, the answer is yes. Further research is required to develop models of the response process when an individual responds for, and within the context of, an establishment.
42 Thank you.
Questions / Comments: Scott D. Crawford, scott@surveysciences.com