From Start to Pilot: Comprehensive Redesign of a Questionnaire Measuring Foreign Direct Investment Alfred D. Tuttle Rebecca L. Morrison United States Census Bureau David H. Galler United States Bureau of Economic Analysis

Overview – Redesign of BEA Form BE-605 Reimbursable project with BEA Multiple, intensive stages of research, design, testing, revision Qualitative and quantitative evaluations New BE-605 form mailed out in March 2007 First let me offer an overview of the research project I’ll be describing today: The project is the result of an agreement between Census and BEA (Bureau of…) It involved three stages of intensive research, design and evaluation, both qualitative and quantitative Culminated in the redesigned form being used for data collection this past March

Outline Background – BEA Form BE-605 Summary of stages of research Findings Next steps Now let me offer an outline of my talk…

Background BE-605 (quarterly) and BE-15 (annual) Foreign Direct Investment (FDI) – >10% of voting rights in an incorporated entity (or equivalent interest in unincorporated entity) Sample – foreign-owned businesses in US Background: My talk will focus on the BE-605, though our findings come from research that included the BE-15, a related survey. The 605 is a quarterly survey and the 15 is conducted annually. The two are linked by a common economic concept, Foreign Direct Investment. The two surveys help to formulate the US’s national income and product accounts, international transactions accounts, and international investment position. Both sample the same universe of about 14,500 foreign-owned business entities in the US at present. The conceptual underpinning of the surveys, based in part on FDI, played a significant part in our redesign effort. I’ll say more about it later.

Research stages and methods Background investigations Expert review of form Focus groups with survey analysts Observations of callbacks Respondent debriefings To summarize the stages of research: First we conducted several different background investigations, to become familiar with the BE-605 survey and with the problems respondents have with the form We reviewed the form from the position of establishment survey methodologists, looking for likely problems with terminology, formatting and visual design, navigation, etc. Next we conducted three focus group interviews with the survey analysts who collect the survey data and prepare it for publication. We wanted to learn their perspectives on the problems respondents encounter with the forms. Then we sat in on telephone calls analysts made to respondents as part of their normal collection procedure, in which the analysts verify reported data or query data not reported. Another way of triangulating on reporting problems experienced by respondents. We got to learn more about analysts’ perspectives and also about Rs’ points of view about the form. Also learned a little about how analysts resolve problems with completed reports. Our last background investigation was a round of respondent debriefings. We visited 28 companies and probed in-depth as to the responses they gave to one of the recent surveys. We were able to learn a great deal about how they were interpreting survey items, how they retrieved data from their records, and how the questionnaires caused specific problems for their reporting processes. The samples for the respondent debriefings and all the other interviews conducted for this project were convenience samples created by BEA staff. The primary criterion was location, to facilitate in-person interviews, followed by industry and size, with larger companies in industries with more complicated reporting requirements being preferred.

Research stages and methods Cognitive testing of mockups On the basis of this wealth of background material, we identified the more problematic items and sections of surveys and drafted new versions of them, using best practices in visual formatting and questionnaire design. We went on to test our redesigned versions in five rounds of cognitive interviews with 60 companies.

Research stages and methods Pilot test Respondent debriefings Evaluation questions Our findings from the cognitive testing phase informed the complete redesign of the BE-605 form, and this new version was mailed out to a sub-sample of the survey in a pilot test. We evaluated the pilot form via respondent debriefings and a set of evaluation questions added to the form.

Findings from background research Data collection and design issues: Dense formatting Economic accounting concepts differ from US GAAP Separate instructions (SHOW ORIGINAL 605 FORM AND INSTRUCTIONS) From the first phase of background research, we distilled a few essential problems with the questionnaires that tend to contribute to inaccurate reporting. First, the formatting of the forms is very compact, with dense blocks of text in small font, and they are printed on legal-sized paper, contributing to a very “busy” and intimidating look. The format also obscures the intended navigational paths through the form, and allows critical information and survey items to be easily overlooked. We also learned that the FDI concepts on which the survey is based do not quite match up with respondents’ frames of reference, which are usually based on generally accepted accounting principles (GAAP), and sometimes other rules for regulatory reporting or taxation. There are, at times, subtle but significant differences that, if not properly understood, can result in material differences in reported data from what is desired. Compounding this comprehension problem is the fact that the instructions that explain these differences are found in separate sections and are lengthy, complicated, and densely formatted. Many respondents tend not to read separate instructions, especially when survey items appear to be straightforward, and so often miss critical information needed for accurate reporting.

Recommendations Formatting recommendations Letter size pages (not legal) More “open” visual design Incorporate instructions into form: within questions adjacent to questions on facing page key instructions become questions Diagrams to augment questions To address these issues, we made several major recommendations for changes to the questionnaires, to make them easier to read and fill out, and to increase the likelihood that respondents will consider critical information necessary for correct reporting: Switching from legal- to letter-sized pages; Adding space and removing lines for a more open, less complicated visual design; Placing instructions where respondents need to see them, namely adjacent to and within the relevant questions; Also, converting certain key instructions into questions. Respondents tend not to ignore questions, as long as they see them, so making questions from crucial reporting instructions all but guarantees that Rs will be exposed to concepts they need to understand. We also created diagrams to illustrate relationships between affiliated business entities, in an attempt to make more difficult questions more comprehensible.

Original form (SHOW FRONT PAGE OF OLD FORM) Here is an example of our general formatting recommendations. First, a section of the original form…

Recommendation: More “open” design …And the same section reformatted in a more open design.

Recommendation: Embedded instructions Here is another example of our reformatting. Instructions are embedded in the questions, in this case next to a “yes” response option.

Recommendation: Adjacent instructions Here are left- and right-hand pages. On the right are the survey items, and on the left instructions that originally appeared in a separate booklet. This gives respondents immediate access to relevant instructions, rather than having to refer to and search through another document.

Recommendations: Questions from instructions, and diagrams These questions were created from instructions about the reporting unit. The reporting unit is one area of disconnect between FDI and respondents’ frames of reference. Respondents tend not to skip questions, as long as they are aware of them. Answering each question in the sequence should lead respondents to the correct reporting unit. To the right of the questions are some of the diagrams we created. We got the idea for these from corporate organization charts that respondents frequently referred to during our meetings with them. We thought that, since respondents are familiar with these kinds of diagrams, they might be useful in illustrating survey questions about inter-company relationships to improve comprehension.

Cognitive testing Problematic sections mocked-up and tested (not entire forms) 5 rounds of cognitive interviews Iterative process: mock-ups revised between rounds design features modified & re-tested Our background research informed the selective redesign of the more problematic sections of the form, though not the entire form. We then tested the draft sections in cognitive interviews with BE-605 respondents: five rounds of interviews in all, with respondents at 60 companies. This phase of the research was iterative, in that the results from each round of interviews informed modifications to the drafts, which were then tested in a subsequent round.

Findings from cognitive testing New design features were effective: Majority of respondents liked the more open format Embedded instructions would likely improve data quality Some “instructions-as-questions” and diagrams required extensive modification, though most did not We found from the cognitive testing phase that our design innovations were largely successful. The more open format was appreciated by most respondents, especially the open space, the larger fonts, and the use of shaded background to make white response boxes stand out, though a very few preferred the compactness of the original questionnaire. The instructions added in and near questions were also favorably received. There were a few cases where experienced respondents found answers to their own questions about particular items in the added instructions, and most respondents said they would refer to them if they needed to. Given the technical and complicated nature of many items in the survey, this gives us reason to think that data quality will improve as a result of adding instructions to the form. For the most part, our conversion of key instructions to questions was successful in getting respondents to attend to and comprehend those instructions. Exceptions were cases where skip patterns were required; the skips were often ignored and respondents were confused by questions that did not apply to them. Such questions, and the diagrams that accompanied them, were continually modified throughout the testing stage. However, most of the diagrams and questions made from instructions were comprehensible and found to be effective at making respondents aware of key reporting requirements. For more about the diagrams, see Tuttle/Morrison in 2006 AAPOR conference proceedings, or contact us for a copy…

Developing 605 pilot Redesigned entire BE-605 form Close collaboration with BEA survey staff Our findings from the testing phase informed our redesign of the entire BE-605 form. We worked closely with the survey personnel at BEA to apply the findings and our design guidelines to untested sections of the form, and to resolve other issues associated with them.

Pilot test Mailed 2nd quarter 2006 Sent to sub-sample (n = 653; survey sample ~4,000) Completing pilot form optional Response rate: 53% (350 returned forms) during initial 8-week processing period The pilot form was mailed out at the end of June, 2006, for the second quarter. It was mailed to a sub-sample of 653 foreign-owned US companies and other business entities (after Rs who use the electronic survey instrument were eliminated), out of a survey sample of about 4,000. A cover letter informed respondents that completing the pilot form was optional, though if they chose not to complete it they would still be required by law to fill out the original form. 350 completed pilot forms were returned during our monitoring period, a response rate of 53%.

Pilot form evaluation Respondent debriefings Evaluation questions in form We employed two methods to evaluate the pilot test of the form: First, we conducted debriefings with recent respondents, 10 in person and 13 via telephone. We also evaluated Rs’ perceptions of the form by their responses to four questions added to the end of the form.

Findings – R debriefings Favorable impressions of new form: More pleasing to the eye White reporting spaces easier to see against colored background Larger font, bold text, more space Letter size pages easier to handle Diagrams were helpful Embedded instructions helpful Our findings from the debriefings largely echoed those from cognitive testing.

Findings – R debriefings Drawbacks: More pages Booklet format makes it harder to fax Took longer to complete than original form (disruption of reporting routine) A few respondents noted some drawbacks, such as the increased physical size of the form (from 4 legal-sized pages to 16 letter-sized), and the difficulty of faxing the new booklet format. Also, several respondents said they thought that the new form took longer to complete than the original form. The reason is that they had to compare the new form to the old one and figure out the new item locations and the numbering system. However, they said that once they adapted their reporting routines to the new layout, it should take no longer to complete the new form. Further, we have some evidence that suggests that some of the perceived additional time of completion is the result of respondents taking more time to read the included instructions. If respondents are changing their responses on the basis of a better understanding of the reporting requirements, then their responses are more likely to trigger edit failures that compare current to prior-quarter figures. In those cases, the increase in edit failures would indicate improved data quality.
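The edit mechanism described here, comparing a current figure to the prior-quarter figure and flagging large swings, can be sketched in a few lines. The 25% tolerance and the item names below are illustrative assumptions, not BEA's actual edit rules.

```python
# Illustrative sketch of a prior-quarter ratio edit check.
# The 25% tolerance and the item names are hypothetical,
# not the actual edit rules used in BE-605 processing.

def edit_failures(current: dict, prior: dict, tolerance: float = 0.25) -> list:
    """Flag items whose current value deviates from the prior-quarter
    value by more than the given relative tolerance."""
    flagged = []
    for item, prior_value in prior.items():
        if item not in current or prior_value == 0:
            continue  # nothing to compare against
        change = abs(current[item] - prior_value) / abs(prior_value)
        if change > tolerance:
            flagged.append(item)
    return flagged

# Example: a large swing in one item triggers an edit failure,
# which an analyst would then verify with the respondent.
prior_q = {"net_income": 100.0, "total_assets": 5000.0}
current_q = {"net_income": 180.0, "total_assets": 5100.0}
print(edit_failures(current_q, prior_q))  # -> ['net_income']
```

A respondent who newly understands a reporting requirement and changes an item accordingly would trip exactly this kind of check, which is why a temporary rise in edit failures can signal improved, not degraded, data quality.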

Limitations Stymied by experienced Rs, reporting routines… …though even experienced Rs were helped by new formatting and instructions. There was one significant limitation to the cognitive testing and pilot respondent debriefing stages, which was that participation was voluntary, and the volunteers in nearly all cases were very conscientious, highly motivated respondents who were at least fairly familiar with the BE-605 survey. Also, respondents’ use of documentation from past reports, and their established reporting routines, limited the extent to which questions were “interpreted anew”. We were able to visit very few “problematic” respondents (e.g., brand-new 605 reporters, non-accountants, etc.), and so we did not get much insight into how new respondents would react to the design innovations. However, we do have some evidence that even experienced respondents refined their understanding of the form’s reporting requirements as a result of the inclusion of instructions in the form.

Findings – Evaluation questions Four questions appended to survey Each item received between 279 and 290 responses Lastly, we will report our findings from the four evaluation questions we added to the end of the form. Responses to the individual questions ranged from 279 to 290. We’re not able to describe the distribution of companies who responded to these questions or make any inferences about the BE-605 sample, but we’ll provide descriptions of the types of responses.

Findings – Evaluation questions Instruction placement? 79.3% - Near questions 18.3% - In separate booklet 2.4% - Not applicable – I haven’t completed the usual BE-605 The first question asked whether respondents preferred instructions near the questions, as in the pilot form, or in a separate booklet, like the original form. In the absence of a statistical test, the evidence suggests that the people who answered these questions tend to prefer the instruction format of the pilot over that of the original form.

Findings – Evaluation questions Open space and number of pages? 48.7% - less open space, fewer pages 47% - more open space, more pages 4.3% - Not applicable – I haven’t completed the usual BE-605 The second question asks respondents about their preference for open space versus more pages. Again, in the absence of a statistical test, we cannot say whether these proportions differ significantly.

Findings – Evaluation questions Use of diagrams? 65.4% - very helpful or somewhat helpful 11.7% - only a little helpful 18% - did not need them 3.9% - confusing The third question asked respondents what they thought of the diagrams accompanying some questions, whether they found them helpful, not helpful, confusing, or they did not use them. A majority of the respondents to this question found the diagrams either very helpful or somewhat helpful.

Findings – Evaluation questions Easy/difficult compared to usual form? 50.5% - harder 42.5% - easier 7% - Not applicable – I haven’t completed the usual BE-605 The last question asked whether respondents found the pilot form to be easy or difficult compared to the usual form. Again, in the absence of a statistical test, we cannot say whether these proportions differ significantly. In our debriefings of pilot form respondents, many said that to complete the new form they had to match up the questions on the old form with their locations in the new one, which added to the usual burden of completing the survey. Several of those said that once they adapt their reporting routines to the new numbering scheme, the new form would not be any more difficult than the old form. This last point was repeated by some respondents in an open-ended comments section included with the evaluation questions.

Findings – Evaluation questions Instructions preferred near questions Organization charts helpful Easier or more difficult? Preference for more/fewer pages? To summarize the findings from the evaluation questions: More respondents to these questions prefer the instructions adjacent to questions rather than in a separate document; A majority of them found the org-chart diagrams helpful. It is not clear whether the numbers of respondents who found the new form easier than the old, or who prefer the more open format over the original with fewer pages, are significantly greater than the numbers who did not.
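As a rough illustration of the statistical test these slides note is absent, a simple binomial check (normal approximation) could be run on the reported splits. The counts below are reconstructed from the published percentages with an assumed base of roughly 290 responses, so this is a sketch under stated assumptions, not an analysis of the actual response data.

```python
# Illustrative check of whether a near-even split could plausibly be 50/50.
# Counts are reconstructed from reported percentages with an assumed n;
# they are not the actual BE-605 evaluation microdata.
import math

def even_split_p_value(k: int, n: int) -> float:
    """Two-sided p-value for k successes in n trials,
    normal approximation to Binomial(n, 0.5)."""
    z = (k - n * 0.5) / math.sqrt(n * 0.25)
    return math.erfc(abs(z) / math.sqrt(2))

# Open space vs. fewer pages: ~48.7% vs. ~47% of ~290 responses,
# i.e. roughly 141 vs. 136 among those expressing a preference.
p = even_split_p_value(141, 141 + 136)
print(p)  # large p-value: no evidence the split differs from 50/50
```

By contrast, feeding in the instruction-placement split (roughly 230 of 283 preferring instructions near the questions) yields a vanishingly small p-value, consistent with the slide's reading that that preference is clear while the open-space and ease questions are too close to call.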

Summary Redesigned BE-605 went into production in March 2007 Recommended analysis of edit checks during form processing The fully redesigned BE-605 went into production starting with the first quarter 2007 report. We advised the survey managers to monitor error rates resulting from their editing process. Our limited analyses of the error rates suggest that they will be higher initially for the new form, but we believe that the error rates should drop once respondents have adjusted their reporting routines. We advised the program staff to monitor error frequencies as well as conduct more detailed qualitative studies to better understand the causes of edit failures.

Summary Multi-method approach to comprehensive survey redevelopment We have a great deal of confidence in the redesigned BE-605 form, thanks to the comprehensive, multi-method approach we took to understanding the survey and the respondents, and to revising and testing the new form. We employed three different methods during the phase of background investigation: interviews with survey analysts, observations of calls to respondents, and debriefings with recent respondents. These allowed us to gain the perspectives of both survey staff and respondents. The investigative phase gave us a rich understanding from which to redesign the form. We then tested the redesigned sections with a fairly large sample (for a qualitative study) of respondents. Our iterative approach allowed us to revise and retest problematic parts of the new form, to the point where we no longer encountered major problems. Finally, we followed up with more debriefings with respondents to the redesigned pilot form to see how they reacted to the new form. We also got the perspectives of a broader sample of pilot respondents with our evaluation questions. The survey program staff themselves were intensively involved in many parts of the redesign process. They were exposed to some of the problems with the original form from the respondents’ perspectives, and witnessed the trial-and-error process of testing the revised form. They came away from this process with a deeper understanding of their survey respondents, and implemented the new instrument with full ownership of it.

Thanks! Alfred D. Tuttle alfred.d.tuttle@census.gov Rebecca L. Morrison rebecca.l.morrison@census.gov David Galler david.galler@bea.gov