From Concept to Question: Using Early Stage Scoping Interviews to Develop Effective Survey Questions to Measure Innovation in Businesses
Dave Tuttle, U.S. Census Bureau
Audrey Kindlon, National Science Foundation
ESRA 2013 Conference
Any views expressed are those of the authors and not necessarily those of the U.S. Census Bureau or the National Science Foundation.
Overview
- Background – Innovation Survey Project
- Early stage scoping (ESS) method
- Objectives of our research
- Types of data collected and how they were used
Notes: A brief overview of our talk. Emphasize that this is about developing a business survey; there are important distinctions from household surveys.
Innovation Survey Project
- Measuring business innovation is not new
- Concerns about the performance of some survey questions
- Desire to expand the types of data collected
- Research coordinated by the Organisation for Economic Co-operation and Development (OECD)
- Goals: assess the feasibility of collecting additional data on private-sector innovation; inform the design of survey questions
Notes: Measuring innovation is not new; such data have been collected for several years on the Community Innovation Surveys in EU countries and on BRDIS in the US. There are concerns about how some questions are performing, as well as a desire to expand the types of data collected. This research is funded by the National Science Foundation and coordinated by the OECD. The goals are to determine whether it is feasible to collect additional innovation data and to inform the design of survey questions.
Innovation Survey Project
- Innovation means "new or significantly improved"
- Four types of innovation: product, process, organizational, marketing
- Sales and expenditures data
Notes: Per the OECD's definition, innovation means "new or significantly improved." There are four types according to the OECD's definitions: product innovation (a good or service); process innovation (methods for making or delivering products); organizational innovation (business practices, workplace organization, etc.); and marketing innovation (strategies for targeting markets, pricing, promotions, etc.). Besides the presence of innovative activities, we are interested in the sales and expenditures associated with innovations.
What is Early Stage Scoping (ESS)?
- Qualitative interview technique
- Informs the development of survey questions
- Identifies what types of data are obtainable
- Bridges survey concepts and respondents' concepts to facilitate comprehension
- Respondents and companies are the starting point
Notes: Early stage scoping, or ESS, is analogous to focus groups for household surveys, which are not always feasible for business surveys. The method is used to understand what types of data are amenable to survey requests, to identify differences, if any, between survey concepts and respondents' ways of thinking about the activities of interest, and to facilitate comprehension of the survey concepts. Companies and their representatives serve as the starting point for developing survey questions.
ESS vs. Cognitive Interviews
Cognitive interviews – test draft questions and questionnaires:
- Wording
- Visual design
- Navigation
- Functionality
Early stage scoping:
- Evaluate survey concepts
- Inform the design of survey questions
- Target data that are consistently measurable
Notes: How is ESS different from cognitive interviewing? The two methods have different but complementary goals. Cognitive interviews test questions and questionnaires to ensure the reliability of the instrument. ESS interviews are more exploratory and focus on the validity of the survey concepts underpinning the questionnaire, as well as on reliability. ESS interviews should be conducted early in the development of a survey, before the questionnaire is drafted.
Why ESS? Snijkers & Willimack (2011)
- Concept development is the "missing link" in survey development practices
- Misunderstood role of cognitive interviews
- Incomplete survey concept specification results in "frequent backtracking" in CIs
Notes: Willimack and Snijkers point to a "missing link" in survey development: CIs are seen as a "magic bullet," yet thorough development of survey concepts is often not accomplished by the time questions are drafted. Backtracking occurs as CIs reveal conceptual shortcomings, prompting attempts to refine concepts and develop further specifications when the focus should be on testing questions and instruments.
Why ESS? Cibelli & Willimack (2010)
- Meta-analysis of cognitive interview reports
- Identified types of measurement error found in pre-testing establishment surveys
- Problems are frequently found with survey concepts
Notes: A meta-analysis of cognitive interview results for a small sample of establishment survey projects. Individual findings from CIs were coded according to the types of measurement error they reflected. The findings point to problems with survey concepts.
Why ESS? Most frequent measurement error codes for pre-testing findings:
- Ambiguous concept: 18%
- Information not available / other issues with data availability: 8%
- Inadequate instructions: 5%
- Wording issues / problematic terms: 4%
Notes: These come from the authors' list of the 10 most common sources of measurement error; this is not the whole list, just the codes that point to inadequate survey development. "Ambiguous concept" is the most frequently cited source of measurement error in their study (n=777 code assignments in 18 reports).
"Ambiguous Concept" Emerging Themes (Cibelli & Willimack 2010)
- Concept is unfamiliar or unknown to the respondent
- Interpretations of the concept vary: across respondents, and from the intent of the survey
Notes: Let's take a closer look at "ambiguous concept." The authors identified several emerging themes among the findings assigned this code.
"Ambiguous Concept" Emerging Themes (Cibelli & Willimack 2010)
- Respondents' particular circumstances are unforeseen
- Respondents are not sure how their situation fits the data request
- Respondents are unsure what business features to include or exclude
Notes: Here we get closer to the idea of inadequate concept specification.
ESS Procedures
- Semi-structured interviews with company representatives
- Non-probability sample: n=23 companies, 36 participants
- Companies: diverse with regard to size and relevant industries
- Participants: detailed knowledge of company activities
Notes: Companies were selected purposively to capture a group diverse in size and industry. Participant selection was beyond our control; when contacting companies and setting up interviews, we tried to reach people with knowledge of company operations and strategic decision-making. These generally turned out to be corporate-level officers, program directors, and the like. We avoided typical survey respondents, who tend to work in areas associated with accounting and government reporting; from past experience we knew they were usually not the people who could inform our discussion of how survey concepts relate to features of the businesses.
Objectives of our research
1. Understand context of target population
- Is "innovation" a relevant concept?
- How do respondents define innovation?
- What terms are used?
- Examples of company activities?
- Internal criteria / metrics?
Notes: The first objective is to understand the context of the target population. Is innovation relevant to business management, strategy, and decision-making? How do respondents think about "innovation," and how might they apply the concept? What activities do they consider innovative, and how do they evaluate their success?
2. Assess clarity of survey concepts
- Where does respondents' understanding of innovation agree and disagree with the survey concepts?
- How can respondents' perspectives be used as a basis for facilitating understanding?
Notes: The second objective is to assess the clarity of the survey concepts. Where are the points of agreement between the survey concepts and respondents' ways of thinking about them, and where are the disconnects? How can respondents' ways of thinking be used as the starting point for communicating the survey concepts?
3. Assess availability of data
- What knowledge and records are associated with in-scope activities?
- Who are the appropriate respondents?
Notes: The third objective is to assess the availability of data. What kinds of information exist in companies related to innovative activities? Who would be the appropriate people to respond to such a survey request?
Types of data collected and their purpose
Notes: The types of data we collected in ESS interviews, and why; illustrated with findings from our research.
1. Respondents' descriptions of innovation
Provide a basis for:
- Learning respondents' concepts and terminology
- Assessing consistency and variation among respondents, companies, and industries
- Evaluating the validity of draft survey definitions
Notes: We asked respondents to talk about innovation in their own words, before exposing them to the survey concepts. This allowed us to hear the language respondents use to describe innovation, learn how concepts and terms vary across people, companies, and industries, and begin to evaluate the validity of the draft survey concepts and definitions. Everyone has a definition of "innovation," but it is very subjective, generally centered on "new" or "unique." Respondents generally thought they understood the definitions, but for the most part these are not used by companies. Respondents applied a variety of criteria to define innovation, generally related to market success: competitive advantage, growth, risk, and so on. They also frequently referred to new products and the processes to make them, which correspond to the survey concepts.
2. Reactions to draft survey definitions
Provide a basis for:
- Detecting whether draft survey definitions are clear
- Identifying specific points of agreement and disagreement with respondents' concepts and language
- Understanding relevance to company activities
Notes: Next we exposed respondents to the draft survey definitions and recorded their reactions. This allowed us to see whether our definitions appear clear, identify which parts are clear and which deviate from how respondents think about them, and understand how the concepts relate to the activities of interest in companies. All four definitions of the types of innovation appeared to be clear, although two were more intuitive from a business perspective: product and process innovation. The definitions of organizational and marketing innovation were understandable but not intuitive; for example, businesses commonly reorganize for efficiency and may experiment with new ways of marketing, but such changes are not necessarily considered innovative.
3. Examples of company activities associated with innovation
Provide a basis for:
- Clarifying respondents' definitions
- Assessing comprehension and validity of survey concepts
- Discussing records
- Developing effective instructions (in-scope vs. out-of-scope activities)
Notes: We asked for examples of activities respondents consider innovative. This is an important query because it provides: a more objective way to understand nuances of a respondent's definition that they may not have articulated clearly; another way to assess comprehension of the concepts and their validity with regard to the company activities of interest; a concrete starting point for discussing what information companies may keep related to innovation; and guidance on survey concept specification and what kinds of instructions may be needed to ensure that in-scope activities are reported and out-of-scope activities are excluded from survey responses. Nearly all examples matched one of the four types of innovation and were correctly associated by respondents with the appropriate concept, which gave us confidence that the definitions may largely stand on their own. A few outliers suggested types of activities to specifically exclude, leading to more robust specification of the concepts.
4. Discussion of data sources
- Individual knowledge vs. records
- Level of detail
- Terminology
- Response processes
- Data quality, burden
- Who are the appropriate respondents?
Notes: We discussed what sources of information exist related to innovation, whether held in individuals' knowledge or in information systems, and at what level of detail. We also discussed the terms used (e.g., accounting labels) and response processes such as data retrieval and estimation strategies, which relate directly to the quality of responses as well as the burden our requests may impose. Finally, who are the right respondents with the relevant knowledge or access to information systems? Large businesses and those with a strong technological or logistical orientation (e.g., manufacturers, information processors, wholesalers) are more likely to have detailed information related to innovation. Generally, though, "innovation" is not found in records; sales and expenditures would have to be associated with specific innovative activities in order to report them. Thanks to careful recruiting, we believe we reached appropriate respondents with broad yet detailed knowledge of operations: corporate-level upper managers. They would likely be in a position to get assistance from people in finance-related areas in order to report sales and expenditures data associated with innovative activities.
ESS Questionnaire design strategies
- Possible need to avoid using "innovation" in questions, due to potential bias
- Operationalize concrete aspects; minimize reliance on judgments
- Parse out specific aspects of innovation and ask about them individually
Notes: A few brief comments on how our ESS findings may inform the design of the survey. Please note that we have not yet reached this stage of the project, so these comments are hypothetical. Since "innovation" may not be defined the same way by different people, we may want to avoid using the term in specific questions, asking instead about specific and concrete aspects of the concepts. For example, take apart an aggregate concept like "organizational innovation" and ask about its components ("business practices" and "workplace organization") separately.
Summary – Conduct ESS to:
- Evaluate, validate, and refine survey concepts
- Inform the design of survey questions to account for: respondent language and concepts, data availability, cognitive processes, and response processes
Notes: To summarize, conduct ESS prior to developing survey questions in order to accomplish the points above.
Contact info:
Dave Tuttle
Audrey Kindlon