1
Evaluating the Research Environment as Part of a System of Innovation: Toward Policies & Practices That Encourage Complex, Inter-Organizational Teams To Bridge the Gap Between Scientific Discovery & Commercialization
NSF Industry/University Cooperative Research Centers (I/UCRC) Program Evaluators' Semi-annual Conference, June 3-4, 2009
Gretchen B. Jordan, Ph.D., Sandia National Laboratories, gbjorda@sandia.gov
In collaboration with the Center for Innovation, University of Maryland
Work presented here was completed for the U.S. DOE Office of Science by Sandia National Laboratories, Albuquerque, New Mexico, USA under Contract DE-AC04-94AL8500. Sandia is operated by Sandia Corporation, a subsidiary of Lockheed Martin Corporation. Opinions expressed are solely those of the author.
2
Outline
– Assessing the research environment: what does it take to do excellent research that has an impact
– How this fits within an innovation system
– How can that impact be faster, better, cheaper if researchers work in research teams that cross basic and development arenas
3
Assessing the research environment and what researchers need to be high performing
4
Motivation for assessing research environment and management
– Project funded by the Office of Basic Energy Sciences in the U.S. Department of Energy beginning in 1996
– Desire to define strategies to improve research effectiveness
  – Research environment is deteriorating
  – Limited studies to date on management of science
  – Organize thinking about differences in RTD (Research, Technology & Development) organizations and circumstances
  – Examine multiple levels and linkages (portfolio, projects)
– Respond to public demand for demonstrating accomplishments
  – Legislative and administrative requirements (GPRA, PART)
  – Need for a leading indicator
5
Evolution of the project
– 19 focus groups (DOE, industrial, university) and extensive literature review
– Defined attributes and organized them within the Competing Values Framework (Cameron, Quinn, et al.), extending it for RTD
– A survey to capture employee perceptions of their research environment
  – To link to nature of work
  – To analyze and present data to encourage action plans
– Used with case studies to determine impact of specific management interventions
– Beginning to link survey findings with data on performance
– Developing management and measurement models
6
Key attributes of the research environment were determined through …
– Information from 19 focus groups of scientists and managers at three DOE laboratories, one industry lab, and one university
  – "What do you need in your research environment to do excellent work?"
  – "What attracted you to the lab and what keeps you here?"
– Study of current literature
– Developed and tested survey questions
  √ PNNL EHSD Division in 1999, Ford Research Lab in 2000
  √ SNL – 3 Centers in 1998; 17 Centers in 2001, 2003, 2008
  √ SNL and NOAA case studies in 2003-2004, 2005-2007
  √ NMSU in 2006
7
Attributes were logically grouped in a modification of the "Competing Values" framework (Cameron and Quinn, 1999). [Diagram: eleven attribute groups – Agile, Long-term Investment; Focus with Clearly Defined Goals; Quantity & Quality of Resources; Organizational Support for Research; Coordination by Managers; Rewards for Research/Work; Value of Managers of Research; Autonomy; Exploration; Internal Collaboration/Integrate Ideas; External Collaboration/Integration – arranged across four quadrants expressing the tensions of achieving organizational effectiveness: Exploration, Autonomy, & Integration; Organizational Strategy & Investment; Resources, Control & Support Systems; People Rewards & Management. Exploration, autonomy, and integration are flagged as particularly important for more radical innovation; all groups are important for ALL types of research.]
8
42 attributes with a focus on innovation
– External Collaboration/Integration: Collaboration outside the organization; Exchange ideas within the field; Exchange ideas with different fields; External teams with multiple fields
– Focus with Clearly Defined Goals: Research Vision; Research Strategies; An integrated R&D portfolio
– Quantity & Quality of Resources: Equipment for research; Lab/Physical Work Environment; Stability of funding; Quality of Technical Staff; Staffing for Optimal Mix of Skills
– Organizational Support for Research: Services for Staff; Laboratory Systems & Process; Competencies – depth; Competitiveness of Overhead Rates; Reputation for Excellence
– Control via Managers: Project Planning & Execution; Project-Level Measures of Success
– Rewards for Research/Work: Salaries; Benefits; Educational Development; Technical Career Advancement; Recognition for Merit; Respect for People
– Value of Managers of Research: Management Integrity; Technical Value Added; Overall Value-Added Management
– Autonomy: Autonomy in Decision-Making; Freedom to Explore New Ideas; Resources for Exploring New Ideas
– Internal Collaboration/Integrate Ideas: Internal Communication about Research; Collaboration inside the Organization; Internal Teams with Multiple Fields; Provide Critical Thinking for Each Other
– Exploration: Time to Think Creatively; Able to Take Risks with Ideas; Sense of Enthusiasm
– Agile, Long-term Investment: Investing in New Program Areas; Investment in Basic Research; Identify New Opportunities; Internal Resource Allocation
(Organized by the four tensions of achieving organizational effectiveness: Exploration, Autonomy, & Integration; Organizational Strategy & Investment; Resources, Control & Support Systems; People Rewards & Management.)
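As a purely illustrative sketch of how item-level survey responses could be rolled up into these attribute-group scores (the 1-5 rating scale, the two groups shown, and simple averaging are assumptions for illustration, not the actual Research Environment Survey instrument):

```python
# Illustrative only: aggregate item-level ratings (1-5 scale assumed) into
# attribute-group means. Group and item names follow the slide; the scale
# and averaging rule are assumptions, not the actual instrument.
from statistics import mean

ATTRIBUTE_GROUPS = {
    "Exploration": ["Time to Think Creatively", "Able to Take Risks with Ideas",
                    "Sense of Enthusiasm"],
    "Autonomy": ["Autonomy in Decision-Making", "Freedom to Explore New Ideas",
                 "Resources for Exploring New Ideas"],
    # ... the remaining nine groups would follow the same pattern
}

def group_scores(responses):
    """responses: dict mapping item name -> list of 1-5 ratings from staff."""
    return {
        group: mean(r for item in items for r in responses.get(item, []))
        for group, items in ATTRIBUTE_GROUPS.items()
    }

sample = {
    "Time to Think Creatively": [4, 3, 5],
    "Able to Take Risks with Ideas": [4, 4],
    "Sense of Enthusiasm": [5, 4],
    "Autonomy in Decision-Making": [3, 4],
    "Freedom to Explore New Ideas": [4, 5],
    "Resources for Exploring New Ideas": [2, 3],
}
print(group_scores(sample))
```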
9
Researchers or the project leader identify the work profile (can apply to science or technology projects). Each dimension contrasts two poles:
– Complex Task (problems are multi-dimensional) vs. Specialized Task (relatively straightforward problems)
– Large R&D (requires large-scale or specialized equipment or facilities) vs. Small R&D (doesn't require large expenditures of resources)
– Usually requires a period of years for success vs. accomplished in a year
– Requires an order of magnitude improvement or shift of primary focus vs. requires a modest improvement or customization
– Requires significant adjustments in many dimensions of product, process, and/or organization vs. no significant adjustments needed in other dimensions
– Not easy to combine the areas of expertise involved vs. if more than one area of expertise is involved, the areas are fairly easy to combine
10
What is important to RTD workers? Areas of agreement among 40 research organizations (2,200 staff in three different laboratories):
– Highest favorable ratings: Quality of staff (37); Respect for people (26); Equipment & physical environment (25); Sense of challenge & enthusiasm (23); Autonomy (18)
– Lowest favorable ratings: Identifying new projects/opportunities (28); Rewards & recognition (27); Internal research funds allocation (26); Laboratory-wide measures of success (16); Reducing overhead rate/burden (15)
– Drivers of satisfaction (in top ten): Research vision & strategies (21); Invests in future capabilities (19); Sense of challenge & enthusiasm (19); Identification of new opportunities (17); Project-level measures of success (17)
– Drivers of view on trend (in top ten): Research vision & strategies (27); Investment in future capabilities (28); Identification of new opportunities (20); Decisive, informed management (19); Champion long-term research (18); Reward and recognize merit (18)
Note: Does not include data from 2003 forward.
11
Analyzing differences across time can be useful, especially when tied to management or external changes in that period. [ANOVA table comparing survey results in 200X and 200X + 2; data shown here are notional.]
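A minimal sketch of that kind of comparison, using invented data in place of the slide's notional figures (scipy's one-way ANOVA is used here as a stand-in; the slide does not say which statistical package the study used):

```python
# Notional example: test whether mean favorability for one attribute differs
# between two survey waves (e.g., 200X vs. 200X + 2). All numbers are invented.
from scipy import stats

wave_1 = [3.2, 3.8, 4.1, 3.5, 3.9, 4.0, 3.6]   # one attribute's ratings in 200X
wave_2 = [3.9, 4.2, 4.4, 4.0, 4.3, 4.1, 3.8]   # the same attribute in 200X + 2

f_stat, p_value = stats.f_oneway(wave_1, wave_2)
print(f"F = {f_stat:.2f}, p = {p_value:.3f}")
# A small p-value suggests the change over the period is unlikely to be chance
# alone; attributing it to management or external changes still requires the
# kind of case-study evidence the slides describe.
```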
12
An innovation system and our systems evaluation framework (Jordan, Hage, and Mote)
13
An Intellectual Call to Arms: the call for a "Science of Science and Innovation Policy"
"Are we funding all the R&D we need to defend ourselves, improve and sustain our quality of life, and compete with other nations in a globalized high-technology economy? … How much should a nation spend on science? What kind of science? How much from private versus public sectors? Does demand for funding by potential science performers imply a shortage of funding or a surfeit of performers? … We need econometric models that encompass enough variables in a sufficient number of countries to produce reasonable simulations of the effect of specific policy choices."
– John Marburger, Director, Office of Science and Technology Policy, Executive Office of the President, April-May 2005 (Source: Bhavya Lal, STPI, at AEA 2006)
14
Summary – What We Know: parts are studied and understood better than the whole! [Image: "The Blind Men and the Elephant," from http://www.cs.unibo.it/schools/AC2005/docs/Bertinoro.ppt]
15
A Science of Science and Innovation Policy must build a theory that connects levels:
– Micro: the research team and the research organization
– Meso: the sector's idea innovation network
– Macro: the sector's national and global context
16
Theories that guide our framework
– Research Team: management of innovation literature, learning theory
– Research Organization: organizational innovation theories; Research Profiles theory
– Science/technological Sector: Idea Innovation Network on the RTD process; network theories; sector economic models
– National and global context: modes of coordination theories; institutional and institutional change theory
A framework first presented at New Frontiers of Evaluation, Vienna, Austria, April 24-25, 2006.
17
Our aim: an evaluation framework that answers national policy makers' questions
A fruitful way to do this is to improve and connect existing theories to identify blockages and bottlenecks to innovation (new rationales for policy) at the levels of
– Organizations
– Networks of organizations
– Macro institutional rules
to answer fundamental questions such as
– How much RTD funding goes to which technological and service sectors, RTD arenas, and performers?
– Are we developing commercially/mission successful products and services, and how fast?
– How do we best contribute and coordinate at the national level?
18
Micro level questions – allocation of RTD funds within a sector
– Possible blockages and bottlenecks: amount of funds (public vs. private) allocated to each arena; amount of funds allocated by how radical the RTD is and how large the scope of focus is within arena portfolios; presence of specific structure and management profiles in performing organizations (research profiles and environment)
– Theory suggests (given mission and technical/market opportunities): fill funding gaps; fund larger amounts where the strategy is radical advance or a large scope is needed; match funding for the organizational profile to the strategy
– Evaluation implications: gather sector-level comparative data and start to establish norms
19
Blockages to innovation at the research team level [same diagram as slide 7: the eleven research-environment attribute groups organized by the four tensions of achieving organizational effectiveness, with exploration, autonomy, and integration particularly important for more radical innovation and all groups important for ALL types of research]
20
A blockage could be the funding mix across four Research Profiles with different strategic outcomes. The profiles cross two dimensions: how radical the advance is (Incremental Advance: straightforward, intra-organizational task, vs. Radical Advance: complex, inter-organizational task) and the scope of focus (Narrow Scope: small, autonomous projects, vs. Broad Scope of Focus: large, coordinated programs).
– Be First: expand into the new at large scale (radical advance, broad scope)
– Be New: expand into the new at small scale (radical advance, narrow scope)
– Be Better: exploit the existing at large scale (incremental advance, broad scope)
– Be Sustainable: exploit the existing at small scale (incremental advance, narrow scope)
An organization or program can have a mix of the four profiles and would manage them differently.
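A minimal sketch of how the two dimensions map onto the four profiles, with an invented funding mix to show how a portfolio-level blockage might be spotted (the string encoding and the shares are assumptions for illustration):

```python
# Map the slide's two dimensions (radicalness, scope) to the four profiles.
PROFILES = {
    ("radical", "broad"):      "Be First: expand into the new at large scale",
    ("radical", "narrow"):     "Be New: expand into the new at small scale",
    ("incremental", "broad"):  "Be Better: exploit the existing at large scale",
    ("incremental", "narrow"): "Be Sustainable: exploit the existing at small scale",
}

def research_profile(radicalness: str, scope: str) -> str:
    """Return the strategic profile for a project's radicalness and scope."""
    return PROFILES[(radicalness, scope)]

# A portfolio's funding mix across the profiles (shares invented) can then be
# inspected for blockages, e.g. no funding at all for radical, broad-scope work.
portfolio = {
    ("incremental", "broad"): 0.60,
    ("incremental", "narrow"): 0.30,
    ("radical", "narrow"): 0.10,
    ("radical", "broad"): 0.00,
}
for key, share in portfolio.items():
    print(f"{share:5.0%}  {research_profile(*key)}")
```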
21
A blockage could be a lack of connectedness in the innovation process
– There is increasing differentiation of arenas in the innovation process: basic research, applied research, development research, manufacturing research, quality research, and commercialization research all feed INNOVATION.
– For successful introduction of a new product/mission solution: an RTD advance can occur in one or more arenas; ideas move between arenas; inter-organizational networks transfer tacit knowledge; manufacturing and quality research can't be ignored.
– An example of sub-networks within arenas: universities, biotech firms, pharmaceutical companies, …
The idea innovation network: Hage and Hollingsworth (2000), modifying Kline and Rosenberg (1986).
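As a hypothetical sketch of how connectedness across the six arenas might be checked for gaps (only the arena names come from the slide; the tie data are invented):

```python
# Minimal sketch: represent ties among the six research arenas and flag arenas
# with no ties as candidate bottlenecks in the idea innovation network.
ARENAS = ["basic", "applied", "development", "manufacturing",
          "quality", "commercialization"]

# Hypothetical inter-organizational ties observed in a sector study.
ties = {("basic", "applied"), ("applied", "development"),
        ("development", "commercialization")}

def connectedness(arena):
    """Count how many observed ties involve this arena."""
    return sum(arena in pair for pair in ties)

for arena in ARENAS:
    links = connectedness(arena)
    flag = "  <- possible blockage" if links == 0 else ""
    print(f"{arena:17s} ties: {links}{flag}")
# Here manufacturing and quality research have no ties, which the slide warns
# "can't be ignored" for successful introduction of a new product.
```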
22
The systems evaluation framework puts the focus on the technology sector
– Bottlenecks can be spotted more easily here
– The meso level connects macro with micro
– Mission and policy decisions are often sector specific
– Policy impacts differ by sector because sectors differ in
  – Amount of investment by RTD arena
  – Rates of technical change
[Levels: Macro – nation/state policy objectives; Meso – idea innovation network within the technological sector; Micro – organization/team]
23
Meso level questions – performance and connectedness
– Possible blockages and bottlenecks: technical achievement in real time in each arena (connected to sector performance); overall sector socio-economic performance (new sales in the product mix, speed to develop, how radical/broad); strength of networks between differentiated arenas and among small organizations within an arena
– Theory suggests (given mission and technical/market opportunities): reasons for poor performance at the three levels; where to increase transfer of tacit knowledge
– Evaluation implications: build on existing output measures and peer review; gather comparative sector data to link knowledge transfer with forms of connectedness
24
Macro level questions – resources and modes of coordination
– Possible blockages and bottlenecks: extent to which the dominant mode of coordination (market, state, association) facilitates innovation; extent to which high-risk capital is available; extent to which resources (skills, facilities) are available by arena
– Theory suggests (given mission and technical/market opportunities): arguments about market mechanisms and alternatives; location and speed of capabilities construction and destruction
– Evaluation implications: examine what state interventions help form and strengthen networks
25
All these work together. Key questions to identify innovation bottlenecks and policy objectives and effectiveness:
– Macro (institutional rules as they affect the sector): Modes of coordination – effective? Capabilities – level, mix, availability? High-risk capital – available where?
– Micro (funds allocation by arena and profile): Organizational profiles – do attributes match the profile? RTD arenas – are there sufficient funds? Portfolios – need more/less radical, larger scope?
– Meso (performance by sector and arena, across the basic, applied, development, manufacturing, quality, and commercialization research arenas that feed INNOVATION): socio-economic outcomes; technical progress; network connectedness.
If performance is not as expected, check for bottlenecks.
26
Complex research teams: What to look for to speed innovation
27
Problem: How best to increase the innovativeness of science? (Our NSF-funded project within the Science of Science and Innovation Policy program)
– The management of innovation literature argues that the complex team stimulates innovation (Brown and Eisenhardt, 1995; Hage, 1999; Kanter, 1988; Verhaeghe and Kfir, 2002; Meeus and Hage, 2006)
– Thus greater innovative advance comes from
  – More functions in cross-functional teams
  – Higher rates of communication within a project
  – Greater cross-fertilization of ideas within a project
– We expand the definition of complex to include diversity of roles and functions, specialties and disciplines, and cultures
– We add kinds of complex research teams (Jordan, 2006):
  – Small or large teams within an organization
  – Inter-organizational teams working across arenas of research
28
Our NSF-funded study approaches this problem by using the Research Environment Survey (Jordan 2003) and interviews to identify and measure
– Research Profiles on two dimensions: radicalness and scope
– Complexity of the research team, and how much communication and critical thinking occurs within teams and between teams
– Attributes of autonomy, managerial control, rewards, agility of investment, organizational strategy and support
– Mechanisms that research managers, regardless of level, use to encourage cross-fertilization despite the cognitive gap between disciplines and cultures
29
And
– Choosing projects to study from six disciplines
– Choosing some projects embedded in "Centers"
– Using interviews with managers to measure
  – Nature of the discipline (rate of change, stability of funding, interdisciplinary work, …)
  – Mechanisms for creating cross-functional teams and diverse external collaborations
  – Amount of contact with the six arenas of research
  – Various strategies that public research laboratories use to reach out to external research organizations
  – Relative success of these measures
30
Innovation, Complex Research Teams, and Problems of Integration: various ways a research team can be complex (√ check list)
– Different functional areas in management or in the doing of research, such as methodologist, experimenter, theorist, statistician
– Roles within these functional areas, e.g. idea person, critic, specialist in dynamic modeling
– Sub-specialties
– Specialties
– Disciplines
– Arenas of research
– Organizations, organizational cultures
– Regional/national cultures
31
Three Degrees of Complexity (√ check list)
1. Small teams within an organization
2. Large teams within an organization
3. Inter-organizational teams working across types/arenas of research
32
A problem for any complex team: communication requires overcoming cognitive distance
– Radical innovation is more likely the greater the cognitive distance
– BUT communication declines with cognitive distance
– Thus how to combine diverse perspectives is a challenge
[Figure (Nooteboom, 2005): novelty value rises with cognitive distance while understandability and communication decline, so learning peaks at an optimal cognitive distance.]
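A small worked example of the Nooteboom argument, using simple toy functional forms that are an assumption for illustration rather than his actual model:

```python
# Toy forms only (assumed, not Nooteboom's model): novelty value rises with
# cognitive distance d, understandability falls, and learning is their product,
# so learning peaks at an intermediate "optimal" cognitive distance.
def novelty(d):
    return d                       # more distance, more novel ideas

def understandability(d):
    return 1.0 - d                 # more distance, harder to communicate

def learning(d):
    return novelty(d) * understandability(d)

distances = [i / 10 for i in range(11)]
best = max(distances, key=learning)
for d in distances:
    print(f"d = {d:.1f}   learning value = {learning(d):.2f}")
print(f"Optimal cognitive distance under these toy forms: d = {best:.1f}")
```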
33
The problems complex teams may have overcoming cognitive distance (√ check list)
– Time and resources to develop effective project communication (shared understanding, common language)
– Reward systems that recognize teams as well as individuals
– Mechanisms to encourage collaboration inside the organization (overcome stovepipes, etc.)
– Building trust and a culture where people are comfortable providing critical thinking for each other
– Managers who can add technical value across the diversity
– Systematic identification of opportunities for projects and partners when the team or objective is complex
34
Problems of large, complex, intra-organizational teams (√ check list)
– Must integrate more people and resources
– Integrating teams as well as team members
– Integrating across intra-organizational boundaries (different goals, cultures)
– Integrating many parameters and conditions as well as knowledge sets, because they tackle broad-scoped projects which are complex
– Broad scale requires sustained commitment of large resources, while remaining open to change
– More radical research needs autonomy, but larger, more complex tasks also need coordination
– Managers must plan and execute given uncertainty
35
Problems of inter-organizational complex teams (√ check list)
– Differentiation means organizations don't do work in all areas anymore
– Teams located in different research contexts must bridge across research arenas
– Inter-organizational networks must transfer tacit knowledge
– Have to integrate across different organizations' processes and cultures
– Tension between organizational autonomy and inter-organizational ties
– Ties with other organizations bring access to resources, but questions over who owns the team's intellectual property
36
An example of integrating complex intra-organizational teams
– Built a new department doing basic and applied research for a manufacturing line
– Hired people who were flexible about different work styles
– New hires spent time defining their projects, with required input from outside the department
– Kept the department small (12) but contracted with other departments for joint work
– Co-located people with product designers
– Very competent technical and emotional leadership
37
Case study example – continued
Our research environment survey showed
– Autonomy and resources to pursue new ideas were higher here than in another co-location pilot
– Challenge was lower (due to constrained choice of problems and approach)
– Time to think was higher
Interviews revealed that to achieve integration the manager
– Required presentations by external projects
– Paved the way for joint projects
– Guided conflict resolution
– Promoted work outside the department
Although a small case study, this illustrates some general principles for maintaining balance between diversity/complexity and integration.
38
Summary and conclusions
Strengths of our innovation systems approach:
– Theories-based; captures the process of innovation
– Useful for policy makers in reformulating policies
– Balances complexity and focus
– Able to connect micro with macro levels
– Indicators help identify organizational, network, and institutional bottlenecks and suggest how these occur
– Raises questions and will help build theory, including on the effectiveness of market mechanisms for transferring tacit knowledge and on ways to break path dependency
Testing the micro level of the system:
– Research environment survey; research profiles
– Characteristics of complex teams and speeding innovation through management action
39
Selected References
Jordan, G. B., Hage, J., & Mote, J. 2008. A theories-based systemic framework for evaluating diverse portfolios of scientific work, part 1: Micro and meso indicators. In C. L. S. Coryn & M. Scriven (Eds.), Reforming the Evaluation of Research. New Directions for Evaluation, 118, 7-24.
Mote, J., Whitestone, Y., Jordan, G., & Hage, J. 2008. Innovation, networks and the research environment: Examining the linkages. International Journal of Foresight and Innovation Policy, 4(3), 246-264.
Hage, J., Jordan, G. B., & Mote, J. 2007. A theories-based innovation systems framework for evaluating diverse portfolios of research, part two: Macro indicators and policy interventions. Science and Public Policy, 34(10), 731-741.
Hage, J., Jordan, G., Mote, J., & Whitestone, Y. 2008. Designing and facilitating R&D collaboration: The balance of diversity and integration. Journal of Engineering and Technology Management, 25(4), 256-268.
Jordan, G. B. 2006. Factors influencing advances in basic and applied research: Variation due to diversity in research profiles. In J. Hage & M. Meeus (Eds.), Innovation, Science, and Institutional Change: A Handbook of Research. Oxford: Oxford University Press, 173-195.
Jordan, G. B., Hage, J., Mote, J., & Hepler, B. 2005. Investigating differences among research projects and implications for managers. R&D Management, 35(5), 501-511.
Jordan, G. 2005. What is important to RTD workers. Research Technology Management, 48(3), May-June.
Jordan, G., Streit, L. D., & Binkley, J. S. 2003. Assessing and improving the effectiveness of national research laboratories. IEEE Transactions on Engineering Management, 50(2), 228-235.
Jordan, G. B., & Streit, L. D. 2003. Recognizing the competing values in science and technology organizations: Implications for evaluation. In P. Shapira & S. Kuhlmann (Eds.), Learning from Science and Technology Policy Evaluation. Cheltenham, UK and Northampton, MA: Edward Elgar.
40
Contact Information
Gretchen Jordan, gbjorda@sandia.gov, 505-844-9075
Jerry Hage, HAGE@socy.umd.edu, 301-405-6437
Jonathan Mote, Jmote@socy.umd.edu, 301-405-9746
We welcome comments, suggestions, examples.