Meeting an Evaluation Challenge: Identifying and Overcoming Methodological Problems. Joint CES/AEA Evaluation 2005, "Crossing Borders, Crossing Boundaries"


Meeting an Evaluation Challenge: Identifying and Overcoming Methodological Problems
Joint CES/AEA Evaluation 2005, "Crossing Borders, Crossing Boundaries"
RTD TIG, Think Tank Session, Toronto, Canada, October 26, 2005
Rosalie T. Ruegg, Managing Director, TIA Consulting, Inc.
Connie K.N. Chang, Supervisory Economist, Advanced Technology Program, NIST, U.S. Department of Commerce

Slide 2: The 3rd in a Series of Think Tanks on Barriers to Evaluation
- 2003: Identification of six categories of barriers to evaluation
- 2004: Focus on institutional and cultural barriers (feedback loops)
- 2005: Focus on methodological barriers

Slide 3: Review of '03 Think Tank, Main Finding
- Striking commonality of evaluation barriers among programs and across countries

Slide 4: Review of '03 Think Tank, cont'd. Six categories of barriers identified:
1. Institutional/cultural
2. Methodological
3. Resources
4. Communications
5. Measurement/data
6. Conflicting stakeholder agendas

Slide 5: Review of '03 Think Tank, cont'd. Barriers were said to impede:
- Demand for evaluation
- Planning and conducting evaluation
- Understanding of evaluation studies
- Acceptance and interpretation of findings
- Use of results to inform program management, budgetary decisions, and public policy

Slide 6: Six Categories of Barriers Identified
1. Institutional/cultural
2. Methodological (today's focus)
3. Resources
4. Communications
5. Measurement/data
6. Conflicting stakeholder agendas

Slide 7: Methodological Issues Identified in 2003
- Problems in measuring the value of knowledge creation
- Lack of standardization, leading to comparability problems
- Inability to replicate studies
- Difficulties in apportioning benefits across complex contributing factors

Slide 8: Methodological Issues Identified in 2003, cont'd
- Difficulty in defining success against multiple program goals
- Selecting the appropriate method for a given purpose
- Reliability and acceptance of new methods
- Adherence of studies to best practices

Slide 9: Additional Methodological Issues? Potential new issues to consider:
- Problems in measuring commercialization and its economic value
- What are best practices in methodology?
- What constitutes market failure or market imperfection?
- The question of the additionality effect of public funding
- What is the state of the art in evaluating R&D portfolios and collections of R&D portfolios?

Slide 10: Methodological Issues Identified (Today's Focus)
- Problems in measuring the value of knowledge creation
- Lack of standardization, leading to comparability problems
- Inability to replicate studies
- Apportioning benefits to complex contributing factors
- Defining success against multiple program goals
- Selecting the appropriate methods for a given use
- Reliability and acceptance of new methods
- Adherence of studies to best practices

Slide 11: Measuring the Value of Knowledge Creation. How do we define value?
- By the quality or significance of the knowledge created?
- By the economic value of the knowledge created?

Slide 12: Measuring the Value of Knowledge Creation, cont'd. Methodology depends on the definition of value:
- For value in terms of quality: bibliometrics (publication counts adjusted for quality variation among journals, citation analysis, content analysis); research value mapping; network analysis; and third-party recognition through awards for scientific merit
- For value in terms of economics: follow multiple paths of knowledge flow to downstream users, using indicator metrics and historical tracing supported by citation analysis, then benefit-cost analysis with a "snowball" technique
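To make the quality-oriented path concrete, here is a minimal Python sketch of a journal-quality-weighted citation count of the kind implied by "counts with adjustment for quality variation among journals." The publication records and journal weights are hypothetical, not data from the slides or any program.

# Illustrative sketch: a journal-quality-adjusted citation count.
# All publication records and journal weights below are hypothetical.

publications = [
    {"title": "Paper A", "journal": "J. Applied Widgets", "citations": 42},
    {"title": "Paper B", "journal": "Widget Letters",     "citations": 7},
    {"title": "Paper C", "journal": "J. Applied Widgets", "citations": 15},
]

# Quality weights, e.g., journal impact factors normalized to the field mean.
journal_weight = {"J. Applied Widgets": 1.4, "Widget Letters": 0.8}

def quality_adjusted_citations(pubs, weights, default_weight=1.0):
    """Sum citations, weighting each paper by its journal's quality weight."""
    return sum(p["citations"] * weights.get(p["journal"], default_weight) for p in pubs)

raw = sum(p["citations"] for p in publications)
adjusted = quality_adjusted_citations(publications, journal_weight)
print(f"Raw citations: {raw}, quality-adjusted: {adjusted:.1f}")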

Slide 13: Lack of Standardization Leading to Comparability Problems
Standardization to whose standards?
- By program practitioner? By research field? By country?
Standardization of what methodological terms?
- Performance indicators? Rate of return on investment? Net present value? Social vs. public rate of return? Underlying assumptions? Other?
How do we define comparability?
- Across studies? Within a program? Across programs? Across national borders?
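A small numerical sketch of why unstated underlying assumptions undermine comparability, using hypothetical cash flows and two illustrative real discount rates: the same benefit stream yields different net present values under each convention, so studies that do not standardize assumptions are hard to compare.

# Illustrative sketch: the same project benefits evaluated under two
# discount-rate assumptions give different net present values (NPVs).
# All numbers are hypothetical.

def npv(cash_flows, rate):
    """Net present value of year-indexed cash flows (year 0 = today)."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

# Year 0 cost of 10 (say, $M), then five years of benefits of 3 per year.
cash_flows = [-10.0, 3.0, 3.0, 3.0, 3.0, 3.0]

for rate in (0.03, 0.07):  # e.g., two agencies using different real discount rates
    print(f"Discount rate {rate:.0%}: NPV = {npv(cash_flows, rate):.2f}")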

Slide 14: Lack of Standardization Leading to Comparability Problems, cont'd
Some ways to promote standardization for improved comparability:
- Providing a standard format and guidelines for similar studies (e.g., the NAS/NRC matrix and guidance for evaluating DOE EE & FE R&D projects)
- Commissioning clusters of studies to be performed within a common framework (e.g., ATP's cluster studies of component-based software projects, tissue engineering projects, and composite manufacturing technologies)
- Other?
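As one way to picture a "common framework," the sketch below defines a standard record that every study in a cluster would report so that results line up field for field. The field names and values are hypothetical illustrations, not the actual NAS/NRC matrix or ATP templates.

# Illustrative sketch of a common reporting framework for a cluster of studies:
# every study reports the same fields, so results can be compared directly.
# Field names and values are hypothetical.

from dataclasses import dataclass

@dataclass
class StudyRecord:
    study_name: str
    technology_area: str
    real_discount_rate: float      # fraction, e.g., 0.07
    time_horizon_years: int
    net_present_value_musd: float  # $ millions
    benefit_cost_ratio: float
    assumptions_documented: bool

cluster = [
    StudyRecord("Project Alpha", "component-based software", 0.07, 10, 24.0, 3.1, True),
    StudyRecord("Project Beta",  "tissue engineering",       0.07, 10,  8.5, 1.6, True),
]

# Because discount rate and horizon are standardized, the ratios are comparable.
for s in sorted(cluster, key=lambda s: s.benefit_cost_ratio, reverse=True):
    print(f"{s.study_name}: B/C = {s.benefit_cost_ratio}, NPV = ${s.net_present_value_musd}M")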

Slide 15: Ability to Replicate Studies. What do we mean by ability to replicate?
- Simple meaning: like scientists, allow others to run the experiment using the same data to see if the same results are obtained
- Another meaning: re-do studies, testing for robustness by varying key data and assumptions in sensitivity analysis, and possibly refining the approach
- Yet another meaning: confirm prospective benefit estimates with later retrospective cost-benefit studies
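A minimal sketch of the second meaning: re-run the same benefit-cost model while varying key assumptions one at a time, a simple sensitivity analysis. The model and all numbers are hypothetical.

# Illustrative sketch: replication as sensitivity analysis, re-running a
# hypothetical benefit-cost model with key assumptions varied one at a time.

def npv(cash_flows, rate):
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

def benefit_model(annual_benefit, years, upfront_cost, rate):
    return npv([-upfront_cost] + [annual_benefit] * years, rate)

baseline = dict(annual_benefit=3.0, years=5, upfront_cost=10.0, rate=0.07)
print(f"Baseline NPV: {benefit_model(**baseline):.2f}")

# Vary one assumption at a time and observe the swing in the result.
for key, values in {"annual_benefit": (2.0, 4.0), "rate": (0.03, 0.10)}.items():
    for v in values:
        scenario = {**baseline, key: v}
        print(f"{key} = {v}: NPV = {benefit_model(**scenario):.2f}")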

Slide 16: Ability to Replicate Studies, cont'd
- The key is to avoid a "black box"
- Transparency and clarity of approach are essential

Slide 17: Summary. Six categories of barriers identified:
1. Institutional/cultural
2. Methodological
3. Resources
4. Communications
5. Measurement/data
6. Conflicting stakeholder agendas

Slide 18: Contact Information
Rosalie T. Ruegg, Managing Director, TIA Consulting, Inc.
Connie K.N. Chang, Supervisory Economist, Advanced Technology Program