National Institute of Standards and Technology U.S. Department of Commerce Technology Program Evaluation: Methodologies from the Advanced Technology Program.


Technology Program Evaluation: Methodologies from the Advanced Technology Program
Richard N. Spivack and Robert Sienkiewicz
Impact Analysis Office, Technology Innovation Program, National Institute of Standards and Technology
Collaborative Expedition Workshop, National Science Foundation, March 18, 2008

ATP Mission
To accelerate the development of innovative technologies for broad national benefit through partnerships with the private sector.

Economic Rationale for ATP
Addresses market failure and market imperfections in early-stage, generic technologies:
–Due to high technical risk, uncertainty about the outcome of R&D activity, long expected time to commercialization, and uncertain potential markets.
–Also due to spillover effects and the innovator's inability to fully appropriate returns on investment.
–Together, these create a disincentive for the private sector to invest in technologies that could benefit society as a whole.

Need to Measure Success
“Scientists do, of course, make judgments all the time about promising lines of research … It makes sense for the world’s largest sponsor of research, the U.S. government, to want to make such choices as wisely as the most productive scientists do … But is it possible to decide rationally when to enhance or to terminate a project if we do not possess a way of measuring its success?”
John Marburger, President’s Science Advisor, keynote speaker at the American Association for the Advancement of Science’s 27th Annual Colloquium on Science and Technology Policy, 2002

Evaluation is Important for Public Programs
An agency should demonstrate that it has:
–a plan and the organizational capacity to assess its R&D activities;
–an explicit set of performance metrics;
–an organizational commitment to evaluate its performance; and
–a record of credible results.
These elements are important in annual budget reviews by the executive and legislative branches in the United States, as well as in other countries where performance management is a priority.

Evaluation Began Early
ATP evaluation activities began early, before GPRA (1993) and before the President’s Management Agenda (2001).
The Economic Assessment Office (EAO) is charged with measuring the impact of the program:
–staff of 16 economists, statisticians, and information specialists, plus contractors.
The ATP budget for evaluation has grown from $25K to $2-5M per year.
EAO uses multiple methods to measure direct and indirect impacts of project funding.

Multiple Approaches to Evaluation
Statistical profiling of applicants, projects, participants, and technologies.
Real-time monitoring of research and business activities:
–ATP project teams
–technical and business progress reports
Progress measures collected through:
–surveys
–the Business Reporting System
–other databases
Statistical and econometric analysis of:
–research productivity
–collaboration effects
Developing and testing new tools and methods.

Multiple Approaches to Evaluation, cont’d
Microeconomic case studies of projects or clusters of projects:
–benefit and cost estimates
–private, public, and social return on investment
–follow-on macroeconomic impact studies
Policy and special-issue studies:
–early-stage technology funding
–regional effects
Status reports on completed projects:
–mini case studies
–star ratings for portfolio analysis
–cross-cutting analyses
Expert reviews:
–ATP Advisory Committee
–National Research Council
Comparisons with foreign counterpart programs.

Evaluation Activities Tied to Timing of Expected Results
Short and mid-term:
–Ex ante peer review for project selection
–Survey tools to monitor project progress
–Performance measures
–Expert reviews
–Portfolio-wide analysis: status reports of completed projects that rate progress against the ATP mission
–Post-project surveys and data analyses
Longer term:
–In-depth and cluster case studies of return on investment: net present value; benefit-cost ratio; internal rate of return (social, public, private)
–Econometric analysis
–Macroeconomic analysis

Portfolio-wide Analysis
Status reports of completed projects are descriptive mini-case studies that together provide a portfolio analysis of project performance.
Each project receives a rating between 0 and 4 stars on how well it met mission objectives:
–overall project performance = knowledge creation and dissemination + commercialization progress and diffusion + future outlook.
Aggregating the stars provides a portfolio-level view of program success.
Patent trees show knowledge spillovers.
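The aggregation of per-project star ratings into a portfolio view can be sketched as follows. The component names follow the slide, but the 0-4 scale per component, the averaging rule, and the half-star rounding are hypothetical illustrations; the actual ATP scoring rubric is not given in this deck.

```python
from collections import Counter
from dataclasses import dataclass

@dataclass
class ProjectRating:
    # Component names from the slide; 0-4 scale per component is an assumption.
    knowledge: float          # knowledge creation and dissemination
    commercialization: float  # commercialization progress and diffusion
    outlook: float            # future outlook

    def overall_stars(self) -> float:
        # Slide: overall = knowledge + commercialization + future outlook.
        # Here we average the components back onto a 0-4 scale and round
        # to the nearest half star (a hypothetical convention).
        mean = (self.knowledge + self.commercialization + self.outlook) / 3
        return round(mean * 2) / 2

def portfolio_distribution(projects):
    """Aggregate individual star ratings into a portfolio-level histogram."""
    return Counter(p.overall_stars() for p in projects)

portfolio = [ProjectRating(4, 3, 4), ProjectRating(2, 1, 2), ProjectRating(3, 3, 3)]
print(portfolio_distribution(portfolio))  # distribution of star ratings across the portfolio
```

A histogram like this is one way the "aggregation of stars" could summarize program-level success while preserving the spread between strong and weak projects.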

Benefit-Cost Studies
A microeconomic analytical framework using a multidisciplinary approach:
–technology assessment; industry structure; competitive dynamics; microeconomic analysis; and corporate finance measures.
Uses widely accepted economic reasoning to develop estimates of the impacts of program funding on project timing and success, based on:
–spillovers; counterfactuals; sensitivity analysis; valuation metrics.
Calculates net present values, internal rates of return, and benefit-cost ratios.
Aggregation of project impact studies points to an 8:1 return on program investments of $2.1B.
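The three valuation metrics named on the slide can be illustrated with a minimal sketch. The cash flows and the 7% discount rate below are invented for illustration only, not drawn from any ATP study, and the bisection IRR solver is a generic textbook method, not ATP's actual tooling.

```python
def npv(rate, cash_flows):
    """Net present value of annual cash flows, starting at year 0."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

def benefit_cost_ratio(rate, benefits, costs):
    """Ratio of discounted benefits to discounted costs."""
    return npv(rate, benefits) / npv(rate, costs)

def irr(cash_flows, lo=-0.99, hi=10.0, tol=1e-9):
    """Internal rate of return by bisection.

    Assumes a single sign change (up-front cost, later benefits), so
    NPV is decreasing in the rate over the bracket.
    """
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if npv(mid, cash_flows) > 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

# Hypothetical project: $100 up-front public cost, benefits over four years.
costs = [100, 0, 0, 0, 0]
benefits = [0, 40, 60, 80, 100]
rate = 0.07  # illustrative real discount rate

net = [b - c for b, c in zip(benefits, costs)]
print(f"NPV = {npv(rate, net):.1f}")
print(f"BCR = {benefit_cost_ratio(rate, benefits, costs):.2f}")
print(f"IRR = {irr(net):.1%}")
```

In an actual benefit-cost study these flows would come from the counterfactual and spillover estimates described above, and the sensitivity analysis would repeat the calculation across a range of discount rates and scenarios.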

Evaluation Results Are Used for Program Management
Fully integrated program evaluation:
–Design
–Implementation
–Assessment
–Learning and feedback
Result: continuous program improvement in
–project selection
–project management
–outreach

Additional Uses for Evaluation Results
–Meet external stakeholder requests for program results and official requirements;
–Gain new insights into key relationships;
–Improve understanding of ATP’s contribution to the U.S. innovation system;
–Document performance; and
–Promote and maintain external support.

Evaluation Best Practices
–Management and institutional commitment to performance evaluation.
–A dedicated and appropriate mix of expert staff.
–A committed and steady budget.
–A multi-faceted approach to evaluation.
–Commissioning external studies with experts.

Evaluation Best Practices, cont’d
–Matching assessment methods to the questions posed.
–Systematic data collection and regular reporting systems.
–Development and testing of innovative methods to evaluate impact.
–Examination of both successful and unsuccessful projects.
–Strategic communication of results.

Continuing Challenges for Program Evaluation
–Managing change: data continuity versus evolving information needs.
–Driving survey redesign through key output measures.
–Balancing program needs with broader economic research objectives.
–Balancing stories and case studies with portfolio analysis.
–Effective internal feedback mechanisms for project management and program design.
–Benchmarking to other programs and industry.
–Knowledge management.

Final Thoughts
“The simple act of measuring and reporting on results will itself promote improvements.”
“Too much monitoring can cripple efforts in the early stage; too little and the value will be lost.”
Wholey, Hatry, and Newcomer, Handbook of Practical Program Evaluation, pp. 102, 110

Contact Information
Richard N. Spivack, PhD
Technology Innovation Program, National Institute of Standards and Technology, U.S. Department of Commerce
phone: (301) fax: (301)
Robert Sienkiewicz, PhD
Technology Innovation Program, National Institute of Standards and Technology, U.S. Department of Commerce
phone: (301) fax: (301)