What is the Current State of Evaluation Policy and Methodologies for Monitoring Program Performance? OR GPRA & PART: Through a Glass Darkly (Irwin Feller)

Presentation transcript:

What is the Current State of Evaluation Policy and Methodologies for Monitoring Program Performance? OR GPRA & PART: Through a Glass Darkly
Irwin Feller, Senior Visiting Scientist, American Association for the Advancement of Science
WREN Workshop, Washington, DC, June 6, 2008

Evaluation Modalities
Assessing merit and worth
- Focus: support of judgment about value
- Typical mode of inquiry: causal analysis and values inquiry
Program and organizational improvement
- Focus: enhancement of program services
- Typical mode of inquiry: description, with timely observation and feedback
Oversight and compliance
- Focus: compliance with formal expectations
- Typical mode of inquiry: description, including program activities and outcomes
Knowledge development
- Focus: generation or testing of social science theory
- Typical mode of inquiry: classification and causal analysis
Adapted from Mark, M. M., G. T. Henry, & G. Julnes (2000). Evaluation: An Integrated Framework for Understanding, Guiding, and Improving Public and Nonprofit Policies and Programs. San Francisco, CA: Jossey-Bass.

“Here Comes Performance Assessment - and It Might Even Be Good for You” (Behn, 1994)
- Having objectives (“knowing where you want to go”) is helpful.
- Objectives provide a useful baseline for assessing each of the four modalities of accountability: finance, equity, use of power, and performance.
- Well-defined objectives and documentation of results facilitate communication with funders, performers, users, and others.
- R&D managers ought to devise their own performance measures.

R&D Investment Criteria
- Consistent with the President’s priorities.
- Focus on activities that require a federal presence to attain national goals.
- Maximize the quality of the research process and the efficiency of public R&D programs through the use of competitive, merit-based processes where appropriate (exceptions must be justified).
- Reduce or eliminate funding for programs that have completed their mission or that are redundant or obsolete.

Research Agency Officials Question White House’s Review of Basic Science
“The cure for cancer can’t be compared to the delivery of a FedEx package, but right now it’s being put in the same mold.”
“Science can’t tell you what the results and outcomes will be in the time frame they want.”
(D. Duran, quoted in the Chronicle of Higher Education, 12/19/2003, p. A25)

FY 2008 R&D Investment Budget Priorities
In general, the Administration favors federal R&D investments that:
- Advance fundamental scientific discovery;
- Support high-leverage basic research to spur technological innovation, economic competitiveness, and new job growth;
- Maximize the efficiency and effectiveness of the S&T enterprise through expansion of competitive, merit-based peer review processes and the phase-out of programs that are only marginally productive or are not important to an agency’s mission.

PART: Empirical Estimates
- Effects on small programs may be large: as high as a 20% change over the previous year’s budget.
- Weighted by program size, the effect is 3%.
- Findings relate to all surveyed programs, not specifically to R&D programs. (Olsen and Levy, 2004)
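
To see why the unweighted and size-weighted estimates diverge, here is a minimal sketch of a budget-weighted average, using hypothetical program sizes and changes (illustrative figures only, not Olsen and Levy’s data):

```python
# Hypothetical program budgets (in $M) and year-over-year budget changes (%).
# Small programs can show large swings; the weighted aggregate stays small.
programs = [
    {"name": "small program A", "budget": 10.0, "change_pct": 20.0},
    {"name": "small program B", "budget": 15.0, "change_pct": 18.0},
    {"name": "large program C", "budget": 2000.0, "change_pct": 1.0},
]

# Unweighted mean: dominated by the small programs' large swings.
unweighted = sum(p["change_pct"] for p in programs) / len(programs)

# Budget-weighted mean: each change is weighted by program size, so the
# large program pulls the aggregate effect down to a few percent.
total_budget = sum(p["budget"] for p in programs)
weighted = sum(p["change_pct"] * p["budget"] for p in programs) / total_budget

print(f"unweighted mean change: {unweighted:.1f}%")     # 13.0%
print(f"budget-weighted mean change: {weighted:.1f}%")  # 1.2%
```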

R&D Performance Metrics (for Basic Research)?
Justifying its recommendation that the U.S. act to expand its investments in particle physics research: “The committee affirms the intrinsic value of elementary particle physics as part of the broader scientific and technological enterprise and identifies it as a key priority within the physical sciences.” (National Research Council, Revealing the Hidden Nature of Space and Time)

Evaluating the Evaluators
“I know nothing of the licenser, but that I have his own hand here for his arrogance; who shall warrant me his judgment?” (Milton, Areopagitica, 1644)

How to Evaluate:
- Input additionality?
- Output additionality?
- Behavioral additionality?

Counterfactual History
What would the national S&T enterprise have looked like, or how would it have performed, between FY1994 and FY2008 in the absence of GPRA and/or PART?

Input Additionality?
What impacts have GPRA/PART had on:
- the total level of federal R&D?
- the allocation of R&D across functional fields/programs?
- the allocation of R&D across agencies?

Dominant Trends in Executive Budget Proposals for Science and Technology
- Initial increases, then a steady five-year decline in real terms (FY2009 down 9.1% from FY2004; AAAS); a note on “real terms” follows below
- Strong support for basic research, with discontinuous priority shifts between the biomedical and physical sciences: the NIH “roller coaster” and COMPETES (NSF, NIST, DOE Office of Science)
- Erosion of funding for environmental/climate/“regulatory”-related research across agencies
- Longstanding, manifest antipathy to civilian technology programs (ATP/MEP)
- Strong support for competitive, merit-based allocations
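
“In real terms” means nominal budgets are deflated to constant dollars before comparing years. A minimal sketch of that computation, using hypothetical nominal budgets and deflator values (not the actual AAAS series):

```python
# Hypothetical nominal R&D budgets ($B) and a price deflator index.
# Real (constant-dollar) values divide nominal dollars by the deflator.
nominal = {"FY2004": 126.0, "FY2009": 131.0}    # hypothetical
deflator = {"FY2004": 100.0, "FY2009": 114.5}   # hypothetical, FY2004 = 100

real = {fy: nominal[fy] / (deflator[fy] / 100.0) for fy in nominal}
pct_change = (real["FY2009"] - real["FY2004"]) / real["FY2004"] * 100.0

# A nominal increase can still be a real-terms decline once inflation
# is taken into account.
print(f"nominal change: {(131.0 - 126.0) / 126.0 * 100.0:+.1f}%")  # +4.0%
print(f"real-terms change: {pct_change:+.1f}%")                    # -9.2%
```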

Output Additionality?
What effects have GPRA/PART had on the outputs generated from federally funded R&D with respect to:
- R&D priorities (feedback loops)?
- R&D strategies?
- effectiveness/efficiency?
- choice of performers?

R&D Performance Metrics (ExpectMore.gov)
- DOE hydrogen fuel cell: density of hydrogen storage state technologies, in weight percent
- NIH AIDS research: (deliverable) vaccine by 2010
- NSF nanotechnology research: knowledge base

Behavioral Additionality?
What effects have GPRA/PART had on:
- agency R&D priorities?
- R&D management strategies?
- interactions with OMB?
- relationships with performers?
- relationships with Congress, other intra-agency units, “stakeholders,” and constituencies?

Evaluation in the PART Process
Evaluation (evidence that the program has independent and quality evaluations indicating effectiveness and achievement of results) constitutes only one portion of the PART score.
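
For context: PART combined four weighted sections, with weights commonly cited from OMB guidance as purpose and design 20%, strategic planning 10%, program management 20%, and results/accountability 50%; evaluation evidence feeds mostly into the last of these. A minimal sketch of how such a weighted score combines, with hypothetical section scores:

```python
# PART section weights as commonly cited in OMB guidance; the section
# scores below are hypothetical, for illustration only.
weights = {
    "purpose_and_design": 0.20,
    "strategic_planning": 0.10,
    "program_management": 0.20,
    "results_accountability": 0.50,  # where evaluation evidence mostly counts
}
scores = {  # hypothetical section scores on a 0-100 scale
    "purpose_and_design": 80,
    "strategic_planning": 70,
    "program_management": 90,
    "results_accountability": 40,
}

# Strong management cannot offset weak results: the 50% results weight
# dominates the overall score.
overall = sum(weights[s] * scores[s] for s in weights)
print(f"overall PART score: {overall:.0f} / 100")  # 61
```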

Feller’s (Revised) Theorem on the Use of Evaluation/Evidence in Policy Making
Tolstoy: “Doing good will not make you happy, but doing bad will surely make you unhappy.”
Feller: “A good (well-done) evaluation showing bad (program) results may or may not kill a program. A good (well-done) evaluation showing good (program) results may or may not save a program.”

PART’s Method(s) for Evaluating R&D Programs
- Extensive reliance on National Research Council “expert” assessments of basic research programs
- Familiarity with relevant published evaluations
- Few self-initiated independent assessments or methodological innovations