

1 In-depth Evaluation of R&D Programs – how could it be accountable? Seung Jun Yoo, Ph.D., R&D Evaluation Center, KISTEP, KOREA. Symposium on International Comparison of the Budget Cycle in Research Development and Innovation Policies, Madrid (Spain), 3–4 July 2008. OECD/GOV/PGC/SBO

2 Contents 1. Overview of Current Evaluation 2. Architecture of In-depth Evaluation 3. Procedure of In-depth Evaluation 4. Evaluation with Accountability 5. Challenges and Discussion

3 Overview of Current Evaluation 1 - All 191 R&D programs are evaluated every year! - specific evaluation (mainly using checklists): 27 programs - self/meta evaluation: 164 programs - in-depth evaluation (pilot run on 4 horizontal programs): climate-change-related R&D programs; university centers of excellence R&D programs; infrastructure (facilities/equipment) R&D programs; genome research R&D programs

4 Overview of Current Evaluation 2 - Efficiency & effectiveness of evaluation? - Is evaluating 191 programs every year sensible? - The efficiency and effectiveness of the evaluation itself is questionable, considering the characteristics of R&D programs - Too heavy an evaluation load on evaluators, program managers, researchers, etc. - Not enough time to prepare and perform evaluation for all R&D programs and to communicate with stakeholders (which might yield poor accountability)

5 Architecture of In-depth Evaluation 1 - Main Players - NSTC (National Science & Technology Council): decision maker for R&D evaluation and budget allocation - MOSF and the line ministries (MEST, MOEMIK, MIFAFF, MW, ……) and their agencies: run the R&D programs of each ministry - KISTEP: evaluators, supported by evaluation supporting groups

6 Architecture of In-depth Evaluation 2 - Evaluation & Budget Allocation - R&D budget → survey/analysis → programs/projects implemented - An evaluation group is formed for the in-depth evaluation of selected programs - Evaluation results are fed back to (re)plan and/or improve the program, and serve as input for budget allocation

7 Architecture of In-depth Evaluation 3 - Budget Process - 5-year plan → ministry budget ceiling → 1st budget review → program budget → 2nd budget review with evaluation results - Actors: NSTC, Ministry of Strategy & Finance (MOSF), and the Budget Committee of the National Assembly (Dec.)

8 Procedure of In-depth Evaluation 1 - 7-month schedule (suggested!) - Program(s) selected by a selection committee based on special issues, etc. (month 0) - In-depth evaluation procedure for the selected program(s): - month 1: form evaluation group, gather program data, study the target R&D program(s), find major evaluation points - month 2: develop logic model (with system dynamics, etc.) - months 3/4: perform in-depth analysis (relevance, efficiency, effectiveness, program design & delivery, etc.)

9 Procedure of In-depth Evaluation 2 - month 5: interviews (researchers, program managers, etc.) - month 6: report interim evaluation results (to MOSF and department(s)) - month 7: report final evaluation results & recommendations
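The suggested 7-month procedure above can be sketched as a simple data structure (a hypothetical illustration; only the month labels and activities are taken from the slides, the code itself is not part of the KISTEP process):

```python
# Hypothetical sketch of the suggested 7-month in-depth evaluation schedule.
# Month 0 is the selection step; months 1-7 are the evaluation proper.
SCHEDULE = {
    0: ["select program(s) via selection committee"],
    1: ["form evaluation group", "gather program data",
        "study target R&D program(s)", "find major evaluation points"],
    2: ["develop logic model (with system dynamics, etc.)"],
    3: ["in-depth analysis: relevance, efficiency, effectiveness, design & delivery"],
    4: ["in-depth analysis (continued)"],
    5: ["interviews with researchers, program managers, etc."],
    6: ["report interim evaluation results (MOSF, department(s))"],
    7: ["report final evaluation results & recommendations"],
}

def activities_through(month):
    """Return all activities scheduled up to and including the given month."""
    return [a for m in sorted(SCHEDULE) if m <= month for a in SCHEDULE[m]]
```

Laying the plan out this way makes the point on the next slides concrete: the schedule front-loads understanding the program (months 0–2) before any judgement is reported, which is what gives program managers time to engage with the process.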

10 Evaluation with Accountability 1 - Responsibility 1 - A balance between quantitative and qualitative evaluation is important - a systematic approach to qualitative evaluation is challenging - program goals vs. projects implemented vs. outputs - Enough time for evaluation is essential (hence the 7-month schedule) - to achieve the goal of evaluation with accountability - to give program managers enough time to cope with the evaluation process - As a result, this approach is suitable only for a limited number of programs

11 Evaluation with Accountability 2 - Responsibility 2 - Qualitative assessment is needed to achieve the purpose of evaluation - simply counting publications and patents is not enough - publications: impact factor (1–2 yrs), citation index (more than 3 yrs) - patents: technology value evaluation for commercial purposes - Selected projects with excellent performance: consistent funding is required regardless of the program evaluation result!
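The distinction drawn above between short-window metrics (impact factor, 1–2 years) and longer-window metrics (citation index, 3+ years) can be illustrated with a minimal sketch; the windowing logic below is a simplified assumption for illustration, not an official bibliometric definition:

```python
def citation_window_counts(citations_by_year, pub_year):
    """Split a paper's citations into a short window (years 1-2 after
    publication, roughly what an impact-factor-style metric sees) and a
    longer window (year 3 onward, closer to a cumulative citation index)."""
    short = sum(n for y, n in citations_by_year.items() if 1 <= y - pub_year <= 2)
    longer = sum(n for y, n in citations_by_year.items() if y - pub_year >= 3)
    return short, longer

# A paper published in 2008, with (invented) citation counts per year:
short, longer = citation_window_counts({2009: 2, 2010: 5, 2012: 8}, 2008)
```

The sketch makes the slide's warning concrete: a program judged only on the short window misses results whose citations arrive from year 3 onward, which is why evaluating R&D output purely by early publication counts is misleading.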

12 Evaluation with Accountability 3 - Acceptability 1 - Understand the characteristics of the program well, and share that understanding with stakeholders - Performance indicators are useful tools for getting stakeholders' agreement - researchers, program managers, MOSF, etc. - to set up an evaluation strategy and evaluation points - especially important for acceptability and for improving program delivery
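A minimal sketch of how agreed performance indicators might be recorded and checked against targets (the indicator names and numbers are invented for illustration; the slides do not prescribe a particular format):

```python
from dataclasses import dataclass

@dataclass
class Indicator:
    """A performance indicator agreed with stakeholders up front."""
    name: str
    target: float
    actual: float

    def achievement_rate(self) -> float:
        return self.actual / self.target

# Hypothetical indicators agreed between evaluators and the program ministry:
indicators = [
    Indicator("SCI publications", target=40, actual=46),
    Indicator("patents filed", target=10, actual=8),
]
met = [i.name for i in indicators if i.achievement_rate() >= 1.0]
```

Writing the targets down before the evaluation starts is the acceptability point: when researchers, program managers, and MOSF have agreed on the indicators in advance, the result is harder to dispute afterwards.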

13 Evaluation with Accountability 4 - Acceptability 2 (Understand & Change!) - Communication with stakeholders - interviews with stakeholders (researchers, program managers, MOSF) are important to increase accountability - the evaluation strategy is better shared at the beginning - The number of interviews is also important - lack of understanding of the evaluation is a key inhibitor of accountability! - interview at major steps: evaluation strategy, survey of the program's weak/strong points, interim evaluation results, etc.

14 Challenges and Discussion 1 - Understand & Change & Improve! - Stakeholders should understand their program(s) - otherwise they become rigid and overly defensive about keeping things unchanged - A systematic way to understand diverse aspects of programs is needed: goal, contents, projects, design & delivery, etc. - Program information should be shared with all stakeholders so they can change and improve

15 Challenges and Discussion 2 - Scientific and Socio-economic Interest - Technology impact evaluation supports socio-economic understanding - Results of technology level evaluation are also useful

16 Challenges and Discussion 3 - From Communication to Consultation - Communication among stakeholders (ministry/agency, researchers, MOSF, KISTEP, etc.) - For better evaluation practices, communication should be transformed into consultation

17 Muchas gracias! (Thank you very much!)