UNICA WORKING GROUPS ON RESEARCH EVALUATION AND ADMINISTRATIVE EXCELLENCE

Presentation transcript:

1 UNICA WORKING GROUPS ON RESEARCH EVALUATION AND ADMINISTRATIVE EXCELLENCE
Prof. Véronique Halloin, Secretary General of the Fund for Scientific Research – FNRS
Prof. Philippe Vincke, Rector of ULB (Université Libre de Bruxelles)
Prof. Gregory Prastacos, Rector of the Athens University of Economics and Business
Prof. Stavros A. Zenios, Rector of the University of Cyprus
UNICA General Assembly, Paris, 6 November 2009

2 1. CONTEXT
- Evolution of higher education: evaluations, comparisons, rankings
- Existing rankings are not scientific: it is impossible to reconstruct and verify their results; the choice of criteria and indicators is questionable; the quality and validity of the data are uncertain
- Rankings need a multi-dimensional framework
- UNICA Rectors' Seminar 2009 (Dubrovnik): complexity of evaluation in research and education
- At European level: an expert group on the assessment of university-based research (09/08 – 09/09); the Commission's university ranking project (CHERPA)
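The objection that existing league tables cannot be reconstructed can be made concrete with a toy calculation. The following Python sketch uses invented, normalised indicator scores and two equally plausible weighting schemes (neither taken from any real ranking) to show that a composite score can reverse the order of two institutions depending on the weights chosen; this sensitivity is what motivates the call for a multi-dimensional framework.

```python
# Hypothetical illustration (not UNICA or CHERPA data): a composite ranking
# score is a weighted sum of indicator values, so the final order depends
# entirely on the weights chosen by the ranking's authors.

indicators = ["publications", "citations", "teaching", "internationalisation"]

# Invented, normalised scores (0-1) for two fictitious universities.
scores = {
    "University A": [0.9, 0.8, 0.5, 0.4],
    "University B": [0.6, 0.6, 0.9, 0.8],
}

def composite(values, weights):
    """Weighted sum of indicator values, as used by most league tables."""
    return sum(v * w for v, w in zip(values, weights))

# Two equally defensible weighting schemes...
research_heavy = [0.4, 0.4, 0.1, 0.1]
balanced = [0.25, 0.25, 0.25, 0.25]

for label, weights in [("research-heavy", research_heavy), ("balanced", balanced)]:
    ranked = sorted(scores, key=lambda u: composite(scores[u], weights), reverse=True)
    print(f"{label} weights -> ranking: {ranked}")

# ...produce opposite orderings: without the raw data and the weights,
# the published result cannot be reconstructed or verified.
```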

3 2. UNICA WORKING GROUPS
Interest in developing a methodology accepted by UNICA members.
Objectives of the working groups:
- to pool resources and learn from each other about best practices in the context of benchmarking activities
- to develop a methodology accepted by UNICA members, based on expertise sharing, complementarity and harmonisation of databases
- to address the evaluation of 1) research, 2) administration, 3) learning
Missions of the working groups:
- WG1: to provide the UNICA network with a toolbox of research evaluation methodologies adapted to the different "cases" identified as interesting by UNICA members
- WG2: to establish a UNICA Observatory on Administrative Excellence

4 3. WG1 ON RESEARCH EVALUATION: COMPOSITION
- Co-chaired by V. Halloin and Ph. Vincke
- 10 to 12 members from different universities of the UNICA network (balance between countries, sizes, current rankings, ...)
- Meetings every 6 months
- Small ad-hoc sub-groups for practical cases (see step 2 below)

5 4. WG1 ON RESEARCH EVALUATION: FIRST STEP
Three main preliminary questions must be addressed before developing any research evaluation procedure:
1. List the possible objects to evaluate: projects; researchers; teams, faculties, centres; ...
2. List the possible goals of an evaluation: typology; ranking; SWOT analysis; ...
3. List the possible users of an evaluation: rectors; researchers; governments and funding agencies; media, the public, ...
One case is defined by a combination of answers to each of the 3 questions. The aim is to focus on some 10 priority cases and to construct ad-hoc sub-groups (one per case?).
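As a purely illustrative aid, the combinatorial reading of "one case = one combination of answers to the three questions" can be sketched in Python. Only the three lists come from the slide; the class name, the enumeration, and the two example priority cases are hypothetical.

```python
# Minimal sketch of the "case" idea: a case is one combination of
# (object to evaluate, goal of the evaluation, user of the evaluation).
from dataclasses import dataclass
from itertools import product

objects = ["project", "researcher", "team/faculty/centre"]
goals = ["typology", "ranking", "SWOT analysis"]
users = ["rectors", "researchers", "governments/funding agencies", "media/public"]

@dataclass(frozen=True)
class Case:
    obj: str
    goal: str
    user: str

# Enumerate the full case space, then pick a handful of priority cases.
all_cases = [Case(o, g, u) for o, g, u in product(objects, goals, users)]
print(f"{len(all_cases)} possible cases in this toy enumeration")

# Invented examples of priority cases, each of which would get an
# ad-hoc sub-group:
priority = [
    Case("team/faculty/centre", "SWOT analysis", "rectors"),
    Case("researcher", "ranking", "governments/funding agencies"),
]
```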

6 5. WG1 ON RESEARCH EVALUATION: SECOND STEP
Aspects to be studied for each case by the ad-hoc sub-groups:
1. Analysis of any existing tools (benchmark study)
2. List of pertinent indicators: bibliometric indicators, peer review, ...
3. Analysis of the accessibility and quality of the data needed: Web of Science, Scopus, institutional repositories, ...
4. Proposal of a methodology: development of a toolbox
5. Illustration on some concrete cases (restricted at first to the universities of the WG members)
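To make "bibliometric indicators" concrete, here is a minimal, hedged example of one such indicator, the h-index, computed from per-publication citation counts. The function and the sample counts are illustrative only; real counts would come from sources such as Web of Science, Scopus or institutional repositories, as the slide notes.

```python
def h_index(citations):
    """Largest h such that h publications have at least h citations each."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for i, c in enumerate(ranked, start=1):
        if c >= i:
            h = i
        else:
            break
    return h

# Invented citation counts for one researcher's six publications.
print(h_index([25, 8, 5, 4, 3, 1]))  # -> 4: four papers have at least 4 citations
```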

7 6. WG1 ON RESEARCH EVALUATION: DEADLINES
1. COMPOSITION of the working group → mid-November
2. FIRST STEP: answers to the 3 questions and definition of the first 10 cases → end of February
3. SUB-GROUPS: constitution → end of March
4. SECOND STEP: survey of the literature, list of indicators, first elements of results for some of the 10 cases → end of June 2010

8 7. LINK WITH CHERPA
CHERPA is a European network charged with developing a ranking system that overcomes the limitations of the Shanghai and Times rankings.
7 partners: CHE (Germany), CHEPS (Twente), CWTS (Leiden), KUL, OST (Paris), and 2 European federations/foundations.
Objective: to develop and test an instrument that
- overcomes the methodological problems of existing rankings (→ multidimensional)
- enables field-based rankings (not only institutional rankings)
- enables a wide range of HE institutions and programmes to find their place (→ stakeholder-driven, with a set of appropriate indicators)
2 pilot fields: Business and Engineering.
Diversity (the level of variety of entities) is taken into account:
- U-MAP (horizontal diversity): a mapping of institutional profiles based on 6 dimensions (educational profile, student profile, research involvement, knowledge exchange, international orientation, regional engagement)
- U-MULTIRANK (vertical diversity): performance indicators (under discussion)
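A rough sketch of what "multidimensional" means in the U-Map / U-Multirank context, as opposed to a single composite score: each institution is assigned a performance band per dimension rather than one overall rank. The dimension names follow the slide; the banding thresholds, the grouping function, and the institutional values are invented for illustration.

```python
# Hedged sketch of a multidimensional profile: no aggregation across
# dimensions, just a band per dimension for each institution.
DIMENSIONS = ["educational profile", "student profile", "research involvement",
              "knowledge exchange", "international orientation",
              "regional engagement"]

def band(value, low=0.33, high=0.66):
    """Map a normalised indicator value (0-1) to a performance band."""
    if value >= high:
        return "top"
    if value >= low:
        return "middle"
    return "bottom"

# Fictitious normalised values per dimension for two institutions.
profiles = {
    "Institution X": [0.9, 0.4, 0.8, 0.3, 0.7, 0.2],
    "Institution Y": [0.5, 0.7, 0.3, 0.8, 0.4, 0.9],
}

for name, values in profiles.items():
    bands = {d: band(v) for d, v in zip(DIMENSIONS, values)}
    print(name, bands)

# The output is a profile per institution, not a single ordered league
# table, which is what "multidimensional" means in this context.
```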

9 U-MAP