UNICA WORKING GROUPS ON RESEARCH EVALUATION AND ADMINISTRATIVE EXCELLENCE
Prof. Véronique Halloin, Secretary General of the Fund for Scientific Research – FNRS
Prof. Philippe Vincke, Rector of ULB (Université Libre de Bruxelles)
Prof. Gregory Prastacos, Rector of Athens University of Economics and Business
Prof. Stavros A. Zenios, Rector of University of Cyprus
UNICA General Assembly, Paris, 6 November 2009
1. CONTEXT
- Evolution of higher education
- Evaluations, comparisons, rankings
- Existing rankings are not scientific:
  - impossible to reconstruct and verify the results
  - questionable choice of criteria and indicators
  - questionable quality and validity of data
- Rankings need a multi-dimensional framework
- UNICA Rectors' Seminar 2009 (Dubrovnik): complexity of evaluation in research and education
- At the European level:
  - an expert group on assessment of university-based research (09/08 – 09/09)
  - the Commission's university ranking project (CHERPA)
2. UNICA WORKING GROUPS
- UNICA's interest in developing a methodology accepted by its members
- Objectives of the working groups:
  - to pool resources in learning from each other about best practices in the context of benchmarking activities
  - to develop a methodology accepted by UNICA members, based on expertise sharing, complementarity, and harmonization of databases
  - to address the evaluation of 1) research, 2) administration, and 3) learning
- Missions of the working groups:
  - WG1: to provide the UNICA network with a toolbox of research evaluation methodologies adapted to the different "cases" identified as interesting by UNICA members
  - WG2: to establish a UNICA Observatory on Administrative Excellence
3. WG1 ON RESEARCH EVALUATION: COMPOSITION
- Co-chaired by V. Halloin and Ph. Vincke
- 10 to 12 members from different universities of the UNICA network (balanced across countries, sizes, current rankings, ...)
- Meetings every 6 months
- Small ad-hoc subgroups for practical cases (see step 2 below)
4. WG1 ON RESEARCH EVALUATION: FIRST STEP
Three main preliminary questions must be settled before developing any research evaluation procedure:
1. List the possible objects to evaluate:
   - projects
   - researchers
   - teams, faculties, centres, ...
2. List the possible goals of an evaluation:
   - typology
   - ranking
   - SWOT analysis
   - ...
3. List the possible users of an evaluation:
   - rectors
   - researchers
   - governments, funding agencies
   - media, the public, ...
A case is defined by one combination of answers to the three questions (see the sketch below). The WG will focus on some 10 priority cases and construct ad-hoc sub-groups (one per case?).
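As a minimal illustrative sketch (not part of the WG's material), the space of candidate cases can be modelled as the Cartesian product of the three answer lists. The list entries below come from the slides, but the slides mark each list as open-ended, so this enumeration is an assumption for illustration only:

```python
from itertools import product

# Answer lists taken from the slides; the real lists are open-ended ("...").
objects = ["project", "researcher", "team/faculty/centre"]
goals = ["typology", "ranking", "SWOT analysis"]
users = ["rectors", "researchers", "governments/funding agencies", "media/public"]

# A "case" is one combination of an object, a goal, and a user.
cases = list(product(objects, goals, users))
print(len(cases))  # 36 combinations here; the WG would prioritize some 10 of them
```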
5. WG1 ON RESEARCH EVALUATION: SECOND STEP
Aspects to be studied for each case by the ad-hoc sub-groups:
1. Analysis of any existing tools (benchmark study)
2. List of pertinent indicators (see the sketch below):
   - bibliometric indicators
   - peer review
   - ...
3. Analysis of the accessibility and quality of the needed data:
   - Web of Science, Scopus, ...
   - institutional repositories
   - ...
4. Proposal of a methodology: development of a toolbox
5. Illustration on some concrete cases (restricted at first to the universities of the WG members)
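As one concrete example of a bibliometric indicator a sub-group might examine, the sketch below computes the standard h-index from a researcher's per-paper citation counts. This is an illustration only; the slides do not specify which indicators the toolbox will contain:

```python
def h_index(citations):
    """Largest h such that at least h papers have >= h citations each."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, count in enumerate(ranked, start=1):
        if count >= rank:
            h = rank
        else:
            break
    return h

# Example: five papers with these citation counts give an h-index of 4
# (four papers each have at least 4 citations).
print(h_index([10, 8, 5, 4, 3]))  # -> 4
```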
6. WG1 ON RESEARCH EVALUATION: DEADLINES
1. COMPOSITION of the working group: mid-November 2009
2. FIRST STEP (answers to the 3 questions, definition of the first 10 cases): end of February 2010
3. SUB-GROUPS (constitution): end of March 2010
4. SECOND STEP (survey of the literature, list of indicators, first elements of results for some of the 10 cases): end of June 2010
7. LINK WITH CHERPA (http://www.u-map.eu/)
- A European network charged with developing a ranking system that overcomes the limitations of the Shanghai and Times rankings
- 7 partners: CHE Germany, CHEPS Twente, CWTS Leiden, KUL, OST Paris, and 2 European federation/foundation organisations
- Objective: to develop and test an instrument that
  - overcomes the methodological problems of existing rankings (multidimensional)
  - enables field-based rankings (not only institutional rankings)
  - enables a wide range of HE institutions and programmes to find their place
  - is stakeholder-driven, with a set of appropriate indicators
- 2 pilot fields: Business and Engineering
- Diversity (the level of variety among entities) is taken into account:
  - U-MAP (horizontal diversity): a mapping of institutional profiles based on 6 dimensions (educational profile, student profile, research involvement, knowledge exchange, international orientation, regional engagement); see the sketch below
  - U-MULTIRANKING (vertical diversity): performance indicators (under discussion)
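To make the 6-dimension profile concrete, here is a hypothetical sketch of an institutional profile record. The dimension names come from the slides, but the field values and the single-value-per-dimension simplification are assumptions; U-Map's actual instrument uses multiple indicators per dimension:

```python
from dataclasses import dataclass

# Hypothetical sketch: each U-Map dimension reduced to one categorical level.
@dataclass
class InstitutionalProfile:
    educational_profile: str
    student_profile: str
    research_involvement: str
    knowledge_exchange: str
    international_orientation: str
    regional_engagement: str

# Illustrative values only, not taken from U-Map.
example = InstitutionalProfile(
    educational_profile="broad range of subjects",
    student_profile="mostly undergraduate",
    research_involvement="substantial",
    knowledge_exchange="moderate",
    international_orientation="high",
    regional_engagement="strong",
)
print(example)
```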
U-MAP [figure slide; see http://www.u-map.eu/]