UNIVERSITIES EVALUATIONS AND RANKINGS
Philippe VINCKE, Rector of the Université Libre de Bruxelles

Evolution of higher education
 New actors in higher education and research
 Increasing mobility of students and researchers
 Accountability of universities, transparency
 Evaluations, comparisons, rankings

Criticisms of the existing rankings (1)
 Competencies of the authors of the rankings
 Impossibility, for the reader, to reconstruct and verify the results (the rankings are not « scientific »)
 No information about the goals, the intended uses, the target audience
 Precise definition of « university »: are they all comparable?

Criticisms of the existing rankings (2)
 Choice of the criteria and of their relative importance
   Research
   Education
   Costs
   Services
   Social aspects
   National context, legislation
   Financial resources
 Choice of the indicators
 Data validation

Criticisms of the existing rankings (3)
 Bibliometrics
   Quality of the data
   Discrimination among the scientific fields
   Different traditions (journals, books, proceedings, number of authors, time span of valid research)
   Supremacy of publications in English
   Which indicators? (impact factor, citation index, h-index, …)
 Experts
   Do they exist?
   How to choose them?
   Which questions? How to treat the answers?
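Of the indicators listed above, the h-index is the easiest to pin down concretely: a researcher has index h if h of their papers have each been cited at least h times. A minimal sketch, with invented citation counts:

```python
def h_index(citations):
    """Largest h such that at least h papers have >= h citations each."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank          # the top `rank` papers all have >= rank citations
        else:
            break
    return h

papers = [25, 8, 5, 3, 3, 1, 0]   # citations per paper (hypothetical)
print(h_index(papers))            # -> 3: three papers have at least 3 citations
```

Even this "simple" indicator inherits every data-quality problem listed above: the result is only as good as the citation counts fed into it.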

Numerical « manipulations » (1)
 How is it possible to imagine that complex objects such as universities can be characterized by a single number?
 The weighted mean can exclude good candidates
  Example (two criteria, global score = mean):
  A: 41, 97 → 69
  B: 100, 38 → 69
  C: 68, 68 → 68
  C is ranked last although it is the only candidate that performs well on both criteria.
 Curious effects of normalization
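The weighted-mean pitfall can be checked in a few lines. Parsing the slide's flattened example as A = (41, 97), B = (100, 38), C = (68, 68) — an assumption about the original table — equal weights give:

```python
# Hypothetical parse of the slide's example: three candidates, two criteria.
scores = {"A": (41, 97), "B": (100, 38), "C": (68, 68)}
weights = (0.5, 0.5)  # equal weights, i.e. a plain mean

def global_score(vals, w=weights):
    return sum(v * wi for v, wi in zip(vals, w))

for name, vals in sorted(scores.items(), key=lambda kv: -global_score(kv[1])):
    print(name, global_score(vals))
# A and B tie at 69.0; C, the only candidate good on both
# criteria, comes last at 68.0
```

The compensatory nature of the weighted mean is the culprit: one extreme score buys back a very weak one.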

Raw scores on two criteria:

     (1)   (2)
A    200   0,5
B    112   0,175
C     40   0,37
D    160   0,045
E     88   0,24
F     16   0,435
G    136   0,11
H     64   0,305

Scores normalized by the maximum of each criterion (×100), global score = mean:

     (1)   (2)   Global score
A    100   100   100
B     56    35    45,5
C     20    74    47
D     80     9    44,5
E     44    48    46
F      8    87    47,5
G     68    22    45
H     32    61    46,5

RANKING: A, F, C, H, E, B, G, D

Raw scores after a single change — A's score on criterion (1) drops from 200 to 160; every other score is unchanged:

     (1)   (2)
A    160   0,5
B    112   0,175
C     40   0,37
D    160   0,045
E     88   0,24
F     16   0,435
G    136   0,11
H     64   0,305

Normalized scores (criterion (1) is now divided by the new maximum, 160):

     (1)   (2)   Global score
A    100   100   100
B     70    35    52,5
C     25    74    49,5
D    100     9    54,5
E     55    48    51,5
F     10    87    48,5
G     85    22    53,5
H     40    61    50,5

RANKING: A, D, G, B, E, H, C, F

Before: A, F, C, H, E, B, G, D
After: A, D, G, B, E, H, C, F
A single modification of A's score on one criterion, with no change in the scores of the other universities, rescales everyone else's normalized values (each criterion is divided by its maximum) and completely inverts the ranking of the remaining seven universities!
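The reversal can be reproduced with a short script. The raw scores below are reconstructed under the assumption that each criterion is divided by its maximum (×100) and that the global score is the mean of the two normalized values; only A's raw score on criterion (1) changes between the two runs.

```python
# Max-normalization reversal: lowering A's raw score on criterion 1 shrinks
# that criterion's maximum, so every OTHER university's normalized score
# on criterion 1 grows by the same factor -- and the ranking flips.

def ranking(raw):
    maxes = [max(v[i] for v in raw.values()) for i in range(2)]
    glob = {u: sum(100 * v[i] / maxes[i] for i in range(2)) / 2
            for u, v in raw.items()}
    return sorted(glob, key=glob.get, reverse=True)

raw = {"A": (200, 0.5), "B": (112, 0.175), "C": (40, 0.37),
       "D": (160, 0.045), "E": (88, 0.24), "F": (16, 0.435),
       "G": (136, 0.11), "H": (64, 0.305)}

print(ranking(raw))        # ['A', 'F', 'C', 'H', 'E', 'B', 'G', 'D']
raw["A"] = (160, 0.5)      # A drops on criterion 1 but stays (joint) best
print(ranking(raw))        # ['A', 'D', 'G', 'B', 'E', 'H', 'C', 'F']
```

Nothing about the other seven universities changed; only the normalization constant did.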

Other comments
 Rankings are contested but used
 Rankings have an influence on reality
 Excesses are possible (financial bonuses, incentives, …)
 Standardization effect

Conclusions
 The rankings relayed by the media are not scientifically valid at this stage
 Evaluation of research and higher education is a necessity
 But it must be carried out by competent people, in the context of a clear policy and with explicit goals
 There is no unique method applicable to all institutions

Main questions (1)
 Which « objects »?
   Universities (definition?)
   Education programmes
   Diplomas
   Research centers
   Research programmes
   …

Main questions (2)
 What does one want to do?
   To compare
   To select the « best(s) »
   To rank
   To define « homogeneous » categories
   To detect strong and weak points
   To allocate resources
   …

Main questions (3)
 For whom?
   External authorities, government, …
   Potential partners (universities, research centers, companies, …)
   External teachers or researchers
   Potential students
   Funding agencies
   Sponsors
   Public opinion, media
   Alumni
   Internal authorities
   Internal teachers or researchers
   Internal students
   …

Main questions (4)
For each « situation » (characterized by the answers to the three previous questions):
 Which indicators?
 What is the quality of the data?
 Which numerical treatment of the data?

Different approaches for different concrete questions
 Choose the « best » education programme for this student (« best » for him)
 Allocate financial resources to research centers
 Select the universities which could be « good » partners for this company
 Identify the strong points of these universities for students interested in studies in a given field
 Hence the necessity of an interactive decision-aid toolbox for each possible user and each question.
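A toy illustration of why one toolbox per user and question is needed: the same indicator values, weighted according to two hypothetical user profiles (all names and numbers below are invented), already point to two different « best » choices.

```python
# Same data, different users, different answers (all values hypothetical).
programmes = {
    "P1": {"research": 90, "teaching": 60, "affordability": 40},
    "P2": {"research": 55, "teaching": 85, "affordability": 80},
}

def best_for(weights):
    """Programme with the highest weighted sum for this user profile."""
    score = lambda p: sum(w * programmes[p][k] for k, w in weights.items())
    return max(programmes, key=score)

funding_agency = {"research": 0.7, "teaching": 0.2, "affordability": 0.1}
student        = {"research": 0.1, "teaching": 0.5, "affordability": 0.4}

print(best_for(funding_agency))  # P1
print(best_for(student))         # P2
```

A single published ranking silently fixes one set of weights for every reader; an interactive tool would let each user supply their own.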