Communication Challenges for Nordic Universities, NUAS Conference, 12-14 August 2012. Prof. Andrejs Rauhvargers, EUA Senior Adviser. Session A2b: Rankings and Performance Indicators.


Purpose and principles of the review
- Addresses the most popular global university rankings
- Provides universities with an analysis of the methodologies
- Only publicly accessible information was used
- Efforts were made to discover what is actually measured, how the scores for the indicators are calculated, how the final scores are calculated, and what the results actually mean

Global rankings cover no more than 3-5% of the world's universities

Decrease of scores within the Top 500 universities
How big can the scores of the remaining ~16,500 universities be?
[Chart: score vs. rank; data from the 2011/12 ranking tables]

Simple arithmetic has a huge influence on scores
- Where a composite score is calculated from several indicators, ranking providers assign weights to each indicator in the overall score. This means that the ranking provider's subjective judgement determines which indicators are more important (e.g. citations – 10%, reputation – 40%).
- If a ranking predominantly uses absolute values (ARWU, Webometrics), its scores are size-dependent, i.e. the ranking favours large universities.
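To make the arithmetic concrete, here is a minimal sketch (the institutions, indicator values, and weights are invented, not any provider's real data) showing how a weighted composite score works and how the chosen weights alone can reorder the same institutions:

```python
# A composite score is just a weighted sum of indicator scores, so the chosen
# weights decide which indicator dominates the final ordering.

def composite_score(indicators, weights):
    """Weighted sum of normalised indicator scores (0-100)."""
    return sum(weights[name] * value for name, value in indicators.items())

universities = {
    "University A": {"citations": 90, "reputation": 40},
    "University B": {"citations": 50, "reputation": 80},
}

# Two hypothetical weighting schemes applied to the same data.
weights_citation_heavy = {"citations": 0.6, "reputation": 0.4}
weights_reputation_heavy = {"citations": 0.1, "reputation": 0.9}

for weights in (weights_citation_heavy, weights_reputation_heavy):
    ranking = sorted(universities,
                     key=lambda u: composite_score(universities[u], weights),
                     reverse=True)
    print(weights, "->", ranking)
# The ordering flips between the two schemes even though the underlying
# indicator values never change.
```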

Coverage of the research mission of universities
Indicators:
- Publication count (SCI & SSCI, Scopus) – production
- Publication count in Nature & Science – excellence
- Publications per staff – staff research productivity
- Citation count – overall strength of the HEI
- Citations per paper or per staff – impact
- Citations to articles in the top-impact journals – excellence
- Research income
- Research reputation surveys
But there are also biases and flaws …

Rankings and teaching
Indicators:
- Alumni who have been awarded a Nobel Prize
- Staff/student ratio
- Reputation surveys (academics, students, employers)
- Teaching income
- Dropout rate
- Time to degree
- PhD/undergraduate ratio
All of the above are distant proxies, and some are questionable. Learning outcomes – are we there yet?

BIASES AND FLAWS
Bias towards natural sciences and medicine over the social sciences and humanities:
- Bibliometric indicators primarily cover journal publications and conference proceedings
- Natural and life scientists publish primarily in journals, engineering scientists in conference proceedings, and social scientists and humanities scholars in books
- Several indicators count publications by 22 broad subject areas

The 22 broad subject areas as defined by ISI: Agricultural Sciences; Biology & Biochemistry; Chemistry; Clinical Medicine; Computer Science; Ecology/Environment; Economics & Business; Engineering; Geosciences; Immunology; Materials Science; Mathematics; Microbiology; Molecular Biology & Genetics; Neuroscience; Pharmacology; Physics; Plant & Animal Science; Psychology/Psychiatry; Social Sciences, General; Space Sciences; Multidisciplinary

Different publication and citation cultures in different fields
Source: presentation by Cheng at the IREG 2010 conference, Berlin

Field normalisation – solutions and issues
Field-normalised citations per publication indicator (the Leiden 'crown indicator'):
CPP/FCSm = (Σ_i c_i) / (Σ_i e_i)
where c_i is the number of citations of publication i, and e_i is the expected number of citations of publication i given its field and year.
Criticisms: it favours older publications and blurs the picture.
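As a rough illustration of the formula above, the following sketch (not the official CWTS implementation; the publication data are invented) computes the crown indicator as the ratio of summed actual citations to summed expected citations:

```python
# Leiden 'crown indicator' as described above: total citations divided by the
# total expected citations for publications of the same field and year.

def crown_indicator(citations, expected):
    """CPP/FCSm = sum(c_i) / sum(e_i) over a university's publications."""
    return sum(citations) / sum(expected)

# Hypothetical publication set: actual citations and field/year expectations.
c = [10, 0, 4, 25]        # c_i: citations received by each publication
e = [5.0, 2.0, 4.0, 8.0]  # e_i: expected citations given field and year

print(crown_indicator(c, e))  # 39 / 19 ≈ 2.05, i.e. about twice the world average
```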

Mean-normalisation – solutions and issues
A new attempt (2010): the mean-normalised citation score (MNCS), the average of the per-publication ratios c_i / e_i.
A good idea, but the results are now unstable for the very newest publications.
To avoid this new flaw, an MNCS variant is used that leaves out publications of the last year.
Even so, a single publication may substantially change a university's score.
In the end this only improves the mathematics; it does not address the issue that WoS and Scopus insufficiently cover books.
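The following sketch (again with invented numbers, not CWTS code) contrasts MNCS with the crown indicator above: because MNCS averages per-publication ratios, one very recent paper with a tiny expected citation count can swing the whole score:

```python
# Mean-normalised citation score (MNCS): the average of per-publication ratios
# c_i / e_i, rather than a ratio of sums.

def mncs(citations, expected):
    """MNCS = (1/N) * sum(c_i / e_i)."""
    return sum(c / e for c, e in zip(citations, expected)) / len(citations)

c = [10, 0, 4, 25]
e = [5.0, 2.0, 4.0, 8.0]
print(mncs(c, e))  # ≈ 1.53

# Add one brand-new paper: it already has 2 citations, but its expected value
# is tiny because same-year papers have hardly been cited yet.
c_new = c + [2]
e_new = e + [0.1]
print(mncs(c_new, e_new))  # ≈ 5.2 – a single publication now dominates the score
```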

Other issues
- Language bias
- Regional biases
- The calculation of the total score reflects the ranking provider's understanding of quality
- Secondary effects of previous ranking positions
- Global league tables do not cover the 'third mission'

RANKERS THEMSELVES ISSUE WARNINGS
[Chart: percentages of «strongly agree» and «somewhat agree» combined; basis: 350 respondents in 30 countries; source: Thomson Reuters survey, 2011]

Rankings encourage perverse actions
- Improving performance specifically in the areas measured by rankings
- Concentrating funds and efforts on those aspects
- Paying less attention to issues that are not rewarded in ranking scores, such as quality of teaching, regional involvement, widening access, lifelong learning, and social issues of students and staff
- Merging HEIs solely for better scores in rankings
- Linking immigration policies to rankings
- Basing wider state HE policy solely on rankings
- Linking recognition of qualifications to rankings

Can rankings be improved?
- There will be no improvement from extending 5 distant proxies to 25 – they will still remain proxies...
- Improve the coverage of teaching, most probably through measuring learning outcomes
- Lift the biases and eradicate the flaws of bibliometric indicators (field, language, regional), but first of all address non-journal publications properly!
- Change rankings so that they really help students make their choices
- Although rankings address the elite only, their results affect the life of all universities – it is time to produce rankings that cover more universities!

U-Map classification profiles (EU)
Neutral process indicators – not for value judgements or ranking:
- Teaching and learning – levels and orientation of degrees, subject range
- Student profile – mature, distance, part-time students
- Research activity
- Knowledge exchange
- International orientation
- Regional engagement

Multi-indicator tool U-Multirank (EU)
- A combination of performance indicators
- Ranking based on one indicator, with scores on the other indicators displayed alongside
- No overall score
- Default set of 15 indicators
- Users can create their own set of indicators
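A minimal sketch of the idea (not U-Multirank's actual implementation; the institutions and indicator values are hypothetical): rank by one user-chosen indicator and display the other chosen indicators alongside, never summing them into a composite:

```python
# Multi-indicator view: sort by a single indicator, show the rest, no overall score.

universities = {  # hypothetical, self-reported style data
    "HEI A": {"student_satisfaction": 4.2, "citations_per_paper": 1.1, "intl_students_pct": 8},
    "HEI B": {"student_satisfaction": 3.6, "citations_per_paper": 2.3, "intl_students_pct": 21},
    "HEI C": {"student_satisfaction": 4.0, "citations_per_paper": 0.9, "intl_students_pct": 15},
}

def show(sort_by, columns):
    """Rank by one indicator; the other selected indicators are displayed, not combined."""
    for name in sorted(universities, key=lambda u: universities[u][sort_by], reverse=True):
        row = ", ".join(f"{col}={universities[name][col]}" for col in columns)
        print(f"{name}: {row}")

show(sort_by="citations_per_paper",
     columns=["citations_per_paper", "student_satisfaction", "intl_students_pct"])
```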

Multirank: default set of 15 indicators Source: Multirank presentation, 2011

U-Multirank
Teaching/learning – the same distant proxies, but many of them
Still to be seen:
- how well self-reported data will work in an international context
- how well student satisfaction data will work in an international context
- whether (or when?) other parties will turn Multirank into a league table, and what the consequences will be

Who will open the possibility of involving a greater number of universities?
- Global rankings? Not without changing their methodology
- Subject rankings? Maybe, but not if they remain mere 'side products' of global league tables, and not without extending data collection
- National/regional rankings?
- Multi-indicator profiling and ranking tools?

New visualisations of global rankings, incl. classifications and multi-indicator tools
- ARWU «Rankings Lab»: possibility to choose indicators and assign various weights
- ARWU GRUP (Global Research University Profiles): self-submitted data collection, 231 universities
- ARWU «Ranking by indicator (22)»: resembles Multirank
- Times Higher Education: subject rankings

New visualisations, classifications & multi-indicator tools
- ARWU «Ranking by indicator (22)»
- Thomson Reuters Profiles Project (reputation, funding, faculty characteristics)
- THE Reputation ranking
- THE iPhone tool
- THE and QS competing 'new universities' rankings
- QS subject rankings
- QS classifications (size, subject range, research intensity, age of university)
- QS Stars (8 criteria)
- QS city rankings

http.//
Thank you for your attention!
Andrejs Rauhvargers