
1 Rankings from the perspective of European universities
Tia Loukkola, Director of Institutional Development
EUA General Assembly, April 2015

2 EUA’s work on rankings in short
Council working group
Two reviews with a main focus on ranking methodologies:
- 2011: Global University Rankings and Their Impact
- 2013: Global University Rankings and Their Impact – Report II
RISP project:
- 2014: Rankings in Institutional Strategies and Processes: Impact or Illusion?
Mapping EUA members' participation in U-Multirank:
- 2015: Report on the experiences from the first round

3 Summary of key RISP findings
While highly critical of rankings, HEIs still use them to:
- fill an information gap
- benchmark
- inform institutional decision-making
- develop marketing material
Institutional processes affected by rankings fall into four categories:
- mechanisms to monitor rankings
- clarification of the institutional profile and adaptation of core activities
- improvements to institutional data collection
- investment in enhancing the institutional image

4 Conclusions from RISP
Institutions need to improve their capacity to generate comprehensive, high-quality data and information:
- to underpin strategic planning and decision-making
- to provide meaningful, comparative information about institutional performance to the public
Rankings can be an important ingredient in strategic planning; nevertheless, it is vital that each university stays "true" to its mission and is not "diverted or mesmerised" by rankings.
The report ends with guidelines on how institutions could use rankings for strategic purposes.

5 Background to UMR survey
A variety of views among the membership
EUA represented on the Advisory Board
First results published in May 2014
A survey to map the views and experiences of individual members on the initiative

6 EUA members in UMR
                                        2014    2015
UMR participants in total                879    1209
EUA members included, total             369*   454**
EUA members, active participation        286     357
EUA members, publicly available data      83      97
EUA members not included                 414     310
* 47.1% of EUA members in 2014
** 59.4% of EUA members in 2015

2014 survey results: 76.5% of those not included intended to take part, and the new edition shows that they did so. Their motivations were similar to those of members already included. For those not intending to contribute, the main concern is the resources required by the data collection.

7 UMR in short
Multidimensional ranking with seed funding from the EC
Indicators cover five areas: teaching and learning, research, knowledge transfer, international orientation and regional engagement.
Data provided by:
- the institutions directly
- international bibliometric and patent databases
- student surveys (completed by students at participating institutions)
Fields covered by UMR:
- 2014: business studies, electrical engineering, mechanical engineering and physics
- 2015: psychology, computer science and medicine

8 Survey results: 85 universities that actively participated in data provision

9 Reasons for active participation
More than seven in ten universities indicated that increasing their visibility and improving their profile abroad was the main motivation for participating in UMR. Over half of the universities in the sample also saw participation in UMR as an opportunity to benchmark the institution at an international level (Figure 1). These responses are in line with another recent EUA study, which showed that the main uses universities have for rankings are to fill an information gap about other universities, to benchmark against other universities (in particular internationally), to inform institutional decision-making and, not least, for marketing purposes (EUA 2014).

10 Views on UMR indicators

11 Resources required
Considerable resources were used to provide data:
- Only 12% involved fewer than 5 people
- Half involved 5-15 people
- 29.2% used more than 30 working days
- 20% spent less than 10 days
"The field data collection process is very time consuming. There were some difficulties in interpreting some definitions and to adjust them to the national context." (University from Portugal)

12 Using results
60% are using the UMR results in some way.

13 Cooperation with UMR consortium

14 Survey results: 7 universities included in UMR through publicly available data
"We cannot see how U-Multirank can overcome the differences in how data is interpreted among universities and between countries." (University from Sweden)
"We had concerns about the validity of the exercise, the cost of data collection and the difficulty of quality assurance for this self-reported data." (University from United Kingdom)
Because the sample is very small, it is perhaps better to show just these two quotes; they give an idea of the issues raised by this group.

15 Survey results: 34 universities not included in UMR

16 Reasons for not contributing to the data collection

17 Key findings
- There is increasing interest among EUA members in taking part in UMR
- Cooperation with the UMR consortium worked quite well
- The benefits of participating in UMR, or of using its results, remain unclear
- Data collection required considerable resources
- Concerns over the validity of the data, following difficulties in interpreting the indicators
- UMR struggles with the reliability and comparability of the data: how can this be overcome?

Whether or not a university took part in UMR, all expressed major concerns regarding the interpretation of the UMR indicators across different institutions and countries, and thus the validity of the data provided. This concern is backed up by responses from those actively providing data, who reported challenges in collecting the data in the format requested. Collecting data for UMR required considerable resources, and the amount of work surprised many of those providing data. The adequacy of the indicators in different institutional contexts was also a concern. Cooperation with the UMR consortium worked smoothly, although a small minority was not happy with the way their data were presented in the final results. The benefits of participation for an individual university are still rather unclear: four in ten universities have no plans to use the results of UMR or do not yet know how they would do so.

18 Conclusions
- Use of rankings at institutional level is not systematic
- Developing institutional research capacity is vital
- Do we need an international or European common dataset?

The survey results reflect some of the findings of the RISP study on the impact of rankings on institutional strategies and processes (EUA 2014): approaches to using rankings for the benefit of universities are not very systematic or thought through; developing institutional research capacity, so as to be able to respond swiftly and efficiently to requests for data, is vital; and there is a need to discuss whether an international common dataset on higher education would be possible, so as to overcome the challenges in applying the indicators used for various purposes.

Furthermore, the survey showed that UMR is still struggling with many of the same challenges as other rankings with regard to the comparability and reliability of data. Also, many of its indicators, in particular those related to teaching and learning, are rather remote proxies for quality or performance (EUA 2013). It will be interesting to see how UMR attempts to address and overcome these challenges in the future.

19 All publications are available at

