Rankings from the perspective of European universities

Presentation transcript:

Rankings from the perspective of European universities
Tia Loukkola, Director of Institutional Development
EUA General Assembly, April 2015

EUA’s work on rankings in short
- 2009-2010: Council working group
- Two reviews with a main focus on ranking methodologies: Global University Rankings and Their Impact (2011) and Global University Rankings and Their Impact – Report II (2013)
- 2012-2015: RISP project, with the 2014 report Rankings in Institutional Strategies and Processes: Impact or Illusion?
- Mapping EUA members’ participation in U-Multirank, with a 2015 report on the experiences from the first round

Summary of key RISP findings
While highly critical of rankings, HEIs still use them to:
- fill an information gap
- benchmark
- inform institutional decision-making
- develop marketing material
Institutional processes affected by rankings fall into four categories:
- mechanisms to monitor rankings
- clarification of the institutional profile and adaptation of core activities
- improvement of institutional data collection
- investment in enhancing the institutional image

Conclusions from RISP
Institutions need to improve their capacity to generate comprehensive, high-quality data and information:
- to underpin strategic planning and decision-making
- to provide meaningful, comparative information about institutional performance to the public
Rankings can be an important ingredient in strategic planning; nevertheless, it is vital that each university stays “true” to its mission and is not “diverted or mesmerised” by rankings.
The report ends with guidelines on how institutions could use rankings for strategic purposes.

Background to the UMR survey
- A variety of views among the membership
- EUA represented on the Advisory Board
- First results published in May 2014
- A survey to map the views and experiences of individual members on the initiative

EUA members in UMR (2014 / 2015)
- UMR participants in total: 879 / 1209
- EUA members included, total: 369* / 454**
- EUA members, active participation: 286 / 357
- EUA members, publicly available data: 83 / 97
- EUA members not included: 414 / 310
* 47.1% of EUA members in 2014
** 59.4% of EUA members in 2015
2014 survey results:
- 76.5% of those not included intended to take part in 2015 -> the new edition shows that they did so
- Motivations were similar to those of members already included
- For those not intending to contribute, the main concern is the resources required by the data collection
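The footnoted shares can be sanity-checked with simple arithmetic. Below is a minimal sketch in Python, assuming the shares refer to the “EUA members included, total” row; the total EUA membership it prints is derived from these figures rather than stated on the slide.

```python
# Illustrative arithmetic only: derive the total EUA membership implied by
# the slide's figures, assuming the footnoted shares (47.1%, 59.4%) refer
# to the "EUA members included, total" row.
included = {2014: 369, 2015: 454}   # EUA members included in UMR
share = {2014: 0.471, 2015: 0.594}  # share of all EUA members (footnotes)

for year in sorted(included):
    implied_total = included[year] / share[year]
    print(f"{year}: {included[year]} included = {share[year]:.1%} "
          f"of roughly {implied_total:.0f} EUA members")
```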

UMR in short
- Multidimensional ranking with seed-funding from the EC
- Indicators cover 5 areas: teaching and learning, research, knowledge transfer, international orientation and regional engagement
- Data provided by:
  - the institutions directly
  - international bibliometric and patent databases
  - student surveys (completed by students at participating institutions)
- Fields covered by UMR:
  - 2014: business studies, electrical engineering, mechanical engineering and physics
  - 2015: psychology, computer science and medicine

Survey results: 85 universities that actively participated in data provision

Reasons for active participation
More than seven in ten universities indicated that increasing their visibility and improving their profile abroad was the main motivation for participating in UMR. Over half of the universities in the sample also saw participation in UMR as an opportunity to benchmark the institution at an international level (Figure 1). These responses are in line with another recent EUA study, which showed that universities mainly use rankings to fill an information gap about other universities, to benchmark against other universities (in particular internationally), to inform institutional decision-making and, last but not least, for marketing purposes (EUA 2014, pp. 47-48).

Views on UMR indicators

Resources required
Considerable resources were used to provide data:
- Only 12% had fewer than 5 persons involved
- Half involved 5-15 persons
- 29.2% used more than 30 working days
- 20% spent less than 10 days
“The field data collection process is very time consuming. There were some difficulties in interpreting some definitions and to adjust them to the national context.” (University from Portugal)

Using results
60% are using the UMR results for something; the breakdown of those uses was shown in a chart on the slide.

Cooperation with the UMR consortium

Survey results: 7 universities included in UMR through publicly available data
“We cannot see how U-Multirank can overcome the differences in how data is interpreted among universities and between countries.” (University from Sweden)
“We had concerns about the validity of the exercise, the cost of data collection and the difficulty of quality assurance for this self-reported data.” (University from United Kingdom)
Because the sample is very small, it is perhaps better to show just these two quotes; they give an idea of the issues raised by this group.

Survey results: 34 universities not included in UMR

Reasons for not contributing to the data collection

Key findings
- There is increasing interest among EUA members in taking part in UMR
- Cooperation with the UMR consortium worked quite well
- Benefits of participation or use of UMR results remain unclear
- Data collection required considerable resources
- Concerns over the validity of the data, following difficulties in interpreting the indicators
- UMR struggles with the reliability and comparability of the data -> how to overcome this?
Whether a university took part in UMR or not, all expressed major concerns regarding the interpretation of the UMR indicators across different institutions and countries, and thus the validity of the data provided. This concern is backed up by responses from those actively providing data about the challenges of collecting the data in the format requested. Collecting data for UMR required considerable resources, and the amount of work surprised many of those providing data. The adequacy of the indicators in different institutional contexts was a concern. Cooperation with the UMR consortium worked smoothly, although a small minority was not happy with the way their data was presented in the final results. The benefits of participation for an individual university are still rather unclear: four in ten universities have no plans to use the results of UMR or do not yet know how they would do so.

Conclusions
- Use of rankings at institutional level is not systematic
- Developing institutional research capacity is vital
- Would we need an international or European common dataset?
The survey results reflect some of the findings of the RISP study on the impact of rankings on institutional strategies and processes (EUA 2014): the approaches to using rankings for the benefit of universities are not very systematic or thought-through; developing institutional research capacity so as to be able to respond swiftly and efficiently to requests for data is vital; and there is a need to discuss whether an international common dataset on higher education would be possible, so as to overcome the challenges in applying the indicators used for various purposes. Furthermore, the survey showed that UMR is still struggling with many of the same challenges as other rankings with regard to the comparability and reliability of data. Also, many of its indicators, in particular those related to teaching and learning, are rather remote proxies for quality or performance (EUA 2013). It will be interesting to see how UMR will attempt to address and overcome these challenges in the future.

All publications are available at http://www.eua.be/Publications.aspx