THE Rankings Methodology


THE Rankings Methodology THE WORLD UNIVERSITY RANKINGS What makes a great university? At Times Higher Education magazine we have been reporting on higher education for 40 years, and we have been publishing our World University Rankings for the last eight. So we have had a lot of time to reflect on the question of what makes a great university. And as our rankings have grown in reach and influence – informing student choices, helping to shape university and government strategy, even informing immigration laws in some countries – the question has become more and more vexed. With rankings so powerful, we naturally receive a great deal of feedback – often rather passionate, often rather partisan – on how excellence in higher education should be defined and, much more controversially, how it should be measured. Phil Baty, Editor, Times Higher Education World University Rankings

What makes a world class university? “Everyone wants one, no one knows what it is, and no one knows how to get one.” Philip Altbach, Boston College, US, on the “world class university” Well, to start with, we had to go back to first principles and ask ourselves: what makes a great university? Phil Altbach, from Boston College, rather unhelpfully summed up our problem when, speaking about what he called the “world class university”, he offered the quotation above. Of course, one of the strengths of higher education is the sheer diversity of the sector. As I said earlier, there are many different university missions: there are excellent regional and national teaching-led institutions, for example, excellent business-facing institutions, social enterprise universities, and private providers, some focussed on vocational skills and knowledge transfer. But for the World University Rankings, we agreed we needed to focus on a specific type of university, to make sure we are comparing like with like as much as possible, across all national borders – the type of universities that Phil Altbach would call “world class universities”. Of course, world class universities can have different histories, structures and subject mixes, but the universities we rank have common characteristics: they all push the boundaries of knowledge with highly influential original research, published in the world’s leading journals. They all compete in the global market for the very best students, and for the best scholars and university administrators. They all enjoy high incomes, allowing them to invest in the infrastructure needed to sustain a world-class environment and to compete to offer attractive salaries to staff.

Back to first principles: What makes a world class university? The key pillars: Teaching Knowledge Transfer Global outlook Research So we took as our starting point four inarguable key pillars: teaching; knowledge transfer; diversity and global outlook; and, of course, research. But it is a real challenge to turn these core components into a rigorous and balanced ranking, given the limited availability of data that can be collected on a global scale and fairly compared across nations.

World University Rankings methodology Teaching – the learning environment (30%) Reputation survey – Teaching (15%) Staff-to-student ratio (4.5%) PhDs awarded / Undergraduate degrees awarded (2.25%) PhDs awarded / Academic staff (6%) Institutional income / Academic staff (2.25%) For teaching, we have to accept that we cannot measure teaching quality directly: there are no globally comparable indicators of teaching outputs. But teaching is surely a fundamental role of any university, so we use a series of proxies. The key one is our academic reputation survey, which was carried out in spring 2011 and received 17,500 responses (up from 13,388 in 2010) from every corner of the world and from every discipline – more on that later. Staff-to-student ratios are not perfect – a crude measure, rather like judging a restaurant by counting its waiters – but they are valued by our survey respondents. Some rankings give this measure a weighting as high as 20 per cent; we give it just 4.5 per cent. PhDs awarded per undergraduate degree goes back to the Humboldtian idea that good teaching is informed by research: we believe that a high density of research students is the sign of a research-led, more knowledge-intensive teaching environment. Similarly, a high proportion of doctoral students for the size of the institution suggests teaching and supervision at the highest level. Institutional income is controversial – different nations face different economic circumstances – but we adjust for purchasing power parity. So it may be controversial, but there is no getting away from the fact that an institution that spends $10,000 per student is a different proposition to one that spends $20,000.
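Mechanically, each proxy is a ratio computed from institutional data, standardized across all ranked universities, and folded into the pillar with the weights shown above. The following is a minimal Python sketch of that pattern; the Z-score standardization step and all names are assumptions for illustration, not THE's or Thomson Reuters' actual code.

```python
import statistics

# Teaching-pillar weights quoted on the slide, as fractions of the overall score.
TEACHING_WEIGHTS = {
    "reputation_teaching": 0.15,
    "staff_student_ratio": 0.045,
    "phds_per_ug_degree": 0.0225,
    "phds_per_staff": 0.06,
    "income_per_staff": 0.0225,
}

def standardize(values):
    """Z-score one raw indicator across all ranked universities
    (an assumed standardization step, for illustration only)."""
    mean, sd = statistics.mean(values), statistics.stdev(values)
    return [(v - mean) / sd for v in values]

def teaching_pillar(indicator_scores):
    """Weighted sum of one university's indicator scores; the weights
    total 0.30, the teaching pillar's share of the overall ranking."""
    return sum(TEACHING_WEIGHTS[name] * score
               for name, score in indicator_scores.items())
```

The same weighted-sum pattern, with different weights, applies to each of the other pillars described below.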

World University Rankings methodology “I welcome the way the Times Higher Education is also trying to measure teaching and is recognising that that’s a crucial part of the university experience”. David Willetts, UK universities minister And I’m glad to say that our commitment to examine teaching has been praised by the universities minister, David Willetts.

World University Rankings methodology International Outlook – staff, students and research (7.5%) International students / Total students (2.5%) International academic staff / Total academic staff (2.5%) Scholarly papers with one or more international co-authors / Total scholarly papers (2.5%) It is clear to us that the ability of an institution to compete in a highly competitive global market for the best staff and the best students from all around the world is crucial. This is another controversial indicator, as geography comes into play (Switzerland versus America, for example), so it is not over-weighted: it carries half the weight it was given under our old system of 2004-2009. Also new for 2011-12 is international co-authorship of research papers. Ideas know no national borders, and the best researchers seek out the best in their field, from whatever country. We believe this is a fair proxy for excellent research, and it shows an institution operating at the global level.

World University Rankings methodology Industry Income – innovation (2.5%) Research income from industry / Academic staff (2.5%) We also feel it is essential to recognise the so-called “third mission” of a university, after teaching and research: working with industry, in today’s knowledge economy, is a crucial element of a modern university. This indicator is still something of a work in progress – we would like to add industrial co-authorship of research papers in future to enhance it – so for now it is given a low weighting.

World University Rankings methodology Research – volume, income and reputation (30%) Reputation survey – Research (18%) Research income (PPP) / Academic staff (6%) Scholarly papers / Academic staff + Research staff (6%) But with universities at the heart of national knowledge and innovation economies, research is the dominant issue in any serious global ranking of universities. In this indicator, we look at a university’s reputation for research excellence amongst informed and experienced peers within each discipline – based on the 17,500 responses to our reputation survey. But we also look at a university’s ability to attract research funding, in what in most countries is a competitive environment. A crucial factor here is that we normalise the research income data to reflect global averages by discipline – a grant for nuclear physics, of course, will be bigger than one for a piece of social science. We also believe there is a need for an indicator of research productivity, so we simply look at the volume of papers published in the leading academic journals indexed by Thomson Reuters, scaled for a university’s size.
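Normalising income to global averages by discipline can be pictured as dividing each subject's income per academic by the world-average figure for that subject, so an institution is not rewarded merely for working in expensive fields. Here is a hypothetical sketch of that idea; the field averages and names are invented for illustration, not THE's published figures or method.

```python
# Hypothetical world-average research income per academic, by discipline (PPP $);
# the figures are invented for illustration.
WORLD_AVG_INCOME = {"physical_sciences": 250_000, "social_sciences": 60_000}

def normalized_income(income_by_field, staff_by_field):
    """Income per academic in each field, expressed relative to the world
    average for that field, then averaged across the university's fields."""
    ratios = [(income_by_field[f] / staff_by_field[f]) / WORLD_AVG_INCOME[f]
              for f in income_by_field]
    return sum(ratios) / len(ratios)

# A sociology unit raising PPP $60,000 per academic scores 1.0 (world average),
# exactly like a physics unit raising PPP $250,000 per academic.
```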

World University Rankings methodology Citations – research influence (30%) Citation impact (normalised average citations per paper) (30%) But our “research influence” indicator is the flagship. This one indicator is worth a total of 30 per cent of the overall ranking score; we put it at the heart of the rankings. It is a really special indicator because, in a nutshell, it looks at the role of universities in spreading new knowledge and ideas. To do this for the 2011-12 rankings, Thomson Reuters analysed over 50 million citations of over 6 million research journal articles published over five years. These citation counts do not examine which inventions have the most patents, or which discoveries make the most money. In broad terms, these millions of citations simply help show us how much each university is contributing to the sum of human knowledge. They tell us whose research has stood out, has been picked up and built on by other scholars and, most importantly, has been shared around the global scholarly community to push further the boundaries of our collective understanding. They tell us which research has simply been the most influential in its field – whether it has deepened our understanding of the human condition in the arts and humanities, enhanced our democracy through free enquiry in the social sciences, or taken forward the fight against cancer in the life sciences. I explained in this morning’s first session how important it is to properly normalise the citations data: it is meaningless if it is not normalised.
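In standard bibliometric practice, "normalised average citations per paper" means each paper's citation count is divided by the world-average count for papers of the same field and publication year, and those ratios are averaged. This sketch shows the idea; the expected-citation figures are invented for illustration, and this is not Thomson Reuters' actual code.

```python
# Hypothetical world-average citations per paper by (field, publication year);
# figures invented for illustration.
EXPECTED_CITATIONS = {
    ("cell_biology", 2008): 25.0,
    ("mathematics", 2008): 3.0,
}

def citation_impact(papers):
    """Average of each paper's citations relative to the world average for
    its field and year: 1.0 means world average, 2.0 twice world average."""
    ratios = [cites / EXPECTED_CITATIONS[(field, year)]
              for field, year, cites in papers]
    return sum(ratios) / len(ratios)

# Without normalisation, 6 citations on a maths paper (twice its world
# average) would look far weaker than 20 citations on a cell-biology paper
# (below its world average).
print(citation_impact([("mathematics", 2008, 6),
                       ("cell_biology", 2008, 20)]))  # (2.0 + 0.8) / 2 = 1.4
```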

World University Rankings methodology Just let me quickly go back to that reputation survey I have mentioned, because this is a controversial area. As I said earlier today, the reputation indicator is one of the weakest elements of the QS system, so we had to work very hard to make sure our reputation survey was robust and transparent, and was a true piece of social science. The results, split over teaching and research, make up 33 per cent of a university’s total score in the Times Higher Education World University Rankings. It is a serious piece of work: 17,500 responses from all over the world, and very well balanced for discipline – see above. And…

World University Rankings methodology The survey is also well balanced for geography: it was distributed in multiple languages, with the geographical balance informed by UNESCO data.

Thomson Reuters data collection/analysis So, behind the scenes, Thomson Reuters collects every piece of data we need and analyses it for us. There are three main types of data: Thomson Reuters’ own bibliometric data, the results of the academic reputation survey, and the data institutions submit themselves to Thomson Reuters through what they call the “Global Institutional Profiles Project”: http://ip-science.thomsonreuters.com/globalprofilesproject/

Methodology in full So here’s the final methodology for 2011-12. This just helps to show the balance we’ve achieved between the 13 different indicators – two thirds research, but a healthy 30 per cent dedicated to the teaching environment. I’m pleased that the total weight given to the reputation survey – across teaching and research – is now down to 33 per cent. In one ranking, subjective surveys are worth 50 per cent, but we are keen to rely more on objective measures.
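For completeness, here is how the arithmetic of those weights works: a minimal Python sketch combining the five pillar scores into an overall score, using the headline weights quoted on the earlier slides. This illustrates the weighted sum only, not THE's production code, and the example pillar scores are invented.

```python
# Pillar weights as quoted across the preceding slides.
PILLAR_WEIGHTS = {
    "teaching": 0.30,
    "research": 0.30,
    "citations": 0.30,
    "international_outlook": 0.075,
    "industry_income": 0.025,
}

def overall_score(pillar_scores):
    """Weighted sum of the five pillar scores (each assumed on a 0-100 scale)."""
    return sum(PILLAR_WEIGHTS[p] * s for p, s in pillar_scores.items())

# A research-strong institution with weaker industry links:
print(overall_score({"teaching": 70, "research": 90, "citations": 95,
                     "international_outlook": 60, "industry_income": 40}))
# 0.30*70 + 0.30*90 + 0.30*95 + 0.075*60 + 0.025*40 = 82.0
```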

Reactions to the THE WUR system “Times Higher rankings – now increasingly seen as the gold standard” Ferdinand Von Prondzynski, vice chancellor, Robert Gordon University “The new methodology employed by Times Higher Education is less heavily weighted towards subjective assessments of reputation and uses more robust citation measures. This bolstered confidence in the evaluation method.” Steve Smith, president, Universities UK “I congratulate THE for reviewing the methodology to produce this new picture of the best in higher education worldwide.” David Willetts, UK minister for higher education and science “This year Times Higher Education consulted widely to pinpoint weaknesses in other ranking systems and in their previous approach…. These are welcome developments.” David Naylor, president, University of Toronto I will not pretend we do not have our critics, but the early reactions have been very positive. I like the first one!

2011-12: The results And here is a brief summary of the results for 2011-12 – the top ten. US institutions take seven of the top ten places.

Over to you
• Visit the Global Institutional Profiles Project website: http://science.thomsonreuters.com/globalprofilesproject
• See the results in full, with our interactive tables: http://bit.ly/thewur
• Join our rankings Facebook group: www.facebook.com/THEWorldUniRank
• Keep up to date with all the rankings news on Twitter: @THEWorldUniRank
But to make sure we are as accountable as it is possible to be, we need constant criticism and input. Only with the engagement of the higher education sector will we achieve a tool that is as rigorous, as transparent and as useful as the sector needs and deserves. So please use the sites and tools above to have your say and tell us what you think.

Thank you. Stay in touch. Phil Baty, Times Higher Education. T. 020 3194 3298 E. phil.baty@tsleducation.com