THE Rankings Methodology

Presentation on theme: "THE Rankings Methodology" – Presentation transcript:

1 THE Rankings Methodology
THE WORLD UNIVERSITY RANKINGS THE Rankings Methodology What makes a great university? At Times Higher Education magazine we have been reporting on higher education for 40 years, and we have been publishing our World University Rankings for the last eight. So we have had a lot of time to reflect on the question of what makes a great university. And as our rankings have grown in their reach and influence – informing student choices, helping to shape university and government strategy, even shaping immigration law in some countries – the question has become more and more vexed. With rankings so powerful, we of course get an awful lot of – often rather passionate, often rather partisan – feedback on how excellence in higher education should be defined and, much more controversially, how it should be measured. Phil Baty, Editor, Times Higher Education World University Rankings

2 What makes a world class university?
“Everyone wants one, no one knows what it is, and no one knows how to get one.” Philip Altbach, Boston College, US, on the “World Class University”. Well, to start with, we had to go back to first principles and ask ourselves: what makes a great university? Phil Altbach – from Boston College – rather unhelpfully summed up our problem. Speaking about what he called the “world class university”, he gave us the quote above. Of course, one of the strengths of higher education is the sheer diversity of the sector. As I said earlier, there are many different university missions. There are excellent regional and national teaching-led institutions, for example, excellent business-facing institutions, social enterprise universities, and private providers, some focussed on vocational skills and knowledge transfer. But for the world university rankings, we agreed we needed to focus on a specific type of university, to make sure we are comparing like with like as much as possible, across all national borders. The type of universities that Phil Altbach would call “world class universities”. Of course, world class universities can have different histories, structures and subject mixes, but the universities we rank have common characteristics: they all push the boundaries of knowledge with highly influential original research, published in the world’s leading journals. They all compete in the global market for the very best students, and for the best scholars and university administrators. They all enjoy high incomes, allowing them to invest in the infrastructure needed to sustain a world-class environment and to compete to offer attractive salaries to staff.

3 Back to first principles: What makes a world class university?
The key pillars: Teaching; Knowledge Transfer; Global Outlook; Research. So, we took as our starting point four inarguable key pillars: teaching; knowledge transfer; diversity/global outlook; and, of course, research. But it is a real challenge to turn these core components into a rigorous and balanced ranking, given the limited availability of data that can be collected on a global scale and fairly compared across nations.

4 World University Rankings methodology
Teaching – the learning environment (30%) Reputation survey – Teaching (15%) Staff-to-student ratio (4.5%) PhDs awarded / Undergraduate degrees awarded (2.25%) PhDs awarded / Academic staff (6%) Institutional income / Academic staff (2.25%) For teaching, we have to accept that we can’t measure teaching quality directly: there are no globally comparable indicators of teaching outputs. But teaching is surely a fundamental role of any university, so we use a series of proxies. The key is our academic reputation survey, which was carried out in spring 2011 and received 17,500 responses (up from 13,388 in 2010) from every corner of the world and from every discipline – more on that later. Staff-to-student ratios are not perfect – rather like judging a restaurant by counting its waiters – but they are valued by our survey respondents. Some rankings give this measure a weighting as high as 20 per cent; we give it just 4.5 per cent. PhDs awarded per undergraduate degree goes back to the Humboldtian idea that good teaching is informed by research: a high density of research students is a sign of a research-led, knowledge-intensive teaching environment. Similarly, a high proportion of doctoral students relative to the size of the institution suggests teaching and supervision at the highest level. Institutional income is controversial – different nations have different economic circumstances – but we adjust for purchasing power parity, and there is no getting away from the fact that an institution that spends $10,000 per student is a different proposition to one that spends $20,000.
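To make the weighting concrete, here is a minimal Python sketch, not THE's actual calculation, of how the teaching sub-indicators could be rolled up into the 30-point pillar. It assumes each indicator has already been standardised to a 0-100 score; the dictionary keys and the example values are purely illustrative.

```python
# A minimal sketch (not THE's actual code) of combining the teaching pillar's
# sub-indicators, assuming each has already been standardised to 0-100.
# Weights are those quoted on the slide and sum to 30, the pillar's share
# of the overall score.
TEACHING_WEIGHTS = {
    "reputation_teaching": 15.0,
    "staff_to_student":     4.5,
    "phd_per_ugrad":        2.25,
    "phd_per_staff":        6.0,
    "income_per_staff":     2.25,
}

def teaching_pillar_score(indicator_scores: dict) -> float:
    """Weighted contribution of the teaching pillar (out of 30 points)."""
    return sum(TEACHING_WEIGHTS[name] * indicator_scores[name] / 100.0
               for name in TEACHING_WEIGHTS)

# Hypothetical standardised indicator scores for one institution.
example = {"reputation_teaching": 80, "staff_to_student": 60,
           "phd_per_ugrad": 70, "phd_per_staff": 75, "income_per_staff": 50}
print(round(teaching_pillar_score(example), 2))  # 21.9 of a possible 30
```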

5 World University Rankings methodology
“I welcome the way the Times Higher Education is also trying to measure teaching and is recognising that that’s a crucial part of the university experience”. David Willetts, UK universities minister. And I’m glad to say that our commitment to examining teaching has been praised by the universities minister, David Willetts.

6 World University Rankings methodology
International Outlook – staff, students and research (7.5%) International students / Total students (2.5%) International academic staff / Total academic staff (2.5%) Scholarly papers with one or more international co-authors / Total scholarly papers (2.5%) It is clear to us that the ability of an institution to compete in a highly competitive global market for the best staff and the best students from all around the world is crucial. This is another controversial indicator, as geography comes into play (Switzerland vs America, etc.), so it is not over-weighted: it carries half the weight it had under our old system. Also new is the measure of international co-authorship of research papers. Ideas know no national borders, and the best researchers seek out the best in their field, from whatever country. We believe it is a fair proxy for excellent research, and shows an institution operating at the global level.

7 World University Rankings methodology
Industry Income – innovation (2.5%) Research income from industry / Academic staff (2.5%) We also feel it is essential to recognise the so-called “third mission” of a university, after teaching and research: working with industry, in today’s knowledge economy, is a crucial element of a modern university. This indicator is still something of a work in progress – we would like to include industrial co-authorship of research papers in future to enhance it – so for now it is given a low weighting.

8 World University Rankings methodology
Research – volume, income and reputation (30%) Reputation survey – Research (18%) Research income (PPP) / Academic staff (6%) Scholarly papers / Academic staff + Research staff (6%) But with universities at the heart of national knowledge and innovation economies, in any serious ranking of global universities research is the dominant issue. In this indicator, we look at a university’s reputation for research excellence amongst informed and experienced peers within each discipline – based on the 17,500 responses to our reputation survey. But we also look at a university’s ability to attract research funding, in what in most countries is a competitive environment. A crucial factor here is that we normalise the research income data to reflect global averages by discipline – a grant for nuclear physics, of course, will be bigger than one for a piece of social science. We also believe there is a need for an indicator of research productivity, so we simply look at the volume of papers published in the leading academic journals indexed by Thomson Reuters, scaled for a university’s size.
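The discipline normalisation of research income can be illustrated with a small sketch. The global-average figures and the simple averaging rule below are assumptions for illustration only; the presentation does not spell out the exact procedure used.

```python
# An illustrative sketch (assumptions, not THE's published procedure) of
# subject normalisation: income per academic in each discipline is divided
# by an assumed global average for that discipline, so a physics grant and
# a social-science grant are each judged against their own field.
GLOBAL_AVG_INCOME = {"physics": 250_000, "social_science": 40_000}  # hypothetical PPP $ per academic

def normalised_income(income_per_staff_by_field: dict) -> float:
    """Average of the per-discipline ratios to the (assumed) global averages."""
    ratios = [income_per_staff_by_field[field] / GLOBAL_AVG_INCOME[field]
              for field in income_per_staff_by_field]
    return sum(ratios) / len(ratios)

# A $300k-per-academic physics department scores 1.2 against its field,
# a $30k social-science department scores 0.75 against its own.
print(normalised_income({"physics": 300_000, "social_science": 30_000}))  # 0.975
```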

9 World University Rankings methodology
Citations – research influence (30%) Citation impact (normalized average citations per paper) (30%) But our “research influence” indicator is the flagship. This one indicator is worth a total of 30 per cent of the overall ranking score. We put it at the heart of the rankings. This is a really special indicator, because, in a nutshell, it looks at the role of universities in spreading new knowledge and ideas. To do this for the rankings, Thomson Reuters analysed over 50 million citations of over 6 million research journal articles published over five years. These citation counts do not examine which inventions have the most patents, or which discoveries make the most money. In broad terms these millions of citations help show us simply how much each university is contributing to the sum of human knowledge. They tell us whose research has stood out, has been picked up and built on by other scholars, and most importantly, has been shared around the global scholarly community to push further the boundaries of our collective understanding. They tell us which research has simply been the most influential in its field – whether it has deepened our understanding of the human condition in the arts and humanities, enhanced our democracy through free enquiry in the social sciences, or has taken forward the fight against cancer in the life sciences. I explained in this morning’s first session how important it is to properly normalise the citations data. It is meaningless if it is not normalised.
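Field normalisation is the crucial step here, so a short sketch may help. It follows the common bibliometric convention of dividing each paper's citation count by the world average for papers in the same field and year; the baseline values are invented, and this is not Thomson Reuters' exact calculation.

```python
# A minimal sketch of field-normalised citation impact under the standard
# bibliometric convention: each paper's citations are divided by the world
# average for its field and year, and the ratios are averaged.
# The baseline figures below are invented for illustration.
WORLD_BASELINE = {("oncology", 2009): 12.0, ("philosophy", 2009): 1.5}  # hypothetical averages

def citation_impact(papers):
    """papers: list of (field, year, citation_count) tuples."""
    ratios = [cites / WORLD_BASELINE[(field, year)]
              for field, year, cites in papers]
    return sum(ratios) / len(ratios)

# A philosophy paper with 3 citations (twice its field baseline) counts for
# as much as an oncology paper with 24 citations.
papers = [("oncology", 2009, 24), ("philosophy", 2009, 3)]
print(citation_impact(papers))  # 2.0
```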

10 World University Rankings methodology
Just let me quickly go back to that reputation survey I have mentioned, because this is a controversial area. As I said earlier today, the reputation indicator is one of the weakest elements of the QS system, so we had to work very hard to make sure our reputation survey was robust and transparent, and was a true piece of social science. The results, split over teaching and research, make up 33 per cent of a university’s total score in the Times Higher Education World University Rankings. It is a serious piece of work: 17,500 responses from all over the world, and very well balanced by discipline – see above. And…

11 World University Rankings methodology
The survey was also well balanced for geography: it was distributed in multiple languages, and the spread of responses was checked against Unesco data.

12 Thomson Reuters data collection/analysis
So, behind the scenes, Thomson Reuters collect every piece of data we need and analyse it for us. There are three main types of data: Thomson Reuters’ own bibliometric data, the results of the academic reputation survey, and the data institutions submit themselves to Thomson Reuters through what they call the “Global Institutional Profiles Project”.

14 Methodology in full So here’s the final methodology in full. This just helps to show the balance we’ve achieved between the 13 different indicators – roughly two thirds research, but a healthy 30 per cent dedicated to the teaching environment. I’m pleased that the total weight given to the reputation survey – across teaching and research – is now down to 33 per cent. In one ranking, subjective surveys are worth 50 per cent, but we are keen to rely more on objective measures.
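As a rough illustration of how the pillars combine into an overall result, here is a hedged sketch using the pillar weights quoted in this presentation; the pillar scores are hypothetical, and the real calculation involves indicator-level standardisation not shown here.

```python
# An illustrative roll-up (not THE's actual code) of the five pillar scores
# into an overall result, using the weightings quoted in this presentation.
PILLAR_WEIGHTS = {
    "teaching":              0.30,
    "research":              0.30,
    "citations":             0.30,
    "international_outlook": 0.075,
    "industry_income":       0.025,
}  # weights sum to 1.0

def overall_score(pillar_scores: dict) -> float:
    """Weighted overall score, assuming each pillar score is on a 0-100 scale."""
    return sum(PILLAR_WEIGHTS[p] * pillar_scores[p] for p in PILLAR_WEIGHTS)

# Hypothetical pillar scores for one institution.
scores = {"teaching": 75, "research": 80, "citations": 90,
          "international_outlook": 60, "industry_income": 40}
print(round(overall_score(scores), 1))  # 79.0
```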

15 Reactions to the THE WUR system
“Times Higher rankings -- now increasingly seen as the gold standard” Ferdinand Von Prondzynski, vice chancellor, Robert Gordon University “The new methodology employed by Times Higher Education is less heavily weighted towards subjective assessments of reputation and uses more robust citation measures. This bolstered confidence in the evaluation method.” Steve Smith, president, Universities UK “I congratulate THE for reviewing the methodology to produce this new picture of the best in higher education worldwide.” David Willetts, UK minister for higher education and science “This year Times Higher Education consulted widely to pinpoint weaknesses in other ranking systems and in their previous approach…. These are welcome developments.” David Naylor, president, University of Toronto I will not pretend we do not have our critics, but the early reactions have been very positive. I like the first one!

16 The results And here is a brief summary of the results: the top ten. The US takes seven of the top 10.

17 Over to you
• Visit the Global Institutional Profiles Project website:
• See the results in full, with our interactive tables:
• Join our rankings Facebook group.
• Keep up to date with all the rankings news on
But to make sure we’re as accountable as it is possible to be, we need constant criticism and input. Only with the engagement of the higher education sector will we achieve a tool that is as rigorous, as transparent and as useful as the sector needs and deserves. So please use the sites and tools above to make sure you have your say and tell us what you think.

18 Thank you. Stay in touch. Phil Baty, Times Higher Education

