1 Benchmarking Universities Worldwide: Existing Results and Future Efforts of Academic Ranking of World Universities. Presented by Dr. Ying CHENG, Institute of Higher Education, Shanghai Jiao Tong University, China. Observatoire des Sciences et des Techniques (OST), France. December 12, 2007, Université Libre de Bruxelles, Belgium.

2 Outline
- Purpose
- Methodologies & Results
- Problems and Discussion
- Ranking by Broad Subject Fields
- Future Efforts

3 Purpose of ARWU

4 Dream of Chinese for WCU
- A world-class university (WCU) is a dream for generations of Chinese. It is not only a matter of pride, but also a matter of the future of China.
- Recently, the Chinese government has launched several initiatives for research universities. The best-known one, the 985 Project, is specially designed to build WCUs.

5 Goals of Top Chinese Universities
- Many top Chinese universities have set up strategic goals of becoming WCUs.
- Most of them have also set timetables for reaching that goal, for example: 2016 for Peking University, 2020 for Tsinghua University.

6 Questions About WCU
- Is there a clear definition of a WCU?
- How many WCUs should there be in the world?
- What are the positions of top Chinese universities in the world?
- How can Chinese universities improve themselves to reach the goal of WCU?

7 Academic Ranking of World Universities
- Our original purpose in doing the Academic Ranking of World Universities (ARWU) was to find out the position of Chinese universities in the world and the gap between them and WCUs.
- ARWU was put on the internet at the encouragement of colleagues from all over the world. It has averaged 2000 visitors every day since 2003.

8 Features of ARWU
- ARWU uses a few carefully selected, objective criteria and internationally comparable data that anyone can verify in some way.
- It has been carried out by a ranking team (chaired by Professor Nian Cai LIU) in the Institute of Higher Education of Shanghai Jiao Tong University, out of academic interest.
- It has been done independently, without financial support from any source outside the Institute of Higher Education.

9 Methodologies & Results of ARWU

10 Selection of Universities
- Any university with Nobel Laureates, Fields Medalists, Highly Cited Researchers, or papers published in Nature or Science.
- Major universities of every country with a significant number of papers indexed by Thomson.
- Number of universities scanned: >2000
- Number of universities actually ranked: >1000
- Number of ranked universities on our web: 500

11 Ranking Criteria and Weights

12 Definition of Indicator: Alumni
- The total number of alumni of an institution winning Nobel Prizes and Fields Medals.
- Alumni are defined as those who obtained bachelor's, master's, or doctoral degrees from the institution.
- Different weights are set according to the period in which the degree was obtained, decreasing by 10 percentage points per decade: 100% for the most recent decade, 90% for the decade before, 80% for the one before that, and so on.
- If a person obtains more than one degree from an institution, the institution is counted only once.
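A minimal sketch of this decade-decayed alumni count; the exact decade boundaries are my assumption, since the slide does not spell them out:

```python
def decade_weight(year, newest_decade_end=2000):
    """Weight for a degree obtained in a given year: 100% for the most
    recent decade, minus 10 percentage points per earlier decade.
    The decade boundaries (decades ending in 2000, 1990, ...) are an
    assumption, not taken from the slide."""
    if year > newest_decade_end:
        return 1.0
    decades_back = (newest_decade_end - year) // 10
    return max(0.0, 1.0 - 0.1 * decades_back)

def alumni_score(degree_years):
    """Sum of decade weights over prize-winning alumni; each alumnus
    appears once per institution, as the slide requires."""
    return sum(decade_weight(y) for y in degree_years)
```

For example, alumni who graduated in 2000, 1990, and 1975 would contribute 1.0 + 0.9 + 0.8 points under these assumed boundaries.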

13 Definition of Indicator: Award
- The total number of staff of an institution winning Nobel Prizes in physics, chemistry, medicine, and economics, and Fields Medals in mathematics.
- Staff are defined as those who worked at the institution at the time of winning the prize.
- Different weights are set according to the period of winning the prize, decreasing by 10 percentage points per decade: 100% for winners since 2001, 90% for the previous decade, and so on.
- If a winner is affiliated with more than one institution, each institution is assigned the reciprocal of the number of institutions.
- For Nobel Prizes, if a prize is shared by more than one person, weights are set for winners according to their share of the prize.
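Combining these rules, the credit one institution receives from one laureate can be sketched as follows (a toy illustration of the rules on this slide, not the official computation):

```python
def award_credit(prize_share, n_affiliations, period_weight):
    """Credit for one laureate, per the slide's rules: the laureate's
    share of the prize (1.0 for Fields Medals and unshared Nobels),
    divided equally among the laureate's affiliations, then decayed
    by the weight of the period in which the prize was won."""
    return prize_share * period_weight / n_affiliations
```

For example, a laureate holding half a Nobel Prize, affiliated with two institutions, and winning in a 90%-weight period gives each institution 0.5 × 0.9 / 2 = 0.225 points.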

14 Definition of Indicator: HiCi
- The number of highly cited researchers in 21 broad subject categories in life sciences, medicine, physical sciences, engineering, and social sciences.
- The definition of the categories and the detailed procedures can be found on the website of the Institute for Scientific Information.
- The total number of HiCi researchers is about 6000, about 4000 of whom are university staff.

15 Definition of Indicator: N&S
- The annual average number of articles published in Nature and Science over the past five years.
- To distinguish the order of author affiliations, a weight of 100% is assigned to the corresponding author, 50% to the first author (or the second author, if the first author is the same as the corresponding author), 25% to the next author, and 10% to the other authors.
- Only publications of the article type are considered.
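The author-order weighting can be sketched as below; how the edge cases are resolved (e.g. which author is "next" when the corresponding author sits in the middle of the list) is my assumption, since the slide does not spell it out:

```python
def author_weights(n_authors, corresponding_idx=0):
    """Per-author affiliation weights for one Nature/Science article,
    following the slide's scheme (0-based positions): 100% for the
    corresponding author; 50% for the first author (or the second,
    when first and corresponding coincide); 25% for the next author;
    10% for all remaining authors."""
    w = [0.10] * n_authors
    w[corresponding_idx] = 1.00
    first = 0 if corresponding_idx != 0 else 1  # shift if they coincide
    if first < n_authors:
        w[first] = 0.50
    nxt = first + 1
    if nxt == corresponding_idx:  # assumed: skip over the corresponding author
        nxt += 1
    if nxt < n_authors:
        w[nxt] = 0.25
    return w
```

For a four-author paper whose first author is also corresponding, this yields weights [1.0, 0.5, 0.25, 0.1].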

16 Definition of Indicator: SCI
- The total number of articles indexed in the Science Citation Index-Expanded (SCIE) and the Social Science Citation Index (SSCI) in the past year.
- A weight of 2 is assigned to articles indexed in SSCI, to compensate for the bias against the humanities and social sciences.
- Only publications of the article type are considered.

17 Definition of Indicator: Size
- The subtotal of the scores of the above five indicators divided by the number of full-time-equivalent (FTE) academic staff.
- If the number of academic staff cannot be obtained for the institutions of a country, the total score of the above five indicators is used.
- For the 2007 ranking, the number of FTE academic staff was obtained for institutions in the USA, China, Australia, Italy, the Netherlands, Sweden, and Belgium, among others.
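The fallback rule on this slide can be written directly:

```python
def size_score(subtotal, fte_staff=None):
    """Per the slide: divide the five-indicator subtotal by FTE academic
    staff where staff numbers are available; otherwise fall back to the
    unadjusted subtotal."""
    return subtotal / fte_staff if fte_staff else subtotal
```

So an institution with a subtotal of 80 and 40 FTE staff scores 2.0 per capita, while one whose staff count is unknown keeps its raw subtotal of 80.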

18 Main Sources of Data
- Nobel Laureates:
- Fields Medals:
- Highly Cited Researchers:
- Articles published in Nature and Science:
- Articles indexed in SCIE and SSCI:

19 Results of ARWU
- Top 500 universities in the world
- Top 100 universities in North and Latin America
- Top 100 universities in Asia/Oceania
- Top 100 universities in Europe
- Statistics of top universities by region and country
- Percentage distribution of top universities by country, compared with shares of global population and GDP

20 Problems and Discussion of ARWU

21 Methodological: Education and Service
- Education is the basic function of any university; however, it would be impossible to rank the quality of education, due to the huge differences among national systems.
- Contribution to national economic development is becoming increasingly important for universities; however, it is almost impossible to obtain internationally comparable indicators and data.
- The academic or research performance of universities, a good indication of their reputation, can be ranked internationally.

22 Methodological: Humanities & Social Sciences
- Many well-known institutions specializing in the humanities and social sciences are ranked relatively low.
- Since 2004, the N&S indicator has not been considered for institutions specializing in the humanities and social sciences; its weight is relocated to the other indicators.
- Since 2005, a weight of 2 has been applied to articles indexed by SSCI.
- Nevertheless, if a university specializing in the social sciences and humanities has Nobel Laureates in economics and Highly Cited Researchers in the social sciences, it should have a good standing.
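The slide does not say how the N&S weight is relocated; one plausible reading is a proportional redistribution over the remaining indicators, sketched below with made-up weights:

```python
def reallocate(weights, dropped):
    """Redistribute a dropped indicator's weight proportionally across
    the remaining indicators, keeping the total at 1.0. Proportional
    redistribution is an assumption; the slide only says the weight
    'is relocated to other indicators'."""
    w = dict(weights)
    freed = w.pop(dropped)
    remaining = sum(w.values())
    return {k: v + freed * v / remaining for k, v in w.items()}
```

With hypothetical weights {"Alumni": 0.2, "SCI": 0.3, "N&S": 0.5}, dropping "N&S" yields {"Alumni": 0.4, "SCI": 0.6}, and the weights still sum to 1.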

23 Methodological: Language Bias
- English is the language of the international academic community.
- Almost any ranking based on academic performance will be biased towards institutions in English-speaking countries.
- One possible solution: papers published in non-native languages are given a special weight.
- Another possible solution: normalization of total articles by the proportion of journal editors from each country.

24 Methodological: Award and Alumni
- Universities founded after 1911 do not have a fair chance.
- Disciplines not related to the awarded fields do not have a fair chance. Other important awards include the Abel, Pulitzer, Turing, Tyler, and Pritzker prizes, etc.
- The institution where an award is won and the institution where the research was done may not be the same.
- The institution where a degree is obtained and the institution where the studies were pursued may not be the same.

25 Methodological: Per Capita Performance
- The weight of the Size indicator for per capita performance is rather low, so large institutions occupy relatively high positions in the ranking.
- However, it is very difficult to obtain internationally comparable data on the number of academic staff.
- Types of academic staff: purely teaching staff, teaching-and-research staff, purely research staff, etc.
- Ranks of academic staff: professor, associate professor, reader, lecturer, research scientist, etc.

26 Technical: Attributions
- Many universities have more than one commonly used name: e.g. Virginia Tech and Virginia Polytechnic Institute and State University.
- Variations due to translation: e.g. Univ Koln and Univ Cologne, Univ Vienna and Univ Wien.
- Abbreviated names: e.g. ETH Zurich for the Swiss Federal Institute of Technology Zurich.
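One common fix is an alias table that maps variant strings onto a canonical name before counting publications; the entries below are illustrative only, taken from the examples on this slide:

```python
# Illustrative alias table: variant form -> canonical institution name.
ALIASES = {
    "virginia tech": "Virginia Polytechnic Institute and State University",
    "univ koln": "University of Cologne",
    "univ cologne": "University of Cologne",
    "univ wien": "University of Vienna",
    "univ vienna": "University of Vienna",
    "eth zurich": "Swiss Federal Institute of Technology Zurich",
}

def canonical_name(raw):
    """Normalize case and whitespace, then look up the alias table;
    unknown names pass through unchanged."""
    key = " ".join(raw.lower().split())
    return ALIASES.get(key, raw)
```

In practice such tables have to be curated by hand, which is exactly why attribution is listed here as a technical problem.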

27 Technical: Definition of Institution
- University systems: e.g. the Univ California system, the Univ London system.
- Affiliated institutions and research organizations: e.g. Ecole Polytechnique Montreal (affiliated with the University of Montreal), CNRS labs (affiliated with French universities).
- Teaching and affiliated hospitals: complex!
- Our answer: follow the affiliation as expressed by the author.

28 Other Technical Problems
Merging, splitting, inheriting, discontinuing, and name changes of institutions, such as:
- Univ Kwazulu-Natal in South Africa, formed by the merger of Univ Natal and Univ Durban-Westville.
- The University of Innsbruck in Austria, split into Univ Innsbruck and Innsbruck Medical Univ.
- Humboldt Univ Berlin and Free Univ Berlin, both inheriting the Nobel Prizes of the Berlin University from before World War II.

29 Ranking by Broad Subject Fields (ARWU-FIELD)

30 Purpose of ARWU-FIELD
- Requests for rankings of world universities by broad subject fields (schools, colleges) and by subject fields (programs, departments).
- Many top Chinese universities want to learn their positions in the world by broad subject fields or disciplines.

31 Definition of Broad Subject Fields
- Natural Sciences and Mathematics (SCI)
- Engineering/Technology and Computer Sciences (ENG)
- Life and Agriculture Sciences (LIFE)
- Clinical Medicine and Pharmacy (MED)
- Social Sciences (SOC)
- Arts and humanities are not ranked.

32 ARWU-FIELD Indicators and Weights

Indicator  SCI   ENG   LIFE  MED   SOC
Alumni     10%   -     10%   10%   10%
Award      15%   -     15%   15%   15%
HiCi       25%   25%   25%   25%   25%
TOP        25%   25%   25%   25%   25%
PUB        25%   25%   25%   25%   25%
Fund       -     25%   -     -     -

33 Changes in Indicators and Definitions
- N&S in ARWU is not used in ARWU-FIELD.
- TOP is the percentage of articles published in the top 20% of journals in each broad subject field.
- Fund is the total engineering-related research expenditure; it is used only for the ENG ranking.
- Alumni and Award since 1951 are used for all field rankings except ENG.
- PUB is the total number of articles indexed by Thomson in the past year.
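Applying the ARWU-FIELD weights is then a weighted sum of an institution's indicator scores; the sketch below transcribes the SCI and ENG columns from the weights table (indicators absent from a field simply contribute nothing):

```python
# Weights for two fields, transcribed from the ARWU-FIELD table;
# ENG replaces Alumni and Award with Fund.
FIELD_WEIGHTS = {
    "SCI": {"Alumni": 0.10, "Award": 0.15, "HiCi": 0.25, "TOP": 0.25, "PUB": 0.25},
    "ENG": {"HiCi": 0.25, "TOP": 0.25, "PUB": 0.25, "Fund": 0.25},
}

def field_score(field, indicator_scores):
    """Weighted sum of an institution's (normalized) indicator scores
    for one broad subject field."""
    weights = FIELD_WEIGHTS[field]
    return sum(w * indicator_scores.get(name, 0.0) for name, w in weights.items())
```

An institution scoring 100 on every indicator scores 100 in both fields, since each column of weights sums to 100%.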

34 Results of ARWU-FIELD
- Top 100 universities in SCI
- Top 100 universities in ENG
- Top 100 universities in LIFE
- Top 100 universities in MED
- Top 100 universities in SOC
- Statistics of top universities by region & country
- List of top universities by number of top fields

35 Special Problems in ARWU-FIELD
- It is difficult to obtain data on engineering-related research expenditures and to make them comparable. For 2007, Fund was obtained only for US and Canadian universities.
- It is difficult to separate the Nobel Laureates in Physiology or Medicine; they are used in both the LIFE and MED rankings.

36 Future Efforts of ARWU

37 Study the Methodological and Technical Problems
Ideas and suggestions are welcome!

38 Update ARWU Annually
- To observe changes in universities
- To monitor countries' performance
- To serve as the basis of related academic studies and policy analysis

39 Provide Different Rankings
- Ranking by per capita performance, based on reliable and internationally comparable data on academic staff
- Ranking of universities specialized or strong in engineering, medicine, etc., based on a classification of world universities

40 Establish Database of WCU
- Basic characteristics: students, staff, programs, budget, …
- Other indicators for international comparison: R&D expenditure, bibliometric indicators, international academic awards, …
- Data at the college/department level

41 Final Remarks

42 Controversy of Ranking
- Any ranking is controversial, and no ranking is absolutely objective.
- University rankings have become popular in many countries. Whether we agree or not, ranking systems are clearly here to stay.
- The key issue then becomes how to improve ranking systems for the benefit of higher education.

43 Use of Ranking
- Rankings are tools for different purposes.
- Rankings should be used with caution; their methodologies must be read carefully before reporting or using their results.
- Rankings should be used in combination with other types of evaluation whenever possible.