1
IREG-4, Astana, 16 June 2009
Rickety Numbers: Volatility of international rankings of higher education and implications for policy making
Andrea Saltelli (andrea.saltelli@jrc.it)
European Commission Joint Research Centre (Ispra, Italy)
2
With: Michaela Saisana, Beatrice D'Hombres
Based on a CRELL/JRC report and a submission: http://crell.jrc.ec.europa.eu/
3
Outline
Motivation and objective of the study
Overview of the two international university rankings: SJTU ranking, THES ranking
Robustness (uncertainty & sensitivity analysis)
Policy implications
4
University rankings are used to judge the performance of university systems.
5
These rankings are relevant to today's discourse on higher-education reform in the EU. Even academics use SJTU: P. Aghion, M. Dewatripont, C. Hoxby, A. Sapir, "Higher aspirations: An agenda for reforming European universities" (Bruegel Blueprint Series No. 5, 2008).
6
Motivation and objective of the study
Two international university rankings are published yearly:
+ Very appealing for capturing a university's multiple missions in a single number
+ Allow one to situate a given university in the worldwide context
- Can lead to misleading and/or simplistic policy conclusions
Questions: Can we have confidence in university rankings? How much do the university ranks depend on the methodology (weighting scheme, aggregation, indicators)?
Approach: uncertainty analysis of the 2008/2007 SJTU and THES rankings.
7
Motivation and objective of the study
Overview of the two international university rankings: SJTU ranking, THES ranking
Robustness (uncertainty and sensitivity analysis)
Policy implications
8
SJTU ranking
PROS and CONS
Six "objective" indicators
Focuses on research performance; overlooks other university missions
Biased towards institutions intensive in the hard sciences
Favours large institutions
METHODOLOGY
Six indicators
Best performing institution = 100; scores of the other institutions calculated as a percentage of the best
Weighting scheme chosen by the rankers
Linear aggregation of the six indicators (see the sketch below)
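A minimal sketch, in Python, of the normalisation and linear aggregation just described. The indicator values for the three universities are made up for illustration; the weights are the SJTU nominal weights reported later in the variance-decomposition slide.

```python
import numpy as np

# Made-up raw values for three universities (rows) on the six SJTU
# indicators (columns). Only the scoring mechanics matter here.
raw = np.array([
    [30.0, 90.0, 100.0, 120.0, 4000.0, 70.0],   # University A
    [10.0, 40.0,  60.0,  80.0, 3500.0, 55.0],   # University B
    [ 0.0, 10.0,  25.0,  40.0, 2500.0, 38.0],   # University C
])

# SJTU nominal weights: 10%, 20%, 20%, 20%, 20%, 10%.
weights = np.array([0.10, 0.20, 0.20, 0.20, 0.20, 0.10])

# Normalisation: the best performer on each indicator scores 100,
# every other institution a percentage of that best score.
scores = 100.0 * raw / raw.max(axis=0)

# Linear aggregation: weighted sum of the normalised indicator scores.
composite = scores @ weights

print(composite)                      # one composite score per university
print(composite.argsort()[::-1] + 1)  # universities ordered from best to worst
```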
9
THES ranking
PROS and CONS
Attempts to take teaching quality into account
Two expert-based indicators account for 50% of the total: subjective indicators, lack of transparency
Substantial yearly changes in methodology
Measures research quantity
METHODOLOGY
Six indicators
z-score calculated for each indicator; best performing institution = 100; other institutions scored as a percentage of the best
Weighting scheme chosen by the rankers
Linear aggregation of the six indicators (see the sketch below)
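A minimal sketch of the THES-style z-score step. The slide only states that the best institution scores 100, so the exact mapping from z-scores to the 0-100 scale used here (a min-max rescaling) is an assumption for illustration.

```python
import numpy as np

# Made-up values on one THES indicator (e.g. citations per faculty).
x = np.array([95.0, 60.0, 40.0, 20.0])

# Standardise to z-scores.
z = (x - x.mean()) / x.std()

# Rescale so the best performer scores 100 and the worst 0; the exact
# rescaling used by THES is an assumption here.
score = 100.0 * (z - z.min()) / (z.max() - z.min())
print(score)
```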
10
THES and SJTU rankings: comparisons (2007)
1. They identify the same top-10 universities, among them Harvard, Cambridge, Princeton, Caltech, MIT, and Columbia
2. Greater variations in the middle to lower end of the rankings
3. Both the SJTU and the THES rankings show Europe lagging behind
11
[Chart comparing SJTU and THES ranks; RED = UK universities, all under the SJTU = THES line…]
12
Motivation and objective of the study
Overview of the two international university rankings: SJTU ranking, THES ranking
Robustness (uncertainty and sensitivity analysis)
Policy implications
13
JRC/OECD Handbook on composite indicators [… and ranking systems]
Methodology:
Step 1. Developing a theoretical framework
Step 2. Selecting indicators
Step 3. Imputation of missing data
Step 4. Multivariate analysis
Step 5. Normalisation of data
Step 6. Weighting and aggregation
Step 7. Robustness and sensitivity
Step 8. Back to the details (indicators)
Step 9. Association with other variables
Step 10. Presentation and dissemination
Two rounds of consultation with the OECD high-level statistical committee; finally endorsed in March 2008.
14
JRC/OECD Handbook on composite indicators [… and ranking systems]
Why the ten steps? The three pillars of a well-designed composite indicator (CI):
1. A solid conceptual framework
2. Good-quality data
3. A sound methodology (+ robustness assessment)
If the three conditions are met, then CIs can be used for policy.
"Composite indicators between analysis and advocacy", Saltelli, 2007, Social Indicators Research, 81:65-77.
15
Something more concise:
Saisana M., Saltelli A., Tarantola S. (2005), "Uncertainty and sensitivity analysis techniques as tools for the quality assessment of composite indicators", Journal of the Royal Statistical Society - A, 168(2), 307-323.
16
Why a robustness assessment?
"many indices rarely have adequate scientific foundations to support precise rankings: […] typical practice is to acknowledge uncertainty in the text of the report and then to present a table with unambiguous rankings" [Andrews et al. (2004)]
17
Why sensitivity analysis?
Peter Kennedy, A Guide to Econometrics, one of the ten commandments of applied econometrics:
«Thou shalt confess in the presence of sensitivity. Corollary: Thou shalt anticipate criticism»
18
When reporting a sensitivity analysis, researchers should explain fully their specification search so that the readers can judge for themselves how the results may have been affected.
19
Robustness analysis of SJTU and THES
SENSITIVITY ANALYSIS: activate simultaneously different sources of uncertainty that cover a wide spectrum of methodological assumptions, and estimate the FREQUENCY of the university ranks obtained in the different simulations.
Sources of uncertainty (70 scenarios): imputation, weighting, normalisation, number of indicators, aggregation. A minimal simulation sketch follows below.
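A minimal sketch of this kind of multi-scenario exercise, assuming made-up data and a small set of scenarios built by perturbing weights and switching normalisation and aggregation rules; the actual study used 70 structured scenarios that also varied imputation and the set of indicators.

```python
import numpy as np

rng = np.random.default_rng(0)

# Made-up data: 20 universities x 6 indicators (illustrative only).
raw = rng.uniform(0.0, 100.0, size=(20, 6))
n_univ = raw.shape[0]

def normalise(x, how):
    """Two of the normalisation choices varied across scenarios."""
    if how == "best=100":                          # SJTU-style
        return 100.0 * x / x.max(axis=0)
    return (x - x.mean(axis=0)) / x.std(axis=0)    # z-scores, THES-style

def aggregate(scores, w, how):
    """Linear vs. weighted geometric aggregation (a plausible alternative)."""
    if how == "linear":
        return scores @ w
    shifted = scores - scores.min(axis=0) + 1.0    # keep values positive
    return np.exp(np.log(shifted) @ w)

# Tally how often each university obtains each rank across scenarios.
rank_counts = np.zeros((n_univ, n_univ), dtype=int)
n_scenarios = 0
for _ in range(20):
    w = rng.dirichlet(np.ones(6))                  # perturbed weights
    for norm in ("best=100", "z-score"):
        for agg in ("linear", "geometric"):
            comp = aggregate(normalise(raw, norm), w, agg)
            ranks = (-comp).argsort().argsort()    # 0 = best rank
            rank_counts[np.arange(n_univ), ranks] += 1
            n_scenarios += 1

rank_freq = rank_counts / n_scenarios              # frequency of each rank per university
print(rank_freq[0])   # e.g. the rank distribution of university 0
```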
20
Robustness analysis of SJTU and THES
Are these 70 scenarios a plausible set? Can we take this as a possible description of a specification search? Are factor analysis, DEA, and Borda far-fetched?
21
SJTU: simulated ranks - Top 20
Harvard, Stanford, Berkeley, Cambridge, MIT: in the top 5 in more than 75% of our simulations.
Univ. California San Francisco: original rank 18th, but could be ranked anywhere between the 6th and the 100th position.
Impact of assumptions: much stronger for the middle-ranked universities.
22
Going into the details: U. of CA, San Francisco. Why volatility? Three indicators well above the average, and a single indicator with a zero.
23
THES: simulated ranks - Top 20
The impact of the uncertainties on the university ranks is even more apparent.
MIT: ranked 9th, but this rank is confirmed in only 13% of simulations (plausible range [4, 35]).
Very high volatility also for universities ranked in the 10th-20th positions, e.g. Duke Univ., Johns Hopkins Univ., Cornell Univ.
24
SJTU: simulated ranks - Full set (503 universities)
25
THES: simulated ranks - Full set (400 universities)
26
Variance decomposition
Objective: to assess the internal consistency (and possible dominance issues) in the framework of indicators, by comparing nominal weights with effective weights.
[Effective-weights terminology according to: Stanley, J.C., Wang, M.D., 1968, Differential weighting: a survey of methods and empirical studies, Johns Hopkins Univ., Baltimore.]
27
Variance decomposition

THES framework (indicator | correlation with THES score | nominal weight | effective weight):
Academic review | 0.88 | 40.0% | 52.0%
Recruiter review | 0.67 | 10.0% | 10.6%
Teacher-to-student ratio | 0.43 | 20.0% | 13.2%
Citations per faculty | 0.61 | 20.0% | 18.1%
International staff | 0.31 | 5.0% | 2.8%
International students | 0.40 | 5.0% | 3.4%

SJTU framework (indicator | correlation with SJTU score | nominal weight | effective weight):
Alumni winning Nobel prizes | 0.80 | 10.0% | 9.5%
Staff winning Nobel prizes | 0.85 | 20.0% | 21.7%
Highly cited researchers | 0.90 | 20.0% | 22.3%
Articles in Nature & Science | 0.93 | 20.0% | 20.4%
Articles in Science & Social Science Citation Index | 0.79 | 20.0% | 18.8%
Academic performance relative to size | 0.84 | 10.0% | 7.3%
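One way to make the notion of effective weight concrete: for a linear composite, the variance of the score can be decomposed into covariance contributions of the weighted indicators, and each indicator's share of that variance can be read as its effective weight. This covariance decomposition is only a plausible sketch; the underlying study uses variance-based sensitivity measures (Saisana et al., 2005).

```python
import numpy as np

def effective_weights(X, w):
    """Share of the composite's variance attributable to each indicator.

    For a linear composite Y = X @ w,
        Var(Y) = sum_i w_i * Cov(X_i, Y),
    so share_i = w_i * Cov(X_i, Y) / Var(Y). The shares sum to 1 and can
    differ markedly from the nominal weights w when indicators have
    different variances or are correlated.
    """
    Y = X @ w
    cov = np.array([np.cov(X[:, i], Y, ddof=0)[0, 1] for i in range(X.shape[1])])
    return w * cov / Y.var()

# Made-up normalised data: 50 universities x 6 indicators, with the SJTU
# nominal weights from the table above.
rng = np.random.default_rng(1)
X = rng.uniform(0.0, 100.0, size=(50, 6))
w = np.array([0.10, 0.20, 0.20, 0.20, 0.20, 0.10])

print(effective_weights(X, w))   # compare with the nominal weights w
```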
28
Impact of scenarios 2007 THES (88 universities, 70 models/scenarios). Examples:
29
Impact of scenarios
Pick a scenario, say S25. In S25, how much does each university shift in rank with respect to its original rank?
What is the median shift across all universities? For S25 it is ~11.
What is the 90th percentile of the shift? For S25 it is ~20.
(DEA scenarios are marked separately in the chart.) A sketch of these summary statistics follows below.
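A small sketch of how these per-scenario summaries can be computed, with made-up ranks standing in for the original and scenario rankings:

```python
import numpy as np

def rank_shift_summary(original_rank, scenario_rank):
    """Median and 90th-percentile absolute rank shift for one scenario.

    Both arguments are arrays of ranks (one entry per university): the
    original published rank and the rank obtained under the scenario.
    """
    shift = np.abs(np.asarray(scenario_rank) - np.asarray(original_rank))
    return np.median(shift), np.percentile(shift, 90)

# Made-up ranks for eight universities, purely to show the calculation.
original = np.array([1, 2, 3, 4, 5, 6, 7, 8])
scenario = np.array([2, 1, 7, 3, 8, 4, 6, 5])

median_shift, p90_shift = rank_shift_summary(original, scenario)
print(median_shift, p90_shift)
```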
30
One last point on methodology
Robustness analysis can also be used in the process of building an index… not only to criticise an existing one!
31
Examples of 'anticipatory' use of SA:
2009 Ibrahim Index of African Governance (Ibrahim Foundation & Harvard Kennedy School)
2008 Product Market Regulation Index (OECD)
2008 European Lifelong Learning Index (Bertelsmann Foundation, CCL)
2008/2006 Environmental Performance Index (Yale & Columbia University)
2007 Alcohol Policy Index (New York Medical College)
2007 Composite Learning Index (Canadian Council on Learning)
2002/2005 Environmental Sustainability Index (Yale & Columbia University)
32
Motivation and objective of the study
Overview of the two international university rankings: SJTU ranking, THES ranking
Robustness (uncertainty and sensitivity analysis)
Policy implications
33
Policy Implications
1. The rank assigned to a university largely depends on the methodological assumptions made in compiling the two rankings. Nine in ten universities shift more than 10 positions in the 2008 SJTU ranking: 92 positions (Univ. Autonoma Madrid) and 277 positions (Univ. Zaragoza) in Spain, 71 positions (Univ. Milan) and 321 positions (Polytechnic Inst. Milan) in Italy, 22 positions (Univ. Paris 06) and 386 positions (Univ. Nancy 1) in France.
2. The THES ranking is less robust than the SJTU ranking.
34
Policy Implications
3. The compilation of university rankings should always be accompanied by a robustness analysis. The multi-modelling approach can offer a picture representative of the plurality of opinions on how to assess university performance (see personalised indices), but…
4. Do-it-yourself indices are good for analysis, but less so for advocacy.
35
Policy Implications
5. Indicators and league tables are enough to start a discussion on higher-education issues, but not sufficient to conclude it (Aghion et al., 2008).
END
http://composite-indicators.jrc.ec.europa.eu/