
1 Using Rankings to Drive Internal Quality Improvements: The Asian Experience Dr. Kevin Downing Director of Knowledge, Enterprise and Analysis City University of Hong Kong

2 Dominant Global Ranking Systems
Session Flow: 1. Rising Asia – Rising Africa? 2. Dominant Global Ranking Systems 3. Benefits of Rankings 4. Conclusion

3 Ranking Systems - Criteria and Weighting
Three systems dominate: the Times Higher Education World University Rankings (THE), the Academic Ranking of World Universities (ARWU) and the QS World University Rankings (QS-WUR).

ARWU (Shanghai Jiao Tong): The criteria of total number of alumni/staff winning Nobel Prizes and Fields Medals are biased towards traditional ‘Ivy League’ universities. These two criteria make it very difficult for younger institutions to enter the elite top 100, and hence explain the stability of the ARWU rankings over the years.

Times Higher Education (THE): Five categories (with 13 indicators) are included in the THE ranking:
• Teaching — the learning environment (worth 30 per cent of the final ranking score)
• Research — volume, income and reputation (worth 30 per cent)
• Citations — research influence (worth 32.5 per cent)
• Industry income — innovation (worth just 2.5 per cent)
• International mix — staff and students (worth 5 per cent)

1) Teaching — the learning environment (30%). This broad category employs five separate indicators:
Reputational survey on teaching (15%): The flagship indicator for this category uses the results of a reputational survey on teaching, which examines the perceived prestige of institutions in both research and teaching.
PhDs awarded per academic (6%): The teaching category also uses data on the number of PhDs awarded by an institution, scaled against its size as measured by the number of academic staff. As well as giving a sense of how committed an institution is to nurturing the next generation of academics, a high proportion of postgraduate research students suggests teaching at the highest level that is attractive to graduates and good at developing them. Undergraduate students also tend to value working in a rich environment that includes postgraduates.
Undergraduates admitted per academic (4.5%): Measures the number of undergraduates admitted by an institution scaled against the number of academic staff. Essentially a form of staff-to-student ratio, this measure is employed as a proxy for teaching quality — suggesting that where there is a low ratio of students to staff, students will get the personal attention they require from the institution's faculty.
Income per academic (2.25%): A simple measure of institutional income scaled against academic staff numbers. This figure, adjusted for purchasing-power parity so that all nations compete on a level playing field, indicates the general status of an institution and gives a broad sense of the general infrastructure and facilities available to students and staff.
PhDs awarded per bachelor's degree awarded (2.25%): The teaching category also examines the ratio of PhDs to bachelor's degrees awarded by each institution. Institutions with a high density of research students are considered more knowledge-intensive, and the presence of an active postgraduate community is a marker of a research-led teaching environment valued by undergraduates and postgraduates alike.

2) Citations — research influence (32.5%). A university's research influence — as measured by the number of times its published work is cited by academics — is the largest of the broad ranking categories, worth just under a third of the overall score. This weighting reflects the relatively high level of confidence the global academic community has in the indicator as a proxy for research quality. The use of citations to indicate quality is controversial — their use in distributing more than £1.5 billion a year in UK research funding under the forthcoming research excellence framework, for example, has been dramatically scaled back after lengthy consultation. Nevertheless, there is clear evidence of a strong correlation between citation counts and research performance. The data are drawn from the 12,000 academic journals indexed by Thomson Reuters' Web of Science database. The figures are collected for every university, with data aggregated over a five-year period from 2004 to 2008 (there has been insufficient time to accumulate such data for articles published in 2009 and 2010). Unlike the approach employed by the old rankings system, all the citation-impact data are normalised to reflect variations in citation volume between different subject areas. This means that institutions with high levels of research activity in subjects with traditionally very high citation counts no longer gain an unfair advantage.

3) Research — volume, income and reputation (30%). As with the teaching category, the most prominent indicator here is based on the results of the reputational survey.
Reputational survey on research (19.5%): Consultation with THE's expert advisers suggested that confidence in this indicator was higher than in the teaching reputational survey, as academics are likely to be more knowledgeable about the reputation of research departments in their specialist fields. For this reason, it is given a higher weighting.
Research income (5.25%): Determined by a university's research income, scaled against staff numbers and normalised for purchasing-power parity. This is a controversial measure, as it can be influenced by national policy and economic circumstances. But research income is crucial to the development of world-class research, and because much of it is subject to competition and judged by peer review, the experts suggested it was a valid measure.
Papers per academic and research staff (4.5%): The research category also includes a simple measure of research volume scaled against staff numbers: the number of papers published in the academic journals indexed by Thomson Reuters per staff member, giving an idea of an institution's ability to get papers published in quality peer-reviewed journals.
Public research income (0.75%): Some 2.5 per cent of the category — worth just 0.75 per cent overall — is a measure of public research income against an institution's total research income. This has a low weighting to reflect concerns about the comparability of self-reported data between countries.

4) International mix — staff and students (5%). This category looks at diversity on campus — a sign of how global an institution is in its outlook.
Ratio of international to domestic staff (3%): The ability of a university to attract the very best staff from across the world is key to global success. The market for academic and administrative jobs is international in scope, and this indicator suggests global competitiveness. However, it is a relatively crude proxy, and geographical considerations can influence performance.
Ratio of international to domestic students (2%): Again, this is a sign of an institution's global competitiveness and its commitment to globalisation. As with the staff indicator, consultation revealed concerns about the inability to gauge the quality of students and the problems caused by geography and tuition-fee regimes.

5) Industry income — innovation (2.5%). This category is designed to cover an institution's knowledge-transfer activity. It is determined by just a single indicator: a simple figure giving an institution's research income from industry scaled against the number of academic staff. THE plans to supplement this category with additional indicators in the coming years, but at present this is the best available proxy for high-quality knowledge transfer. It suggests the extent to which users are prepared to pay for research and a university's ability to attract funding in the commercial marketplace — which are significant indicators of quality. However, because the figures provided by institutions for this indicator were patchy, the category has a relatively low weighting: just 2.5 per cent of the overall ranking score.
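The overall THE score is simply a weighted sum of these indicator scores. A minimal sketch of that arithmetic in Python: the weights are the percentages listed above, but the indicator names, the function and the example institution are illustrative assumptions, not THE's actual implementation.

# Minimal sketch of a THE-style overall score: a weighted sum of indicator
# scores, each assumed to be pre-scaled to 0-100. The weights below are the
# percentages quoted in this slide; the example institution is invented.

THE_WEIGHTS = {
    "teaching_reputation":     0.150,
    "phd_per_academic":        0.060,
    "undergrad_per_academic":  0.045,
    "income_per_academic":     0.0225,
    "phd_per_bachelor":        0.0225,
    "citations":               0.325,
    "research_reputation":     0.195,
    "research_income":         0.0525,
    "papers_per_academic":     0.045,
    "public_research_income":  0.0075,
    "international_staff":     0.030,
    "international_students":  0.020,
    "industry_income":         0.025,
}

def overall_score(indicator_scores):
    """Weighted sum of the 13 indicator scores (each on a 0-100 scale)."""
    return sum(weight * indicator_scores.get(name, 0.0)
               for name, weight in THE_WEIGHTS.items())

# Sanity check: the 13 weights add up to 100 per cent.
assert abs(sum(THE_WEIGHTS.values()) - 1.0) < 1e-9

example = {name: 70.0 for name in THE_WEIGHTS}        # hypothetical institution
print(f"Overall score: {overall_score(example):.1f}")  # prints 70.0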

4 QS-WUR vs. ARWU (Top 200) Concordance Analysis – Kendall’s W (2007-2010)
The top 200 universities in the QS-WUR and ARWU were grouped into four classes of 50 according to their 2009 rankings, namely Top 50, Top 51 to 100, Top 101 to 150, and Top 151 to 200. The analysis of concordance of rankings over the past 3 years (2007 to 2009) indicated that the ARWU (Shanghai Jiaotong Rankings) is much more stable than the QS-WUR. * Kendall’s W is a non-parametric statistic to assess agreement between rankings. Kendall's W ranges from 0 (no agreement) to 1 (complete agreement).
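For reference, Kendall's W for m yearly rankings of the same n universities is W = 12S / (m²(n³ − n)), where S is the sum of squared deviations of each university's rank total from the mean rank total. A minimal sketch in Python/NumPy; the example rankings are invented, not the QS-WUR or ARWU data analysed on this slide.

import numpy as np

def kendalls_w(ranks):
    """Kendall's coefficient of concordance W (no correction for ties).

    ranks: (m, n) array, each row one year's ranking (1..n) of the same
    n universities. Returns W in [0, 1]: 0 = no agreement, 1 = complete.
    """
    m, n = ranks.shape
    rank_totals = ranks.sum(axis=0)                      # R_i per university
    s = ((rank_totals - rank_totals.mean()) ** 2).sum()  # squared deviations
    return 12.0 * s / (m ** 2 * (n ** 3 - n))

# Invented example: three years of rankings for five universities.
years = np.array([
    [1, 2, 3, 4, 5],
    [1, 3, 2, 4, 5],
    [2, 1, 3, 5, 4],
])
print(f"Kendall's W = {kendalls_w(years):.3f}")  # ~0.84, fairly stable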

5 QS-WUR vs. ARWU (Asian Universities in Top 300) Concordance Analysis – Kendall’s W (2007-2010)
Asian universities in the top 300 of QS-WUR and ARWU were grouped into three classes according to their 2009 rankings, namely Top 100, Top 101 to 200, and Top 201 to 300. The analysis of concordance of rankings for Asian universities over the past 3 years (2007 to 2009) indicated that whilst the rankings of Asian universities in the ARWU (Shanghai Jiaotong Rankings) were relatively stable over the past 3 years, moderate fluctuations were found in the rankings of Asian universities in the QS-WUR.

6 QS-WUR vs. ARWU (Top 200) Mean Standard Deviation of Ranks (Over the Past 4 Years)
In all four classes of universities, the QS rankings show larger fluctuations than the ARWU over the period examined.
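One reading of "mean standard deviation of ranks" is: for each university, take the standard deviation of its rank across the four years, then average those values within each class of 50. A small sketch under that assumption (Python/pandas); the data frame, numbers and class labels are purely illustrative.

import pandas as pd

# Invented frame: one row per university, one column per year's rank, plus
# the class assigned from the 2009 ranking (Top 50, Top 51-100, ...).
df = pd.DataFrame({
    "university": ["A", "B", "C", "D"],
    "class_2009": ["Top 50", "Top 50", "Top 51-100", "Top 51-100"],
    "rank_2007":  [12, 40, 60, 95],
    "rank_2008":  [15, 38, 72, 90],
    "rank_2009":  [11, 45, 66, 99],
    "rank_2010":  [14, 41, 80, 85],
})

year_cols = ["rank_2007", "rank_2008", "rank_2009", "rank_2010"]
df["rank_sd"] = df[year_cols].std(axis=1)           # per-university volatility
print(df.groupby("class_2009")["rank_sd"].mean())   # larger = less stable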

7 Number of Asian Universities in QS World University Rankings (2007 – 2010)

8 Asian Universities in QS-WUR Top 200 (By Country, 2007 – 2010)
Philip Altbach, director of the Centre for Higher Education at Boston College in the US, says several factors are behind the surges by Asian institutions:
1. These countries have invested heavily in higher education in recent years, and this is reflected in the improved quality of their top institutions.
2. They have also attempted to internationalise (globalise) their universities by hiring more faculty from overseas ... this helps to improve their visibility globally.
3. These universities have also stressed the importance of their professors publishing in international journals, which has no doubt increased the visibility of their research.

9 Asian Universities vs. American Universities Mean Rank Comparison (QS-WUR, 2007 to 2010)
Asian and non-Asian universities were grouped into four classes according to their 2009 QS-WUR rankings, namely Top 100, Top 101 to 200, Top 201 to 300, and Top 301 to 400. Whilst steady rises were observed in all four groups of Asian universities over the past 3 years (2007 to 2009), steady deteriorations were found in all four groups of American universities over the same period.

The rise of Asia is in direct contrast to the US' fortunes. This slide lends credence to the predictions of several international higher education experts that the US will soon lose (or has already lost) its international ascendancy. At an Organisation for Economic Co-operation and Development conference earlier this year, it was suggested that the US and the UK would be hit far harder than most countries by the need for future public spending cuts because both will need to reduce massive budget deficits, while a number of countries in Asia, including Japan and Korea, will face an easier ride. Delegates spoke of a resulting major "redistribution of brains".

According to Ben Sowter, the fallout caused by America's economic problems may ultimately result in its institutions sliding even lower in subsequent rankings. As 40 per cent of the overall ranking score is based on a survey of academics' opinions, the US' slip in 2009 may have more to do with the improvement in the reputation of Asian institutions brought about by better marketing and communication, he says. "In the six years of conducting this study, we have seen a drastically increased emphasis on international reputation from institutions in many countries, particularly those in Asia," he notes.

10 Hong Kong Institutions in QS World University Rankings (Top 400) in 2010

11 Comparative Analysis of Universities in Hong Kong (2010)

12 What’s the use of rankings?
Global Market Demand: International study trends show that worldwide demand for education is on the rise; higher education is becoming more global and competitive.
Global Market Shaping: University rankings shape the global market in higher education as much as (or more than) they describe it. By changing the rankings we alter global competition.
Global Market Value: Knowledge is the key driver of international competitiveness. Rankings raise global awareness of the institutions and universities being ranked.

13 Benefits of Rankings: Reputation and Visibility; Globalisation; Research and Academic Development; Strategic Planning

14 Reputation and Visibility
Strategic positioning and publicity: Those with a comparative advantage in the rankings can use their institutional position for publicity purposes – website, press releases, official presentations, international conferences and meetings, funding lobbying, etc.
Local reputation: Global rankings can help overcome bias or the traditional consensus of institutional reputation.
Global reputation: Global rankings can enhance global visibility and awareness and help establish an institutional brand, both nationally and internationally.

15 Cultural Globalisation Economic Globalisation
Cultural globalisation: Global rankings are a powerful weapon in the battle for talent. They can help attract quality foreign students and faculty members, enhancing campus diversity and students' international perspective. International students are clever consumers of global rankings: researchers (e.g. Hazelkorn, 2007) have indicated that international students tend to use rankings to short-list institutions, sometimes within an identified country.
Economic globalisation: In global knowledge economies, higher education institutions are important media for a variety of cross-border relationships and continuous global flows of people, information, knowledge, technologies, products and financial capital.

16 Research and Academic Development
International partnership and opportunity.
Research development: Rankings help attract international partners to undertake joint research and development projects of potential academic or commercial value, which can contribute to addressing critical global issues and concerns of high impact. International collaboration can help cultivate multicultural perspectives among faculty and encourage engagement with global issues to support academic excellence.
Academic development: Global rankings can help increase institutional awareness and provide opportunities to expand academic partnerships with other world-leading institutions, extending the quality, breadth and impact of a university's research and educational programmes.

17 Strategic Planning Internal Evaluation Benchmarking
Internal evaluation: Ranking criteria can help an institution focus on core areas of practice and encourage an evidence-based approach to quality improvement. Data-driven decision making can be based on institutional performance indicators, and strategy can then be aligned with those indicators to improve quality.
Benchmarking: Use ranking criteria to identify appropriate benchmarks in line with institutional aspirations. Benchmark against ‘best practice’ and learn from peer institutions to enhance global competitiveness.

18 Using Rankings to Improve Institutional Quality
Identify core focus areas: Ranking criteria help an institution focus on core areas of practice and encourage an evidence-based approach to quality improvement.
Strategic planning: Data-driven decision making based on institutional performance indicators; strategy can then be aligned with indicators to improve quality.
Funding lobbying: Rankings can be used to lobby government and funding bodies.

19 What’s the use of rankings? Examples from City University of Hong Kong
External benchmarking: Use ranking criteria to identify appropriate benchmarks in line with institutional aspirations; benchmark against ‘best practice’ and learn from peer institutions.
College/School and departmental level: Annual assessment based on quantitative performance indicators for learning and teaching, research, and knowledge transfer. Establish a panel of management and external experts to consider anomalous data or representations from departments. Strategy can then be developed to address issues of accountability and improve quality.

20 Performance Indicators
Composite indexes: Input Quality Index; Staffing and Resources Index; Output Quality Index.
Indicators: Average Entry A-Level Score; Average Entry English Score; % Self-financed Students; % International Students; % Faculty to Total Academic Staff; % Faculty with PhD or Professional Accreditation; Number of Students per Faculty; % International Faculty; % Outbound Exchange Students; % Students with Internship Experience; % Graduates with FT Employment (within 6 months of completion).
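The indicators above appear to feed the three composite indexes. The sketch below shows one way such indexes could be assembled in Python; the grouping, benchmarks and equal weighting are assumptions for illustration only, not CityU's actual formula.

# Hypothetical sketch: rolling raw departmental indicators up into the three
# composite indexes named on this slide. The grouping, benchmarks and equal
# weighting are illustrative assumptions, not the institution's actual method.
# (Indicators where lower is better, e.g. students per faculty, would need
# inverting and are omitted here for simplicity.)

INDEX_GROUPS = {
    "input_quality":      ["avg_entry_a_level", "avg_entry_english"],
    "staffing_resources": ["pct_faculty_of_staff", "pct_faculty_phd",
                           "pct_international_faculty"],
    "output_quality":     ["pct_outbound_exchange", "pct_internship",
                           "pct_ft_employment_6m"],
}

def composite_indexes(indicators, benchmarks):
    """Score each indicator against its benchmark (ratio capped at 1.0),
    then average within each group to give a 0-1 index."""
    result = {}
    for index_name, names in INDEX_GROUPS.items():
        scores = [min(indicators[n] / benchmarks[n], 1.0) for n in names]
        result[index_name] = sum(scores) / len(scores)
    return result

# Invented example: a department at 75% of benchmark on every indicator.
dept = {k: 75.0 for group in INDEX_GROUPS.values() for k in group}
bench = {k: 100.0 for group in INDEX_GROUPS.values() for k in group}
print(composite_indexes(dept, bench))  # every index = 0.75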

21 Staffing and Resources Index
Staircase Model: Threshold (one star); Towards Excellence (two stars); Excellence (three stars), assessed against the Input Quality Index, Staffing and Resources Index and Output Quality Index.
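The staircase model reads naturally as a set of thresholds on each composite index: reach the threshold band for one star, the towards-excellence band for two, the excellence band for three. A hedged sketch of that reading in Python; the cut-off values and the example department are invented.

# Hypothetical staircase mapping from a 0-1 composite index value to a star
# level. The three bands mirror the slide's labels; the cut-offs are invented.

STAIRCASE = [
    (0.85, 3, "Excellence"),
    (0.70, 2, "Towards Excellence"),
    (0.50, 1, "Threshold"),
]

def star_level(index_value):
    """Return (stars, label) for a composite index value in [0, 1]."""
    for cutoff, stars, label in STAIRCASE:
        if index_value >= cutoff:
            return stars, label
    return 0, "Below threshold"

# Example: rating the three indexes for a hypothetical department.
dept = {"input_quality": 0.62, "staffing_resources": 0.88, "output_quality": 0.74}
for name, value in dept.items():
    print(name, star_level(value))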

22 Example Growth Chart (Department X)

23 Example Growth Chart (Department Y)

24 Conclusion Time to Choose your Focus!
The notion of a ‘World Class University’ is becoming ever more important to governments, employers, investors, alumni, students and applicants. Rankings provide some comparative measure of an institution's global standing and act as a catalyst for further healthy competition. Try to identify which rankings might be used to bring about practical, positive strategic change that will benefit all stakeholders, not least the ultimate products of our endeavours: the quality of our graduates and our research output. Whilst rankings are necessarily imperfect and will always inspire debate, they are also currently inspiring and creating the opportunity for many Asian institutions to emerge from the long shadows cast by those in the West.

