Combining bibliometric and webometric information: Web ranking methodology under scrutiny
Isidro F. Aguillo

A ranking is a ranking; a ranking is not …
– A tool for the accreditation or assessment of Higher Education Institutions: the EUA makes in-depth analyses of EU universities, about 100 in ten years (expected to finish before the next century?), and U-Multirank's in-depth analysis of only 500 universities is not scheduled before 2014.
– A tool for summarizing scientific research performance: bibliometricians have proposed more than 300 indicators, but they were unable to prevent the success of the infamous Impact Factor and the mathematically unreliable h-index.
– A tool for adopting long-term, policy-oriented national strategies: the best strategies for improving in the ARWU ranking are wild mergers of universities and contracting short visits of Nobel laureates or highly cited researchers in exchange for affiliation.

Principia guiding the Ranking Web (Webometrics)
– Every Higher Education Institution (global coverage): many thousands of universities plus research institutions, including developing countries currently not covered in other rankings.
– All the missions (comprehensive): taking into account new media, social tools and the MOOCs revolution, the Third Mission (internationalization, knowledge & technology transfer, community engagement) and, of course, research too.
– End-user oriented (useful): indicators supporting policies of transparent governance, excellence in publication, open access to results, bottom-up content control, commitment to new virtual teaching environments …

REGARDING … Global coverage
Advantages:
– The only ranking that covers almost all HEIs.
– High correlation with other rankings, so Webometrics is especially useful (trusted?) for ranks beyond the 500th; a rank-correlation sketch follows this slide.
– Reliable description of full national higher education systems: a Top 500 may be enough for SG, FI or NL, but not even for JP or the UK.
Shortcomings:
– Institutional diversity is not taken into account (but DIY: do it yourself!): research-intensive vs. teaching-oriented; national vs. local; generalist vs. specialized; public vs. private.
– Heterogeneity of national systems is overlooked (DIY again?): funding (HE share of GDP and GDP per capita); centralized vs. autonomous; strong private (profit/non-profit) sectors.
– Efficiency (size-related) analysis is not performed (unfeasible?). But is it really needed for ranking purposes?
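A minimal sketch of how such a rank correlation could be checked, assuming two hypothetical rank lists restricted to the institutions both rankings share; all names and numbers below are illustrative, not real ranking data:

```python
from scipy.stats import spearmanr

# Hypothetical ranks for five institutions present in both rankings
# (illustrative values only, not real Webometrics or ARWU/QS data).
webometrics = {"Univ A": 12, "Univ B": 45, "Univ C": 130, "Univ D": 310, "Univ E": 480}
other_ranking = {"Univ A": 20, "Univ B": 38, "Univ C": 150, "Univ D": 290, "Univ E": 450}

shared = sorted(set(webometrics) & set(other_ranking))
rho, p_value = spearmanr([webometrics[u] for u in shared],
                         [other_ranking[u] for u in shared])
print(f"Spearman rho = {rho:.2f} (p = {p_value:.3f}) over {len(shared)} shared institutions")
```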

Percentage of institutions by country in the Top 2000, Ranking Web of Universities, January 2013 [chart]

REGARDING … Comprehensiveness
Sources:
– The current generation of surveys is completely inadequate for benchmarking purposes: data from reputational surveys are highly subjective and not well informed, and data provided by the universities themselves are probably biased (even false) and do not follow common standards.
Methods:
– A composite indicator is reliable: it aggregates different dimensions of university performance into a single overall indicator (a sketch of such a composite follows this slide).
– A theoretical weighting model is needed: an empirically tested bibliometric model strongly supports a 1:1 ratio between activity (production) and visibility (impact).
– A huge, diverse, representative population is available: the interlinked academic webspace (billions of links).
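A minimal sketch of such a composite, assuming each institution already has a normalized activity rank and an impact rank and applying the 1:1 weighting proposed for the Ranking Web; names and values are hypothetical:

```python
# Combine an activity (production) rank and an impact (visibility) rank
# with equal 1:1 weights. Ranks are hypothetical; lower is better.
ACTIVITY_WEIGHT = 0.5
IMPACT_WEIGHT = 0.5

institutions = {
    # name: (activity_rank, impact_rank) -- illustrative values only
    "Univ A": (15, 40),
    "Univ B": (60, 25),
    "Univ C": (120, 180),
}

composite = {
    name: ACTIVITY_WEIGHT * activity + IMPACT_WEIGHT * impact
    for name, (activity, impact) in institutions.items()
}

for name, score in sorted(composite.items(), key=lambda item: item[1]):
    print(f"{name}: combined rank score {score:.1f}")
```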

Proposed 1:1 ratio for the weighting model (activity vs. impact weights by ranking):

Ranking             Activity   Impact
ARWU (Shanghai)     40%        60%
QS                  30%        70%
THE                 35%        65%
NTU-HEEACT          20%        80%
WR (Webometrics)    50%        50%

But still problematic
Reliability:
– Google has biases derived from geolocation procedures (results are not the same in the different national/linguistic mirrors).
– The coverage of Google Scholar is far larger than that of other bibliometric sources (WoS, Scopus), but its quality control is not as strict.
Bad practices:
– Even the largest universities have no strategic web policies: having more than one central web domain or sharing web addresses strongly penalizes their ranks in Webometrics, and websites are organized haphazardly, without reflecting the full organization and hierarchy of the university.
– Strong Open Access mandates are needed: a large number of papers published in prestigious international journals are not collected in the institutional repository, and open Learning Management Systems and other teaching-support contents are uncommon.

REGARDING … Usefulness
How to improve in the Ranking Web:
– Adopt not only web policies: adopt the all-missions model, especially the Third Mission (technology and knowledge transfer, community engagement), internationalization and transparent governance.
– Adopt web policies: involve everybody in generating web contents, implement Open Access initiatives, and set up policies oriented to social tools.
How not to improve in the Ranking Web:
– Leaving the Computer Department in charge of the task.
– Unethical behavior regarding visibility (buying or exchanging links).
– Populating the repositories with empty records.

Sources for web contents

Why and how to measure research
World-class universities:
– The strongest criterion is excellence in research, following the model of US research-intensive universities; today only a small core group of universities in the world qualifies.
– Best strategies: first, stop publishing in low-quality local journals; second, publish in top international journals; and take leadership in international research cooperation projects.
Indicators (two of them are sketched in code after this slide):
– Number of highly cited researchers.
– Papers in prestigious databases (WoS, Scopus, GS), in total or per faculty member.
– Number of papers in top (first-quartile) journals.
– University h-index (or an aggregate of the h-indexes of its faculty members).
– Number of highly cited (top 10%, 5%, 1%) papers.
– Number of citations (in total or per faculty member).
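A minimal sketch of two of these indicators, the h-index and the share of highly cited papers, computed from a hypothetical list of per-paper citation counts; the data and the 20-citation threshold are purely illustrative, since real cutoffs come from field and world baselines:

```python
def h_index(citations):
    """h-index: the largest h such that h papers have at least h citations each."""
    ranked = sorted(citations, reverse=True)
    return sum(1 for position, count in enumerate(ranked, start=1) if count >= position)

def highly_cited_share(citations, threshold):
    """Fraction of papers at or above a citation threshold (e.g. a top-10% cutoff)."""
    return sum(1 for count in citations if count >= threshold) / len(citations)

# Illustrative citation counts for one institution's papers (not real data).
paper_citations = [120, 45, 33, 20, 18, 9, 7, 4, 2, 0]

print("h-index:", h_index(paper_citations))                                    # 7
print("share with >= 20 citations:", highly_cited_share(paper_citations, 20))  # 0.4
```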

Measuring research, comparative analysis: Excellence in the Ranking Web [chart]

FAQ: Justification and explanation
Size:
– A tool that takes size into account is not producing a ranking.
More variables:
– Most of the variables are strongly correlated and probably superfluous (see the correlation sketch after this slide).
– Data for many of them are unreliable or unfeasible to obtain.
– The weighting model becomes arbitrary when there are too many variables.
Quality evaluation:
– Collecting opinions from large populations is probably the only option.
– But reputational surveys request information from highly subjective respondents without true international, multidisciplinary knowledge.
– And bibliographic citations provide small, biased samples focusing on only one mission.
Direct sourcing:
– Distrust the data provided by the universities themselves.
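A minimal sketch of how redundancy among candidate variables could be checked, assuming a small table of hypothetical indicator values; the column names and figures are invented for illustration:

```python
import pandas as pd

# Hypothetical indicator values for five institutions (not real data),
# used to check whether candidate ranking variables are largely redundant.
indicators = pd.DataFrame({
    "papers":       [1200, 800, 450, 300, 150],
    "citations":    [30000, 18000, 9000, 5500, 2100],
    "h_index":      [85, 60, 42, 35, 20],
    "web_presence": [95000, 40000, 52000, 12000, 8000],
})

# High pairwise rank correlations suggest two variables carry much the
# same information, so one of them is probably superfluous.
print(indicators.corr(method="spearman").round(2))
```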

FAQ II: Justification and explanation
Teaching evaluation:
– For a global analysis there is no direct way to make a comparative evaluation.
– Indirect proxies, such as measuring individual commitment, are the only feasible option for a ranking.
– The student/faculty ratio is difficult to obtain (there are no international standard definitions), and it is meaningless when differences are small (decimals!).
Internationalization criteria:
– Many factors are involved in student mobility; perhaps only transcontinental mobility is really important.
– Mobility of second-class academics who could not obtain positions in their national institutions should be discarded.
Employability:
– Mostly anecdotal information without any real value.
Bibliometrics:
– Leave it to the true experts!

Relevant facts about the Ranking Web
Authority, purpose, coverage, objectivity, accuracy:
– The Cybermetrics Lab is a research group belonging to the largest public governmental (non-profit) research body in Spain (CSIC).
It ranks universities, not websites:
– Popularity (number of visits or visitors) is not taken into account, and web design (usability) is irrelevant for the ranking.
– It focuses on the weakest link of the university: lack of commitment to web publication means bad governance (and services), missed globalization opportunities, reluctance to open peer review, ignored e-learning possibilities, reduced recruitment capabilities, … and no chance of being labeled a World-class University.
The ranks are based on current, not old, data:
– There are two editions per year (January and July), allowing easy monitoring and fast identification and resolution of problems.
Ethical policies are strongly endorsed:
– Unethical behavior means exclusion from the Ranking.

Ranking Web: main January 2013 results

A note about the future
The Web Ranking as a research project:
– Stability to allow inter-year comparisons is not a priority.
– Scientific analyses are under way to improve the ranking, mainly by adding new data sources; strong candidates are value-added social tools (Mendeley, Facebook, YouTube, SlideShare, Wikipedia, Twitter, Academia.edu, …).
Web data:
– The Cybermetrics Lab is providing, or going to provide, academic web data as a primary source for building rankings to: the Scimago Institutions Rankings, the QS University Rankings (Latin America), the U21 Rankings of Systems of Higher Education, and U-Multirank (through the ETER project).
– Cooperation with national public/non-profit rankings is open on request.

Open forum
Questions? … Thank you!
Isidro F. Aguillo, HonDr
The Cybermetrics Lab, CSIC