UNICA Core Group on Evaluation in Research and Higher Education, 29 October 2010, Vienna

Presentation transcript:

I. Context
- Data "mysteriously" aggregated
- Multidimensionality of research excellence neglected
- Unreliable data, etc.
- Current rankings are not very useful for benchmarking or for developing research within a single university

II. Objectives
- To provide the UNICA network with a toolbox of research evaluation methodologies adapted to the different "cases" identified as interesting by UNICA members
- To pool resources in learning from each other about best practices in the context of benchmarking activities
- To develop a methodology accepted by UNICA members, based on expertise sharing, complementarities and harmonisation of databases

III. Working Group
- 10 universities
- 4 meetings
- [Figure: map of Europe showing the member universities and their current rankings]

IV. Methodology (1)
1. To list the possible objects to evaluate: projects; researchers; teams, faculties, centres; …
2. To list the possible goals of an evaluation: typology; ranking; SWOT analysis; …
3. To list the possible users of an evaluation: rectors; researchers; governments, funding agencies; media, public; …
One case is defined by a combination of answers to each of the three questions.
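Purely as an illustration, and not part of the original slides: a "case" in this sense is one element of the Cartesian product of the three lists. A minimal Python sketch, with the list entries paraphrased from the slide and the variable names invented for the example:

```python
from itertools import product

# Lists paraphrased from the slide; the real lists are open-ended ("...").
objects = ["projects", "researchers", "teams/faculties/centres"]
goals = ["typology", "ranking", "SWOT analysis"]
users = ["rectors", "researchers", "governments/funding agencies", "media/public"]

# One "case" is defined by a combination of answers to the three questions.
cases = list(product(objects, goals, users))
print(len(cases))   # 3 * 3 * 4 = 36 combinations with these example lists
print(cases[0])     # ('projects', 'typology', 'rectors')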

IV. Methodology (2): Focus on Case 2

Case | Object | Goal | Users
1 | Faculties | Strong and weak points | Rectors
2 | Groups/departments (smaller units than faculties) | Strong and weak points | Rectors
3 | Teams, groups, faculties | Strong and weak points | Public

IV. Methodology (3)
- Agreement to limit the number of indicators per dimension
- Dimensions: what we want to take into account; indicators: how we want to take it into account
- First two fields covered: economics/business and physics
- 5 dimensions and 20 indicators:
  D1: Scientific output (7 indicators)
  D2: Attractiveness/internationalisation (5 indicators)
  D3: Research training (3 indicators)
  D4: Transfer of technology (5 indicators)
  D5: Links to education (indicators still to be selected)
- Plus basic data on the university (global budget, staff, etc.)
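A hypothetical sketch, not taken from the slides, of how the agreed dimensions and their indicator counts could be recorded during data collection; the dictionary layout itself is an assumption:

```python
# Dimensions agreed by the working group, with the number of indicators per
# dimension as given on the slide; the data structure is illustrative only.
dimensions = {
    "D1 Scientific output": 7,
    "D2 Attractiveness/internationalisation": 5,
    "D3 Research training": 3,
    "D4 Transfer of technology": 5,
    "D5 Links to education": None,  # indicators still to be selected
}

defined = [n for n in dimensions.values() if n is not None]
print(sum(defined))  # 20 indicators selected so far (D1-D4)
```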

IV. Methodology (4)
Dimension 1: Scientific output
1. Publications in Scopus, WoS
2. Peer-reviewed book chapters
3. Peer-reviewed books
4. Number of citations in databases
5. Number of patents
6. Number of ERC grants
7. Number of highly cited scientists
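As a loose illustration only, indicators 1 and 4 might be tallied from a local publication record set roughly as follows; the record fields and values are invented for the example and are not the working group's schema:

```python
# Invented example records; field names are assumptions.
records = [
    {"indexed_in": ["Scopus", "WoS"], "citations": 12},
    {"indexed_in": ["Scopus"], "citations": 3},
    {"indexed_in": [], "citations": 0},  # not indexed, so excluded below
]

publications = sum(1 for r in records if r["indexed_in"])             # indicator 1
citations = sum(r["citations"] for r in records if r["indexed_in"])   # indicator 4
print(publications, citations)  # 2 15
```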

V. What has been done?
- Agreement on dimensions and indicators
- Data collection from several members of the group
- Analysis of difficulties

VI. Challenges and next steps
- To get data from all members
- Analysis and graphical representation
- Need for common definitions of indicators
- Try to use existing data
- To have a meeting with UNICA members involved in Multirank