Slide 1: Trends in Research Assessment and Higher Education, and their Impact on TS
Gyde Hansen, Copenhagen Business School (CBS), 2008
Slide 2: Criteria: assessment of departments
- International board memberships
- Expert referee jobs
- Invitations from institutions (from abroad)
- Organization of international conferences
- Editorships of international journals
- Reviewing international papers and abstracts
- Having invited guests from other universities
Slide 3: Criteria: World Class Research Environments (WCRE)
- Publication in highly ranked journals / with highly ranked publishers
- Enhancement of international cooperation and of CBS' reputation
- International competition
- Recruitment of doctoral students
- Cooperation with business partners
- Attraction of external funding
Slide 4: European quality assurance: mission statements/goals
- Bologna Process: European co-operation on quality – comparable criteria and methodologies – mobility of students and teachers
- EUA (European University Association): a coherent system of education and research at the European level, e.g. doctoral programmes
- ENQA (European Association for Quality Assurance in Higher Education): cooperation on quality, sharing experience, methods, standards – improvement – effective systems – accreditation
- EQAR (European Quality Assurance Register for Higher Education): publicly accessible ranking system – quality – student mobility – transparency and trust
Slide 5: Interrelated changes influencing TS
[Diagram: TS at the centre, surrounded by interrelated changes]
- Globalization – internationalization – market forces – branding
- Cooperation – competition (international; institutional? tradition?)
- Management (in detail) – control
- HE: ranking of departments, institutions and journals
- Academic freedom? Creativity?
- Quality assurance – evaluation
- External funding – money
- Research? Knowledge? Bildung? Languages etc.
Slide 6: Four areas of change
[Diagram: TS surrounded by globalization – internationalization – ranking – branding – management – control]
1. Cooperation – competition within departments: social climate?
2. Cooperation – competition internationally: doctoral schools, summer schools?
3. Ranking of universities: who assesses? how? with what consequences?
4. Ranking of journals and publishing houses: dominance of English?
Slide 7: 1. Cooperation – competition: departments
Questions/problems:
a. What is the expected outcome of the evaluations?
b. What is the impact of the ranking of colleagues on the social climate at the department? Division of labour? Top researchers often work on applications or funded research projects, leaving little time for teaching and administration.
Slide 8: 1.1 Expected outcome of the evaluations of departments (CBS 1999–2008)
- Increased attention to research; dynamic environments
- Understanding the necessity (and legitimacy) of discussing one's own and colleagues' research
- Action plans, perhaps reorganization of departments
- Quality development
Did they get it?
Slide 9: 1.2 Social climate at split departments?
Cooperation? Discussing each other's research? Competition? Hierarchies? Bitterness and isolation? Lack of influence and lack of democracy?
Slide 10: 2. International doctoral schools
Questions/problems:
a. How do we prevent, in our small field, all research / assessment / evaluation being done in the same way? It seems to be the same international trainers at the doctoral schools/summer schools.
b. What will happen to variety?
Slide 11: 3. Ranking of institutions and departments
(See some of the criteria on slides 2 and 3 above.)
- In future, part of the basic funding will be allocated according to "research quality"; Danish universities are expected to compete.
- The criteria are "performance goals".
- Research indicators are taken mostly from bibliometrics / the ranking of publications.
Slide 12: 3.1 Quality indicators: their reliability, validity, comparability?
Indicators and their weighting differ considerably:
- Exchange students (number of international students)
- Visiting professors (number of …)
- Student–teacher ratio (at some universities this is irrelevant)
- Attracting external funding (at some places this is irrelevant)
- Graduates' employment rate?
- …
Slide 13: 3.2 Ranking of universities with regard to TS
Questions/problems:
a. What are high standards in our field?
b. What are the selection indicators? Paradigms?
c. How are the indicators weighted?
d. How are validity, reliability and objectivity guaranteed?
e. How can "political" interests be prevented?
Slide 14: 4. Ranking of journals/publishers
- The same criteria apply in all fields of research.
- Two levels, in order not to make it too complicated.
- 20% of all journals or publications in the world are expected to be ranked at the highest level. (The average number of articles in about 19,000 journals will be counted.)
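The counting rule on this slide (a two-level system in which the top level should cover roughly 20% of all articles) can be sketched as follows. This is an illustrative reconstruction only, not the official ranking algorithm; the journal names, scores and article counts are invented for the example.

```python
# Illustrative sketch of a two-level journal ranking where the top level is
# meant to cover roughly 20% of all articles. All data below is hypothetical.

def split_two_levels(journals, top_share=0.20):
    """journals: list of (name, score, articles_per_year) tuples.
    Assign the highest-scoring journals to the top level until they
    account for about `top_share` of all articles; the rest stay level 1."""
    total_articles = sum(articles for _, _, articles in journals)
    ranked = sorted(journals, key=lambda j: j[1], reverse=True)
    top_level, covered = [], 0
    for name, _score, articles in ranked:
        if covered >= top_share * total_articles:
            break  # the top level already covers ~20% of all articles
        top_level.append(name)
        covered += articles
    level1 = [name for name, _, _ in journals if name not in top_level]
    return top_level, level1

# Hypothetical journals: (name, quality score, articles per year)
journals = [
    ("Journal A", 9.1, 40),
    ("Journal B", 7.4, 60),
    ("Journal C", 5.0, 30),
    ("Journal D", 3.2, 300),
]
top, rest = split_two_levels(journals)
print(top)   # ['Journal A', 'Journal B']
print(rest)  # ['Journal C', 'Journal D']
```

Note how the article-share threshold, not a fixed journal count, decides the cut: two small, high-scoring journals can fill the top level while a large, lower-scoring journal stays at level 1.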
Slide 15: 4.1 Criteria
Few criteria: consensus among researchers in the field that the journal
1. is absolutely "leading",
2. publishes the most important articles in the research field,
3. from researchers from different countries.
Supported by a Norwegian ranking list (dbh.nsd.uib.no/kanaler/) and by ISI data, Ulrich's, the Digital Article Database Service (DADS) …
NB: two September issues on this website by Daniel Gile.
Slide 16: 4.2 Ranking of journals and publishers, and the consequences
Questions/problems:
a. What does ranking mean for TS?
b. What impact does the ranking have on journals not at the top level?
c. Will only a few top researchers, reviewers or colleagues on the editorial boards decide on the ranking?
d. What does this do to variety (again) and to innovation? Will we write what publishers/editors/reviewers like and are interested in?
Slide 17: 4.3 Ranking of journals and publishers, and the consequences – money!
Questions/problems:
e. "Citedness" – is that an indicator of quality? (See Gile's September 2008 issues.)
f. What will happen to research written in languages other than English? Not many French, Portuguese, German or Spanish papers are cited.
g. What can be done so that what is written in languages other than English is not forgotten? If bibliometrics counts for so much in the future budgets of TS departments, this may be an important question.
Slide 18: Who assesses? How? What indicators? What paradigms?
Effects on academic freedom, on science and on TS:
- National research careers will depend on citation…
- Much power goes to the top 20% of journals/publishers.
- Friends will cite and promote friends…?
- Articles will become shorter…?
- Not the papers' content but league tables will be in focus…?
- Reading is not necessary – just a look at the ranking…?
- A few people will compile the league tables for TS?
- How will the (hopefully responsible) colleagues handle the gap between quantitative ranking and quality?
Slide 19: The usefulness of quality assurance – evaluation – ranking – control?
"Pigs don't get fat from being weighed." (Danish: Grise bliver ikke fede af at blive vejet.)
Does this also hold for research? For departments, institutions and journals? For TS?