RAE Briefing by Keith van Rijsbergen at CPHC on 19th April, 2007
RAE 2001/2008
- Two-tier panel structure
- Quality profiles
- Outputs, not individual researchers, are rated
- Fairness continues
Outputs / individuals
An underpinning principle is that sub-panels should assess each submission in the round: they will not make collective judgements about the contributions of individual researchers, but about a range of indicators relating to the unit, research group or department that is put forward for assessment. (Generic: §21)
Size (2001)
                 No. of submissions   Cross-refs
Pure Maths       47                   3
Applied Maths    58                   3
Stats            46                   2
CS               80                   6
Timeline
- June 2007: Consider survey of intentions...
- January 2008: Allocate reading responsibilities
- March 2008: Identify cross-refs & external advisers
- May 2008: Audit cross-ref advice
- June/July 2008: First assessment meeting
- September 2008: Continue assessment
- Oct/Nov 2008: Final assessment meeting
- November 2008: Submit profiles to RAE
- December 2008: Results published
CAVEATS
The published document RAE 01/2006 is the authoritative version. The content was arrived at by collective discussion and debate. The document contains compromises, made to achieve some compatibility across the sub-panels in Main Panel F and to satisfy the lawyers in HEFCE+.
Main Panel F (Chair: Nigel Hitchin)
- Pure Mathematics (20): Ken Brown
- Applied Mathematics (21): Tim Pedley
- Stats and OR (22): Bernard Silverman
- CS and Informatics*: Keith van Rijsbergen
* To be referred to as CS henceforth
International Members
Who is on our sub-panel?
Quality Levels
- 4*: Quality that is world-leading in terms of originality, significance and rigour.
- 3*: Quality that is internationally excellent in terms of originality, significance and rigour, but which nonetheless falls short of the highest standards of excellence.
- 2*: Quality that is recognised internationally in terms of originality, significance and rigour.
- 1*: Quality that is recognised nationally in terms of originality, significance and rigour.
- u/c: Quality that falls below the standard of nationally recognised work, or work which does not meet the published definition of research for the purposes of this assessment.
Excellence
In assessing excellence, the sub-panels will look for originality, innovation, significance, depth, rigour, influence on the discipline and wider fields, and relevance to users. (Main: §15)
CS: 4* To be awarded a 4* quality level a research output must exhibit high levels of originality, innovation and depth, and must have had, or in the view of the sub-panel be likely to have, a significant impact on the development of its field. (CS: §66)
Generic: Section 20
These descriptive accounts should be read alongside, but do not replace, the standard definitions.
Outputs
All forms, including but not necessarily limited to: books; chapters in books; articles in journals; conference contributions; creative media and multimedia; standards documents; patents, product or process specifications; items of software; software manuals; technical reports, including consultancy reports or independent evaluations. All forms will be given equal consideration. (CS: §12)
Cross-referrals & External Advice
Advice from other sub-panels will be sought and given on the basis of the assessment criteria for the UOA to which the work was originally submitted. (Generic: §54)
The advice of external advisers will be used to inform the sub-panels' assessment. (Main: §27)
'Reading'
As far as possible, the sub-panel will examine all research outputs, and expects to examine in detail at least 25% of the outputs in each submission. The sub-panel will use its professional judgement to select a subset of outputs to examine in detail. (CS: §17)
Research Environment
- Infrastructure, facilities and administrative support for research
- Arrangements for developing and supporting staff in research
- Cumulative impact of research
- Industrial collaboration, relationship with research users, contribution to public awareness and understanding
Research Environment
- Academic collaboration, national and international, within discipline and interdisciplinary
- Research degrees awarded
- Research income: funding strategy, amount received and sustainability
- Credibility, vitality and sustainability of research organisation
(CS: §27)
Esteem Indicators
- Awards, fellowships of learned societies, prizes, honours and named lectures
- Personal research awards and fellowships
- Keynote and plenary addresses at conferences
- Etc., etc.; see CS: §43
Citations
- No panel will use journal impact factors as a proxy measure for assessing quality. (Generic: §32)
- They will not use a formal ranked list of outlets, nor impact factors, nor will they use citation indices in a formulaic way. (Main: §15) (CS: §19)
- But... in reaching these judgements the assessor may consider: citations of the output, relative to others of similar age in the same field. (CS: §63)
Quality Profile
By percentage of research activity in the submission judged to reach each quality standard.

Sub-Panel x     Cat A FTE staff submitted    4*   3*   2*   1*   Unclassified
University A    50                           15   25   40   15   5
University B    20                           0    5    40   45   10
The Overall Quality Profile
The overall quality profile is comprised of the aggregate of the weighted profiles produced for outputs, research environment and esteem indicators. (Annex 1)

Quality level (% of research activity)   4*   3*   2*   1*   u/c   Weight
Research Outputs                         20   30   15   20   15    70%
Research Environment                     30   25   10   20   15    20%
Esteem Indicators                        10   25   40   15   10    10%
Overall Quality Profile                  15   25   30   20   10
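The Annex 1 aggregation lends itself to a short sketch. The function below computes a weighted overall profile from the three sub-profiles; the weights (70/20/10) and example figures are the illustrative ones on this slide, and the function itself is an assumed reconstruction for illustration, not the official RAE calculation.

```python
# Sketch of the Annex 1 aggregation: the overall quality profile is the
# weighted sum of the outputs, environment and esteem sub-profiles.
# Weights and example profiles are the illustrative slide figures; this
# is a reconstruction, not official RAE code.

LEVELS = ["4*", "3*", "2*", "1*", "u/c"]

def overall_profile(profiles, weights):
    """Weighted aggregate of per-component quality profiles.

    profiles: dict mapping component name -> list of five percentages
    weights:  dict mapping component name -> fractional weight (sums to 1)
    """
    return [
        round(sum(weights[c] * profiles[c][i] for c in profiles), 1)
        for i in range(len(LEVELS))
    ]

profiles = {
    "outputs": [20, 30, 15, 20, 15],      # weighted at 70%
    "environment": [30, 25, 10, 20, 15],  # weighted at 20%
    "esteem": [10, 25, 40, 15, 10],       # weighted at 10%
}
weights = {"outputs": 0.7, "environment": 0.2, "esteem": 0.1}

print(dict(zip(LEVELS, overall_profile(profiles, weights))))
```

Since each sub-profile sums to 100% and the weights sum to 1, the aggregate also sums to 100%; the published overall figures are then rounded to whole percentages.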
Some Do's and Don'ts
- Do read the criteria and working methods carefully
- Do submit early-career researchers
- Don't provide information on the detailed organisation and management structure of the department
(CS: §33)
Take-home Message
- Continue to produce high-quality research, papers, etc.
- Continue to enhance the research environment, e.g. grants, PhDs, etc.
- Continue to attract awards, prizes, keynotes, etc.
All the stuff we normally do.
POSTMORTEM
Budget 2006 Selected Highlights
Some Highlights
Maximising the impact of science funding:
- Wider remit for the Technology Strategy Board to stimulate business innovation in areas which boost UK growth and productivity
- Seeking views on:
  - how best to support higher-risk, high-impact research in novel fields of scientific enquiry
  - how national & regional policies can increase business-university collaboration in the regions
  - how to encourage a wider spectrum of business-university interactions
- Large Facilities Research Council + reorganise research funding for physical sciences?
- MRC and NHS R&D funds to be combined?
- Improve the supply of scientists:
  - more teachers in secondary STEM subjects (CS is not mentioned, but Maths is)
  - targets for increased A-level enrolments in Physics, Chemistry & Maths
  - more pupils taking 3 sciences to GCSE
- R&D tax credit for companies of up to 500 employees
RAE Preamble
- "the Government is keen to ensure that excellent research of all types is rewarded, including user-focussed and inter-disciplinary research"
- "also wants to ensure that institutions continue to have the freedom to set strategic priorities for research, undertake blue skies research and respond quickly to emerging priorities and new fields of enquiry"
- "the Government is strongly committed to the dual support system and to rewarding research excellence, but recognises some of the burdens imposed by the RAE"
- "firm presumption is that after the 2008 RAE the system for assessing research quality and allocating 'quality rated' (QR) funding from the DfES will be mainly metric-based."
- "In May 2006, the Government will launch a consultation on its preferred option for a metrics-based system for assessing research quality and allocating QR funding, publishing results in time for the 2006 Pre-Budget Report"
RAE Alternative
- "aware that preparations for the 2008 RAE are well underway"
- "presumption that the 2008 RAE should go ahead, incorporating a shadow metrics exercise alongside the traditional panel-based peer review system"
- "However, if an alternative system is agreed and widely supported, and a clear majority of UK Universities were to favour an earlier move to a simpler system, the Government would be willing to consider that"
Why RAE?
- Is it now just to validate the metrics?
- What metrics? Do we get to invent them?
- Supporting documents suggest this is already well on the way to being decided.
Science and Innovation Investment Framework 2004-2014: Next Steps
SIIF '04-'14: "metrics collected as part of the next assessment will be used to undertake an exercise shadowing the 2008 RAE itself, to provide a benchmark of the information value of the metrics as compared to the full peer review process."
Principles:
- Fund excellent research of all types
- Preserve dual support
- Minimise burden, sufficient to support a fair distribution of funds
- Processes should: be open, apply equally, be simple, transparent & cost-effective, fund institutions not individuals, allow institutions to plan effectively
Favours a simpler system based on one or more metrics.
Options 1: Research Council Grant Income
- At the institutional level, the correlation between QR and Research Council income is 0.98, with no variation across years
- Major differences at the departmental level balance out at the institutional level
Options 2: Total Research Income
- Including charities, industry, the EU and Government departments
- Average correlation 0.98
- Varies between 0.97 and 0.99
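For context, the 0.98 figure quoted here is a Pearson correlation computed at institutional level. The sketch below shows how such a coefficient is computed; the funding and income figures are invented for five hypothetical institutions, purely for illustration, and are not HEFCE's data.

```python
import math

# Invented, illustrative figures (in GBP millions) for five hypothetical
# institutions; not real QR or research-income data.
qr_funding = [12.0, 45.0, 7.5, 88.0, 23.0]
research_income = [10.0, 50.0, 9.0, 80.0, 25.0]

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

print(f"r = {pearson(qr_funding, research_income):.2f}")
```

With strongly related series like these, r comes out close to 1, mirroring the institution-level pattern quoted above; at departmental level the same calculation can look very different, which is the point made in Options 1.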
Caveats 1
"The government is also aware that while the correlation between research income and QR is close when measured at an institutional level, this is largely driven by science, engineering and medicine. It is therefore not clear that a metric based on research income would fairly support excellent research in the arts and humanities and some other subjects, such as mathematics. It might therefore be the case that other options would need to be explored for these subjects."
Caveats 2
"Alongside running a mainly metrics-based system, the Government will also explore the option of continuing to convene expert panels to provide an extra level of verification for the results generated by metrics. The panels would not be expected to hold their own information-gathering exercise."