RAE Briefing by Keith van Rijsbergen at CPHC on 19th April, 2007.


RAE 2001/2008
- Two-tier panel structure
- Quality profiles
- Outputs, not individual researchers, are rated
- Fairness continues

Outputs vs. Individuals
An underpinning principle is that sub-panels should assess each submission in the round: they will not make collective judgements about the contributions of individual researchers, but about a range of indicators relating to the unit, research group or department that is put forward for assessment. (Generic: §21)

Size (2001)
                  No. of submissions   Cross-refs
Pure Maths        47                   3
Applied Maths     58                   3
Stats             46                   2
CS                80                   6

Timeline
- June 2007: Consider survey of intentions...
- January 2008: Allocate reading responsibilities
- March 2008: Identify cross-refs & external advisers
- May 2008: Audit cross-ref advice
- June/July 2008: First assessment meeting
- September 2008: Continue assessment
- Oct/Nov 2008: Final assessment meeting
- November 2008: Submit profiles to RAE
- December 2008: Results published

CAVEATS
The published document RAE 01/2006 is the authoritative version. The content was arrived at by collective discussion and debate. The document contains compromises, both to achieve some compatibility across the sub-panels in Panel F and to satisfy the lawyers in HEFCE.

Main Panel F (Chair: Nigel Hitchin)
- Pure Mathematics (20): Ken Brown
- Applied Mathematics (21): Tim Pedley
- Stats and OR (22): Bernard Silverman
- CS and Informatics (23)*: Keith van Rijsbergen
* To be referred to as CS henceforth

International Members

Who is on our sub-panel?

Quality Levels
- 4*: Quality that is world-leading in terms of originality, significance and rigour.
- 3*: Quality that is internationally excellent in terms of originality, significance and rigour but which nonetheless falls short of the highest standards of excellence.
- 2*: Quality that is recognised internationally in terms of originality, significance and rigour.
- 1*: Quality that is recognised nationally in terms of originality, significance and rigour.
- u/c: Quality that falls below the standard of nationally recognised work, or work which does not meet the published definition of research for the purposes of this assessment.

Excellence
In assessing excellence, the sub-panels will look for originality, innovation, significance, depth, rigour, influence on the discipline and wider fields, and relevance to users. (Main: §15)

CS: 4*
To be awarded a 4* quality level, a research output must exhibit high levels of originality, innovation and depth, and must have had, or in the view of the sub-panel be likely to have, a significant impact on the development of its field. (CS: §66)

Generic: Section 20
These descriptive accounts should be read alongside, but do not replace, the standard definitions.

Outputs
All forms... including, but not necessarily limited to: books, chapters in books, articles in journals, conference contributions; creative media and multimedia; standards documents; patents, product or process specifications, items of software, software manuals; technical reports, including consultancy reports or independent evaluations. All forms will be given equal consideration. (CS: §12)

Cross-referrals & External Advice
Advice from other sub-panels will be sought and given on the basis of the assessment criteria for the UOA to which the work was originally submitted. (Generic: §54)
The advice of external advisers will be used to inform the sub-panels’ assessment. (Main: §27)

‘Reading’
As far as possible, the sub-panel will examine all research outputs, and expects to examine in detail at least 25% of the outputs in each submission. The sub-panel will use its professional judgement to select a subset of outputs to examine in detail. (CS: §17)

Research Environment
- Infrastructure, facilities and administrative support for research
- Arrangements for developing and supporting staff in research
- Cumulative impact of research
- Industrial collaboration, relationship with research users, contribution to public awareness and understanding

Research Environment
- Academic collaboration, national and international, within discipline and interdisciplinary
- Research degrees awarded
- Research income: funding strategy, amount received and sustainability
- Credibility, vitality and sustainability of research organisation
(CS: §27)

Esteem Indicators
- Awards, fellowships of learned societies, prizes, honours and named lectures
- Personal research awards and fellowships
- Keynote and plenary addresses at conferences
- etc. etc.; see CS: §43

Citations
No panel will use journal impact factors as a proxy measure for assessing quality. (Generic: §32)
They will not use a formal ranked list of outlets, nor impact factors, nor will they use citation indices in a formulaic way. (Main: §15) (CS: §19)
But... in reaching these judgements the assessor may consider: citations of the output, relative to others of similar age in the same field. (CS: §63)
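Purely as an illustration (the panels explicitly rule out formulaic use of citation indices), here is a minimal sketch of what "citations of the output, relative to others of similar age in the same field" could mean as a number; the peer counts and the mean-based normalisation below are assumptions, not the panel's method:

```python
# Illustration only: the panels rule out using citation indices in a
# formulaic way. This sketch shows one plausible numeric reading of
# "citations relative to others of similar age in the same field".
# All figures are invented.

from statistics import mean

# Hypothetical citation counts for peer outputs of similar age, same field.
peer_citations = [3, 7, 12, 5, 9, 16, 4, 8]  # mean = 8.0

def relative_citations(output_citations: int, peers: list[int]) -> float:
    """Ratio of an output's citations to the mean for comparable peers.

    A value of 1.0 means exactly the field average for outputs of that age.
    """
    return output_citations / mean(peers)

print(relative_citations(12, peer_citations))  # 1.5: 50% above field average
```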

Quality Profile
By percentage of research activity in the submission judged to reach each quality standard.

Sub-Panel x       Cat A FTE staff submitted    4*    3*    2*    1*    Unclassified
University A
University B

The Overall Quality Profile
[Diagram: separate quality profiles for Research Outputs, Research Environment and Esteem Indicators, each graded over 4*/3*/2*/1*/u/c, feed into a single overall profile of quality level against % of research activity.]
The overall quality profile is comprised of the aggregate of the weighted profiles produced for outputs, research environment and esteem indicators. (Annex 1)
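To make the aggregation concrete, here is a minimal sketch assuming illustrative sub-profiles and component weights; the real weightings are set out in the published panel criteria and are not recoverable from this slide:

```python
# A minimal sketch of forming the overall quality profile as a weighted
# aggregate of the outputs, environment and esteem sub-profiles.
# All numbers below are assumptions for illustration only.

LEVELS = ["4*", "3*", "2*", "1*", "u/c"]

# Hypothetical sub-profiles: % of research activity at each quality level.
outputs     = {"4*": 15, "3*": 40, "2*": 30, "1*": 10, "u/c": 5}
environment = {"4*": 20, "3*": 50, "2*": 20, "1*": 10, "u/c": 0}
esteem      = {"4*": 10, "3*": 30, "2*": 40, "1*": 20, "u/c": 0}

# Assumed component weights; they must sum to 1.0.
W_OUT, W_ENV, W_EST = 0.70, 0.20, 0.10

overall = {
    lvl: round(W_OUT * outputs[lvl] + W_ENV * environment[lvl] + W_EST * esteem[lvl], 1)
    for lvl in LEVELS
}
print(overall)
# {'4*': 15.5, '3*': 41.0, '2*': 29.0, '1*': 11.0, 'u/c': 3.5} -- sums to 100
```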

Some Do’s and Don’ts
- Do read the criteria and working methods carefully
- Do submit early career researchers
- Don’t provide info on the detailed organisation and management structure of the department
(CS: §33)

Take-home Message
- Continue to produce high-quality research, papers, etc.
- Continue to enhance the research environment, e.g. grants, PhDs, etc.
- Continue to attract awards, prizes, keynotes, etc.
All the stuff we normally do.

POSTMORTEM

Budget 2006: Selected Highlights

Some Highlights
Maximising the Impact of Science Funding:
- Wider remit for the Technology Strategy Board to stimulate business innovation in areas which boost UK growth and productivity
- Seeking views on:
  - how best to support higher-risk, high-impact research in novel fields of scientific enquiry
  - how national & regional policies can increase business-university collaboration in the regions
  - how to encourage a wider spectrum of business-university interactions
- Large Facilities Research Council + reorganise research funding for physical sciences?
- MRC and NHS R&D funds to be combined?
- Improve the supply of scientists:
  - More teachers in secondary STEM subjects (CS is not mentioned, but Maths is)
  - Targets for increased A-level enrolments in Physics, Chemistry & Maths
  - More pupils taking 3 sciences to GCSE
- R&D tax credit for companies up to 500 employees

RAE Preamble
- "the Government is keen to ensure that excellent research of all types is rewarded, including user-focussed and inter-disciplinary research"
- "also wants to ensure that institutions continue to have the freedom to set strategic priorities for research, undertake blue skies research and respond quickly to emerging priorities and new fields of enquiry"
- "the Government is strongly committed to the dual support system and to rewarding research excellence, but recognises some of the burdens imposed by the RAE"
- "firm presumption is that after the 2008 RAE the system for assessing research quality and allocating ‘quality-related’ (QR) funding from the DfES will be mainly metric-based."
- "In May 2006, the Government will launch a consultation on its preferred option for a metrics-based system for assessing research quality and allocating QR funding, publishing results in time for the 2006 Pre-Budget Report"

RAE Alternative
- "aware that preparations for the 2008 RAE are well underway"
- "presumption that the 2008 RAE should go ahead, incorporating a shadow metrics exercise alongside the traditional panel-based peer review system"
- "However, if an alternative system is agreed and widely supported, and a clear majority of UK Universities were to favour an earlier move to a simpler system, the Government would be willing to consider that"

Why RAE?
Is it now just to validate the metrics? What metrics? Do we get to invent them?
Supporting documents suggest this is already well on the way to being decided.

Science and Innovation Investment Framework: Next Steps
SIIF ’04-’14: "metrics collected as part of the next assessment will be used to undertake an exercise shadowing the 2008 RAE itself, to provide a benchmark of the information value of the metrics as compared to the full peer review process."
Principles:
- Fund excellent research of all types
- Preserve dual support
- Minimise burden, sufficient to support a fair distribution of funds
- Processes should: be open, apply equally, be simple, transparent & cost-effective, fund institutions not individuals, allow institutions to plan effectively
Favours a simpler system based on one or more metrics.

Option 1: Research Council (RC) Research Grant Income
- At institutional level, the correlation between QR and Research Council income is 0.98, with no variation across years
- Major differences at departmental level balance out at institutional level

Option 2: Total Research Income
- Including charities, industry, EU and Government departments
- Average correlation 0.98
- Varies between 0.97 and 0.99
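A minimal sketch of the kind of calculation behind these quoted correlations, a Pearson correlation of QR funding against research income across institutions; the five institutional figures below are invented:

```python
# Sketch of the correlation claim: Pearson correlation of QR funding
# against research income across institutions. Figures are invented;
# the Budget document reports r of about 0.98 at institutional level.

from statistics import correlation  # available from Python 3.10

qr_funding      = [5.2, 12.1, 30.5, 48.0, 75.3]  # GBP millions, hypothetical
research_income = [6.0, 14.5, 33.0, 50.2, 80.1]  # GBP millions, hypothetical

r = correlation(qr_funding, research_income)
print(round(r, 3))  # close to 1.0, as in the reported 0.98
```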

Caveats 1
"The Government is also aware that while the correlation between research income and QR is close when measured at an institutional level, this is largely driven by science, engineering and medicine. It is therefore not clear that a metric based on research income would fairly support excellent research in the arts and humanities and some other subjects, such as mathematics. It might therefore be the case that other options would need to be explored for these subjects."

Caveats 2
"Alongside running a mainly metrics-based system, the Government will also explore the option of continuing to convene expert panels to provide an extra level of verification for the results generated by metrics. The panels would not be expected to hold their own information-gathering exercise."