1
REF 2014 Computer Science and Informatics update
Jim Briggs, 5th October 2011
2
This seminar slot
Every fortnight (even weeks)
–school-specific seminars in the odd weeks
Should be on everyone's timetable
An important part of all researchers' staff development
Come early for tea/coffee and a chat
The one time all CSI researchers get together
Alex is still looking for speakers
3
Contents
REF background
What will the REF assess?
REF sub-panel 11B
Mock REF
Organisation of our research
4
REF BACKGROUND
5
REF background
REF = Research Excellence Framework
Replaces the Research Assessment Exercise (last held in 2008)
Very similar to the RAE, except:
–reduced number of panels
–aim for more consistency between panels
–some panels will use citation data to assess outputs
–all panels will assess non-academic impact
–esteem no longer included as a distinct element
–reduced data requirements
Used to determine the "QR" element of university funding
6
REF timetable
December 2014: results published
31 December 2013: cut-off for publications
29 November 2013: final date for submissions
31 October 2013: census date for staff
31 July 2013: cut-off date for impact
October 2012: survey of submission intentions
January 2012: publish panel criteria
Now: consultation on panel criteria
July 2011: publish assessment framework
7
WHAT WILL THE REF ASSESS?
8
Each unit of assessment submits:
REF 1a/b/c: Staff details
REF 2: Research outputs (normally 4 per member of staff)
REF 3a/b: Impact template and case studies (at least 2)
REF 4a/b/c: Environment data (PhDs, external income)
REF 5: Environment template
9
Output of the REF process
Result expressed as a profile
–what proportion of research is "4*" (world-leading), "3*" (internationally excellent), "2*" (recognised internationally), "1*" (recognised nationally) and "unclassified"
–in 2008 our profile was 5/20/40/30/5 (GPA 1.90; see the sketch below)
–slightly stronger on outputs than on environment and esteem
In 2014, results weighted according to:
–Outputs 65%
–Impact 20%
–Environment 15%
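As a worked example of the numbers above: a profile's GPA is the percentage in each star band weighted by the band's star value, and the 2014 overall result combines the element sub-profiles using the 65/20/15 weights. A minimal Python sketch, in which the 2014 sub-profile figures are hypothetical and purely for illustration:

```python
# GPA of a REF quality profile: percentage in each star band times the band's
# star value (4* = 4 points ... unclassified = 0 points), divided by 100.
profile_2008 = {4: 5, 3: 20, 2: 40, 1: 30, 0: 5}  # our 2008 profile (%)

gpa = sum(stars * pct for stars, pct in profile_2008.items()) / 100
print(gpa)  # 1.9 -- matches the GPA quoted above

# Combining element sub-profiles into an overall profile, band by band, using
# the 2014 weighting (Outputs 65%, Impact 20%, Environment 15%).
# NOTE: these sub-profile percentages are invented, for illustration only.
weights = {"outputs": 0.65, "impact": 0.20, "environment": 0.15}
subprofiles = {
    "outputs":     {4: 10, 3: 30, 2: 40, 1: 15, 0: 5},
    "impact":      {4: 20, 3: 40, 2: 30, 1: 10, 0: 0},
    "environment": {4: 10, 3: 50, 2: 40, 1: 0,  0: 0},
}
overall = {band: sum(w * subprofiles[elem][band] for elem, w in weights.items())
           for band in (4, 3, 2, 1, 0)}
print(overall)  # overall percentage in each star band
```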
10
REF PANEL 11B
11
Membership of panel 11B
Prof Steve Furber, Manchester (Chair)
Prof Andrew Adamatzky, UWE
Prof David Benyon, Edinburgh Napier
Prof Alan Burns, York
Prof Anthony Cohn, Leeds
Prof Jon Crowcroft, Cambridge
Prof Anthony Finkelstein, UCL
Mr Martin Jackson, Twin Technologies Ltd
Prof Marta Kwiatkowska, Oxford
Prof Ursula Martin, QMUL
Prof Tom McCutcheon, DSTL
Prof Alexandra Poulovassilis, Birkbeck
Prof Tom Rodden, Nottingham
Prof Stan Scott, Queen's University, Belfast
Prof Nigel Shadbolt, Southampton
Prof Qiang Shen, Aberystwyth
Prof Morris Sloman, Imperial (Deputy Chair)
Prof Iain Stewart, Durham
Prof Joseph Sventek, Glasgow
Prof Josie Taylor, OU
Prof Chris Taylor, Manchester
Prof Bonnie Webber, Edinburgh
12
Scope of panel 11B
The UOA includes the study of methods for acquiring, storing, processing, communicating and reasoning about information, and interactivity in natural and artificial systems, through the implementation, organisation and use of computer hardware, software and other resources. The subjects are characterised by the rigorous application of analysis, experimentation and design.
13
THE MOCK REF
14
Plan
Send data to external assessor
Ask for feedback on:
–which staff to submit, especially "early career researchers"
–which are everyone's "best" outputs
–strengths/weaknesses of our environment
15
Data updating/gathering
New form: "REF Data Form" (RDF)
Intended to:
–make it easy for you to keep your data up to date and refresh it from time to time
–make it easy for you to resubmit
–make it easy for the University to enter the data into the final REF forms
Outputs based on the contents of Parade
What is new is that you write about the "significance" of each output
16
What is my best paper?
Judged on a (holistic) combination of 3 criteria:
Originality
–the extent to which the output introduces a new way of thinking about a subject, or is distinctive or transformative compared with previous work in an academic field
Significance
–the extent to which the work has exerted, or is likely to exert, a significant influence on an academic field or practical applications
Rigour
–the extent to which the purpose of the work is clearly articulated, an appropriate methodology for the research area has been adopted, and compelling evidence presented to show that the purpose has been achieved
17
Proxies for quality
Journal impact factor
–will NOT be used by REF panels
Citations, as measured by:
–Web of Knowledge (http://wok.mimas.ac.uk/)
–Scopus (http://www.scopus.com/)
–Google Scholar (http://scholar.google.co.uk/)
18
Writing about significance 1
"Significance is the extent to which the work has exerted, or is likely to exert, a significant influence on an academic field or practical applications"
Guidance says "significance … that is not evident within the output itself"
–but even if it is evident, re-iterate it!
I asked why not include originality and rigour as well – no real answer, but those are more likely to be evident in the output
–suggest you re-iterate, but be subtle!
19
Writing about significance 2
Evidence needs to be succinct, verifiable and externally referenced where appropriate
Do not provide citation data
–except perhaps "this highly-cited paper …"
In most cases, write significantly fewer than 100 words
Examples from RAE 2008 (albeit under different rules) are available online
20
THE UOP CSI RKT ENVIRONMENT
21
Research groups
Established and cross-departmental:
1. Artificial intelligence - needs a name
2. Health informatics - Centre for Healthcare Modelling and Informatics
Less established/not so cross-departmental:
1. Future-proof computing (CT)
2. Games and game playing (CT)
3. Human computer interaction (SoC)
4. Information systems (SoC and the Business School)
5. Parallel and distributed systems (SoC and SoE)
6. Security and digital forensics (SoC, SoE and ICJS)
7. Systems engineering (SoE)
8. Technology enhanced learning (SoC)
9. Virtual reality and visualisation (CT)
22
Management Group
Membership
–Research group leaders
–Heads of department
Purpose
–Agree objectives to be included in school plans
–Share good practice
23
THE END
jim.briggs@port.ac.uk