Research Update
GERI, May 2010
- Funding climate
- Research Excellence Framework
- Symplectic demo
Funding
- Severe constraints on external and internal funding for research
- All-time low success rates for grants (10-15%)
- No further rounds of research capital (RCIF)?
- STEM subjects slightly better protected via the Science budget
- EPSRC adopts "3 strikes and you are out" policy for investigators
- National review of PhD provision
Research Excellence Framework
- RAE2008 viewed as a successful process
- Increased emphasis on STEM
- Need to demonstrate that publicly funded research benefits UK PLC
What will be assessed?
- Outputs (60%)
- Environment (15%)
- Impact (25%)
- Esteem is no longer a stand-alone measure
- Common weightings across all UoAs
- Greater consistency in the assessment process across panels
Quality profile
- 4* exceptional (world-leading)
- 3* excellent
- 2* very good
- 1* good
- Unclassified
- Likely that only 3* and 4* research will be funded
Panels, sub-panels and sub-sub-panels…
- 30-40 UoAs (67 in RAE2008)
- Current proposal to merge all Engineering UoAs (24-29) into a single uber-UoA
- Specialist sub-panels
Outputs
- 4 best outputs from your portfolio of work
- Judged on originality, significance & rigour
- Early Career Researchers may submit fewer
- Peer-reviewed assessment with sampling of outputs (implications for journal choice?)
- Citation data (WoS and Scopus) will be provided to panels
- Single biggest driver of the quality assessment (60% weighting)
- Clear message: focus on quality outputs
Environment
- Less reliance on 'creative writing'
- Demonstrate the intellectual and physical infrastructure to support research
- Uses a common template to evidence:
  - Resourcing (staff, income, infrastructure)
  - Management (strategy, staff development & PGR training)
  - Engagement (KE with people & organisations)
- Largely qualitative, supported by key metrics (income, PGRs)
Impact
- Impact on users of research (i.e., beyond academic peers)
- Impact is broadly defined
- Assesses the impact of the 'department', not of individuals
- Not all people, projects or outputs need to demonstrate impact
- Impact should be evident within the REF window (2008-2013)
- Based on excellent research performed by the submitting department in the past 10-15 years
Types of impact
- Economic
- Social
- Cultural
- Environmental
- Health
- Quality of life
- Public policy & services
Impact criteria
- Assessed against:
  - Reach (how widely the impacts have been felt)
  - Significance (how transformative the impacts have been)
- Impact pilot exercise: Physics & English
Impact – what we submit
- Impact statement:
  - Our approach / strategy
  - Range and breadth of interactions with research users
- Case studies:
  - 1 case study per 5-10 staff (min 2?)
- Both supported by appropriate metrics
Timetable (subject to change!!!)
- 2010: Impact pilot submission & results; expert panels established; guidance to HEIs
- 2012: Submissions (November?)
- 2013: Outcomes published (December?)