Reviewing professional services: Support Process Review (SPR) at the University of Bristol
HESA/SUMS Seminar, 24th June 2011
Helen Galbraith, Director of Planning and Deputy Secretary, University of Bristol
The University of Bristol
- 6 faculties, 27 academic schools, 15 research centres (each with their own support structures)
- 13,500 undergraduate students
- 5,500 postgraduate students
- 11 applicants for each UG place (2009)
- Estate of over 300 buildings, diversely located
- Turnover £350m
- 5,800 members of staff (~1,900 admin/professional)
Reasons for review
Largely driven by financial pressures and the need to reduce our staff cost base:
- Existing and planned reductions in Government funding.
- Staff costs: pay awards, incremental/promotional drift, pensions.
- Salary costs a high proportion of the overall cost base compared to peers (HESA data).
- Acknowledged problems in current systems, processes and ways of working; inconsistent service to staff and students.
We concluded that more efficient and effective support structures were needed, saving £4-6m p.a.
Phase 1: Costs and Opportunities
Key question: where might we focus our activity?
- High-level activity costing of staff time spent on major support processes: each manager asked to identify the FTE of their support staff spent on each activity (a minimal sketch of this kind of costing follows below).
- Aim: to identify priority areas for further analysis and subsequent process change.
- Identified 8 broad process areas for focussed review.
- Each area allocated a 'process owner', responsible and accountable for the process and its resources and systems.
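Purely as an illustration (the slides do not describe the actual costing tool), the activity-costing step amounts to summing manager-reported FTE figures by process area and ranking the totals to find the priority candidates for review. All unit names, process areas and figures below are invented.

```python
from collections import defaultdict

# Hypothetical manager returns: (unit, process area, FTE of support-staff
# time spent on that activity). All names and numbers are invented.
returns = [
    ("School A", "Admissions", 2.5),
    ("School A", "Finance", 1.0),
    ("School B", "Admissions", 3.0),
    ("School B", "HR", 0.5),
    ("Research Centre C", "Finance", 1.5),
]

# Aggregate FTE by process area to see where support effort is concentrated.
totals = defaultdict(float)
for _unit, area, fte in returns:
    totals[area] += fte

# Rank areas by total FTE; the largest are the priority candidates for review.
for area, fte in sorted(totals.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{area}: {fte:.1f} FTE")
```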
Phase 2: Detailed Review
Key question: where are the specific opportunities to streamline processes and systems?
Key approaches taken to the detailed review:
1. Process review workshops.
2. Internal benchmarking of school-level activity.
Process workshops
Aims:
- To understand broad process areas in more detail.
- To identify best and poor practice.
- To look at the use of corporate and local systems.
- To identify the 'optimum' process map in each case.
Outcomes:
- Inefficiencies identified in every major process area.
- Considerable duplication of effort, with the same data input multiple times into a plethora of systems.
- Multiple points of authorisation, resulting in a lack of ownership.
School-level benchmarking
Aims:
To undertake a detailed analysis of support structures within each academic school in order to understand:
- What support is currently being provided.
- The rationale for differences in structure.
- Examples of efficient practice.
To look for opportunities to:
- Standardise processes to achieve efficiency gains.
- Remove duplication and single points of failure.
- Improve use and enhance the quality of corporate systems.
School-level benchmarking
Outcomes:
- A wide range of approaches to administration.
- Different staffing profiles for similar sets of tasks (a simple comparison is sketched below).
- Standards and practices vary widely, with a lot of best practice which could be shared.
- A plethora of home-grown local systems, some excellent.
- What is discipline-specific, and what is merely custom and practice (i.e. how much can be standardised)?
- Varying levels of academic input into support processes.
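As a hedged sketch only (the slides do not specify how staffing profiles were compared), one basic way to benchmark schools of different sizes is to normalise each school's support FTE by its student load. All school names and figures here are hypothetical.

```python
# Hypothetical school-level data: (school, support FTE, student headcount).
schools = [
    ("School A", 12.0, 900),
    ("School B", 8.5, 1100),
    ("School C", 15.0, 850),
]

# Normalise support staffing by student load so schools of different
# sizes can be compared on a like-for-like basis.
for name, fte, students in schools:
    per_100 = 100 * fte / students
    print(f"{name}: {per_100:.2f} support FTE per 100 students")
```

A per-student ratio is only a starting point: as the outcomes above note, some differences are genuinely discipline-specific rather than inefficiencies.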
Phase 3: Implementation
- Implementation of 'top to toe' support structures controlled by process owners.
- "One activity, one process, one place": main focus at faculty level.
- Greater standardisation, greater resilience and economies of scale:
  - More generic roles focussed on a single process.
  - Staff appointed to the process, not to a school/department.
  - Each process overseen by its process owner, with minimal 'exceptions'.
- A renewed programme of investment in IT systems.
- New structures in place by September 2011, with 'continuous improvement' beyond this.
Phase 4: Benefit delivery
Key question: how do we measure 'success'? It is difficult to energise colleagues about the importance of this!
Focus on benchmarking relative to our previous position, rather than on our position relative to peers (a baseline comparison is sketched below).
A series of measures identified against programme objectives:
- Staff survey: repeated every 3 months with the same group of staff.
- Review of academic time spent on administration.
- KPIs: response times, volumes of complaints, £/time spent on key processes, reports from 'service desk' systems.
- Student surveys: NSS, ISB, Students' Union survey.
- Systems: business cases with identified benefits/measures.
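As an illustration of "benchmarking relative to previous position" (none of these KPI names or values come from the slides; all are invented), each measure can be tracked as a percentage change against its pre-review baseline, taking care over whether higher or lower is better for that KPI.

```python
# Hypothetical KPI readings: baseline (pre-review) value, current value,
# and whether a lower value counts as better for that measure.
kpis = {
    "response time (days)": (10.0, 7.5, True),
    "complaints per term": (40, 32, True),
    "staff survey score": (3.2, 3.6, False),
}

for name, (baseline, current, lower_is_better) in kpis.items():
    change = 100 * (current - baseline) / baseline
    improved = (change < 0) if lower_is_better else (change > 0)
    verdict = "improved" if improved else "worse"
    print(f"{name}: {change:+.1f}% vs baseline ({verdict})")
```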
External benchmarking
What did we do?
- Sharing of best practice in managing structural change.
- Some use of external consultants (e.g. Estates VFM).
- Sector-wide interest in process review.
- High-level analysis of HESA data, but impeded by structural, physical and cultural factors.
- Multiple site visits to explore usage of key systems.
- 'Admin benchmarking group' with Planning colleagues.
What more could we do?
- Further use of HESA data to measure progress.
- Benchmarking at school level, e.g. medical schools.
- What is the 'optimum' structure?