1
Course costing… and reflections on process benchmarking
Dr Thomas Loya, Director of Planning and Management Information, University of Nottingham
HESA: Process Benchmarking Seminar – June 2011
2
Introduction – Overview
Process benchmarking: evaluating aspects of processes against peer group / sector best practice – but how best to determine this?
1. Various models for identifying good practice
2. Course costing: the University of Nottingham's journey
3
Process BM: Other models – Independent HE research agencies (USA)
– Extensive specialist research capability
– Membership based, but wide representation
– Expanding membership into the UK
– Strong basis for identifying good practice
– Comprehensive coverage of HE activities
– Examples: Education Advisory Board; Hanover Research Council
4
Process BM: Other models – Standards organisations (BSI, ISO, etc.)
– Justifiable claim to define best practice
– Extensive cross-sector engagement
– Strong links to international / global practice
– Negligible engagement by the UK HE sector!
– Strong coverage of business process improvement: risk management, information security, environmental management, business continuity, accessibility, business system documentation, supply chain management, etc.
5
Course costing – Purposes and issues
– Assess teaching cost, income and profitability
– Focus resource on higher net revenue areas
– Cultural change: cost and efficiency awareness
– Identify all costs, in Schools and in the centre
– Provide a powerful form of management information
– Not to be used in isolation; the context is a review / refresh of the institutional portfolio
6
Course costing – Two overall approaches
Bottom-up: activity-based costing
– Aggregate up specific activity costs
– Detailed time, quantity and activity data
– Fine-grained; costly to gather and analyse data
– Successful one-School pilot, but scalability issues
Top-down
– Parcel out high-level costs to modules
– Aggregate costs up to courses
– Broad-brush, using centrally held data
– Set out on this path for the first institution-wide exercise
7
Course costing – The Model: Costs
[Diagram: TRAC teaching costs by School are assessed and split into variable costs (teaching / other) and fixed costs]
8
Course costing – The Model: Shares
Module variable cost amount = credits * School share * weight * students
Module fixed cost amount = credits * School share * weight
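The two share formulas lend themselves to direct computation. A minimal Python sketch follows; "school_rate" is assumed to stand in for the School's per-credit share of its TRAC teaching cost pool and "weight" for any cost weighting applied to the module. Both names and the example figures are illustrative, not the University's model.

```python
# Illustrative sketch of the per-module cost shares described on this slide.
# "school_rate" and "weight" are assumed names, not terms from the slides.

def module_variable_cost(credits: float, school_rate: float,
                         weight: float, students: int) -> float:
    """Variable cost scales with enrolment: credits * share * weight * students."""
    return credits * school_rate * weight * students

def module_fixed_cost(credits: float, school_rate: float,
                      weight: float) -> float:
    """Fixed cost is independent of enrolment: credits * share * weight."""
    return credits * school_rate * weight

# Example: a 20-credit module taken by 150 students (figures are made up)
variable = module_variable_cost(20, school_rate=4.5, weight=1.2, students=150)
fixed = module_fixed_cost(20, school_rate=80.0, weight=1.2)
```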
9
Course costing – The Model: Modules
[Diagram: module variable and fixed cost amounts, together with an allocated share of central costs and module income, combine into total module income and costs]
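A hedged sketch of how the module-level figures on this slide might be brought together: the variable and fixed amounts from the previous slide, plus an allocated share of central costs, set against module income. The field names are assumptions rather than the University's own definitions.

```python
# Assumed structure for "total module income and costs"; names are illustrative.
from dataclasses import dataclass

@dataclass
class ModuleAccount:
    income: float          # fee / funding income attributed to the module
    variable_cost: float   # enrolment-driven cost (previous slide)
    fixed_cost: float      # enrolment-independent cost (previous slide)
    central_cost: float    # allocated share of central costs

    @property
    def total_cost(self) -> float:
        return self.variable_cost + self.fixed_cost + self.central_cost

    @property
    def net_margin(self) -> float:
        return self.income - self.total_cost
```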
10
Course costing – The Model: Courses
[Diagram: module income and costs, apportioned by students by course and module, give course income and costs]
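One plausible reading of this slide is that each course picks up a share of every module's income and costs in proportion to the number of its students taking that module. A short Python sketch under that assumption; the data structures are hypothetical, not the University's.

```python
# Illustrative aggregation from modules to courses, assuming pro-rata
# apportionment by student numbers on each module.
from collections import defaultdict

def course_totals(module_accounts, enrolments):
    """
    module_accounts: {module: {"income": x, "cost": y}} with module-level totals
    enrolments:      {(course, module): students} from student record data
    Returns {course: {"income": ..., "cost": ...}}.
    """
    # Total students on each module, across all courses
    module_students = defaultdict(int)
    for (course, module), n in enrolments.items():
        module_students[module] += n

    totals = defaultdict(lambda: {"income": 0.0, "cost": 0.0})
    for (course, module), n in enrolments.items():
        share = n / module_students[module]
        totals[course]["income"] += module_accounts[module]["income"] * share
        totals[course]["cost"] += module_accounts[module]["cost"] * share
    return dict(totals)
```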
11
Course costing – Outputs
– Income, School and central costs => operating and net margins, for every course
– Currently covers UG and PGT provision, for two years
– Identify cost drivers and areas for review
– Valuable input to academic strategy development and institutional portfolio review
12
Course costing – Some issues and next steps
– Costs are dependent on module classification
– Scepticism about the use of TRAC data
– Input data as valuable as the headline figures
– Complex chain of reasoning from findings to management action
– Have since moved to a bottom-up approach!
13
Course costing – Practical considerations
– High data volumes led to the use of Cognos EP for data management and analysis
– Long period to register the impact of changes
– To gain full value, the exercise needs to be repeatable
– Bottom-up data gathering is costly
– Can appear contrary to academic culture
14
Course costing – Implications for benchmarking…
– Feasibility depends on data availability, data management capability, drivers, scale, scope…
– Costs may reflect mission, subject mix, organisational structure and efficiency of operations
– A powerful internal metric, but…
  – Not likely to be a quick win
  – Difficult or impossible to include quality
  – Limited scope for comparability between HEIs
15
Finally… Issues, questions and concerns
– Best practice vs just good practice(s)?
– Efficiency: an institution or a sector attribute?
– Competition cuts both ways
– It can be irrational to share real innovation
– The myth of HE exceptionalism
– What do we want from HESA?
16
Thank you
For questions, please contact: Thomas.loya@nottingham.ac.uk