UniForum (University Operations Forum) Student Support & Service Study
HoSA Conference, Auckland, September 2011

UniForum: Outline

- A private forum of 5 universities; the goal is to improve the efficiency and effectiveness of support services through major institutional insights.
- The study runs under a collaborative framework facilitated by CUBANE Consultants, enabling a structured approach to and review of key operational areas.

1. The Process: what we did
2. The Constraints: limitations & blockages
3. The Current Activities: what we are doing
4. The Learnings: take-away initiatives

UniForum: The Process (Data Input)

Professional Staff Surveys: identifying professional staff and the worked-proportion FTE over a 12-month period, coded against each activity (e.g. the UoM 2010 Activity Analysis Project).

Professional Staff Benchmarking: FTEs from the Professional Staff Surveys were coded, scaled and normalised into comparable data for analysing differences between institutions (a sketch of this step follows below).

Priority Activity Areas (priority = requiring significant resourcing):
- Admissions
- Enrolments
- Examinations & Special Consideration
- Student/Academic Advising (including Unsatisfactory Progress)

Local Reference Areas (data from these areas fed the benchmarking study):
- Business & Economics
- Education
- Science
- Engineering

University site visits for priority activity areas: panel meetings and interviews with relevant staff from local reference areas and central areas; in-depth analysis of business processes, system enablers and the policy framework.
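To make the coding-scaling-normalisation step concrete, here is a minimal sketch of how raw survey FTEs might be turned into comparable figures. It is illustrative only: the actual Cubane method is not described in these slides, and the column names (`fte`, `activity`) and the per-1,000-students scaling denominator are assumptions.

```python
import pandas as pd

# Hypothetical survey rows: one record per staff member per coded activity.
survey = pd.DataFrame({
    "university": ["A", "A", "B", "B"],
    "activity":   ["Admissions", "Enrolments", "Admissions", "Enrolments"],
    "fte":        [6.0, 4.5, 9.0, 7.5],
})

# Assumed scaling denominator per university (e.g. student load), used to
# normalise raw FTEs so differently sized institutions become comparable.
student_load = {"A": 20_000, "B": 35_000}

totals = survey.groupby(["university", "activity"], as_index=False)["fte"].sum()
totals["fte_per_1000_students"] = totals.apply(
    lambda row: 1000 * row["fte"] / student_load[row["university"]], axis=1
)
print(totals)
```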

UniForum: The Process (Participant Input)

Three workshops engaged participants with the findings and observations, including a qualitative assessment of the normalised data.

Workshop 1:
- Established key priority areas and chose reference faculties/disciplines across participants

Workshop 2:
- Reviewed the consultants' activity analysis findings and draft benchmarking report
- Considered performance, improvement options and evaluation measures
- Analysed models of service delivery for student administration and student support services across the 5 comparator universities, focusing on priority activities

Workshop 3 (Conference):
- Tested the presentation and understanding of insights with other participants
- Highlighted and tested each institution's position relative to "leading practices" across participants
- Considered the maturity of priority activity processes (including business and IT systems / resource management) and discussed next steps

UniForum: The Process
[Example report data and analysis: front-line activities]

UniForum: The Process

Participants presented at Workshop 3 (conference) on the dimensions of their position:
- Devolved or central model
- Activities / functions
- Casual / permanent staff mix
- Staff seniority
- Peak management
- Frontline / governance mix
- Consistency & accessibility vs. variability
- Generalist or specialist staff
- Value: $ vs. quality

Guiding questions:
- Where are we high or low in resourcing and/or costs?
- What structures and practices are creating this position?
- What implications do these positions have for the scope and quality of service?

UniForum: The Process

Participants presented at Workshop 3 (conference):
- What structures / practices do we have that appear to be leading the way amongst the 5 universities?
- What structures / practices do other universities have that we could benefit from adopting or moving towards?
- What activities require further investigation, and what is the potential value of investing effort in this area?
- Benchmarking of the highest-resourced activities

Participants presented to their Executive:
- Objective: prompt discussion on key opportunities and on implications for existing / planned projects
- Target: activities with the greatest enterprise-wide opportunity and interest

UoM: Current Activities

- Data mining at central, faculty and departmental levels: identify anomalies, trends and issues
- Testing assumptions at the local 2nd- and 3rd-tier level for reviewed activities: address anomalies, identify cost patterns and trends
- Benchmarking across similar functional units (e.g. Student Centres): establish best and most efficient practices against quality of service, to test the proposition that high $ = high quality (a sketch of this kind of check follows below)
- Cross-referencing with other surveys and data: International Barometer, student experience, etc.
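A hedged sketch of the anomaly-flagging and the "high $ = high quality" check described above; the unit names, cost figures and quality scores are invented for illustration, not UoM data.

```python
import pandas as pd

# Invented per-unit figures: cost per student served and a service-quality
# score (e.g. drawn from a student experience survey).
units = pd.DataFrame({
    "unit":    ["Student Centre 1", "Student Centre 2",
                "Student Centre 3", "Student Centre 4"],
    "cost":    [120.0, 95.0, 210.0, 130.0],  # $ per student served
    "quality": [3.9, 4.1, 3.7, 4.0],         # survey score out of 5
})

# Flag cost anomalies: units more than 1.5 standard deviations from the mean.
z = (units["cost"] - units["cost"].mean()) / units["cost"].std()
units["cost_anomaly"] = z.abs() > 1.5

# Test the proposition: a weak or negative cost-quality correlation suggests
# that higher spend is not buying better service.
print(units)
print("cost-quality correlation:",
      round(units["cost"].corr(units["quality"]), 2))
```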

UoM: Current Activities (Next Steps)

Present findings and analysis to divisional heads:
- Examine contextualised recommendations
- Consider opportunities for process-management changes
- Undertake training-needs or business-improvement analysis
- Meet Business Plan objectives & KPIs

Monash University: issues for further consideration

Functional activities are:
- Highly devolved (i.e. between central and faculty)
- Highly devolved within faculties
- Predominantly staffed at HEW 7 and below
- Predominantly generalist rather than specialised

Benchmarks and metrics need to be defined to measure:
- Costs and the distribution of resources
- Service standards to applicants and students
(A sketch of one way to pin such metrics down follows below.)
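One way of pinning such metrics down is to give each one an explicit unit and data source up front. The sketch below is hypothetical: the metric names, units and sources are assumptions for illustration, not Monash's actual definitions.

```python
from dataclasses import dataclass

@dataclass
class Metric:
    """A benchmark metric with an explicit unit and data source."""
    name: str
    unit: str
    source: str

# Hypothetical starting set covering the two measurement gaps named above.
METRICS = [
    Metric("cost_per_application_processed", "$ / application", "finance system"),
    Metric("central_vs_faculty_fte_share", "% of activity FTE", "staff survey"),
    Metric("time_to_respond_to_applicant", "business days", "student CRM"),
]

for m in METRICS:
    print(f"{m.name}: measured in {m.unit} (source: {m.source})")
```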

UniForum: Constraints (Reviewing Process Issues)

- The opportunity to obtain context for the data was lost, since site visits took place prior to data release
- Difficulty of interpretation: are the context of derivation and our preconceptions important in interpreting the data?
- Pre-selection of priority areas risks pre-determining outcomes and forgoes the opportunity to uncover other significant issues
- Variety in structures and scope makes institutional comparisons difficult

UniForum: Constraints (Data Integrity Issues)

- Variability in coding at the activity (cf. functional) level, though this was countered during the site visits
- Staff Survey data alignment for some key activity areas (e.g. Enrolments & Admissions): variability in coding arises from the varying operational contexts of budget units and from perceptions of roles (work remains to align the Staff Survey process)
- Data may require context to be indicative of a significant issue; detailed qualitative data analysis may be needed

UniForum: Constraints (Organisational Buy-In)

- Immature learning culture: short-term focus; misaligned structures (central / local); no strategy for communications or for enforcing the incorporation of best practice into work practice
- The Staff Survey was undertaken without staff understanding why, or what would be done with the data
- Diverse organisational values and practices, together with change fatigue, make it difficult to integrate benchmarking lessons
- Innovation and change tend to be locally driven, which makes enterprise-wide strategic objectives and priorities requiring local change difficult; small units share knowledge and handle change more easily
- Data may require context to be indicative of a significant issue; detailed qualitative data analysis may be needed
- Benchmarking activities are not effectively challenging organisational perceptions at the local level

UniForum: Learnings

- Consistency in data collection matters
- Maximise value by providing adequate resourcing for the study lead
- Improved promotion and communication of the benchmarking tool is needed to drive change and achieve excellence
- The study can lead to ongoing investigation and learning by ensuring best practice is adopted enterprise-wide
- It can provide a rigorously informed method of developing quality-improvement projects based on sector best practices
- Quantitative and qualitative approaches are synergistic, but greater emphasis on the quantitative is required to fully understand the outputs
- Discussion of best practice elsewhere is valuable in stimulating thinking
- Improvements in data collection over time will lead to greater confidence in data integrity and in trend analysis

UniForum: Learnings (Take-away Initiatives)

- Establish a detailed reporting framework
- Integrate with the planning process and align outcomes with other benchmarking activities
- Determine the extent to which outcomes may inform decisions
- Consider profiling leading practices in view of the business process reviews due from policy developments

UniForum: Questions?