Benchmarking e-learning: Putting the Pick & Mix approach into action
Professor Paul Bacsich
Consultant to the Higher Education Academy (formerly at OU, SHU and UKeU)
Benchmarking Pilot Workshop 1, 17 March 2006, Leicester

Timetable for the morning
This is the first half of a two-part workshop
Its focus is to decide on the criteria

Timetable
0930-1000: Overview
1000-1045: The core criteria
1045-1130: Supplementary and local criteria
1130-1200: Collecting the evidence

Benchmarking and Pick & Mix
Overview (quick) (0930-1000)

Benchmarking e-learning
Outlined in the HEFCE e-learning strategy
The Higher Education Academy is overseeing it
Pilot phase: 12 HEIs chosen, of which Leicester is one (along with Chester and Staffordshire)
– In three groups of 4 each (the above 3 plus Manchester)
– Each with an “uncle” (in our case, Paul)
Justifies entry to the Pathfinder programme
Phase 1 in autumn: 42 more HEIs
Phase 2 later, for the rest of the HEIs (Russell…)

Pick & Mix features
Focussed purely on e-learning, but could be extended more widely (e.g. L&T, IT, QA)
Draws on several sources and methodologies – pragmatic
Not linked to any particular style of e-learning (distance, on-campus, blended)
Orientation to outputs and process, but could add more on inputs

Pick & Mix history
Initial version developed in early 2005, in response to a request from Manchester Business School for a competitor study
Refined by literature search, discussion, feedback, presentations and workshops
Being used in 3 of the 4 institutions in Paul’s mini-club for the HEA benchmarking pilot: Chester, Leicester, Staffordshire

Pick & Mix
Criteria and metrics

Pick & Mix: 18 core criteria
Composited some previous criteria together
But sometimes split others, e.g. for legal reasons
Removed any not specific to e-learning (including QAA territory)
Tended to leave out ones which were not critical success factors
Left out of the core some criteria where there was not (yet) UK consensus
Leicester wishes to add specific ones to monitor its objectives and KPIs. This is encouraged.

Pick & Mix metrics
Use a 6-point scale (1-6)
– 5 normal points plus 1 more for “excellence”
Backed up by continuous metrics where possible
Also contextualised by narratives of “better practice”
Many criteria are really groups of sub-criteria – “drill-down”, additive scoring, … (sketched below)
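
To make the drill-down and additive scoring concrete, here is a minimal sketch in Python. It is illustrative only: the names are hypothetical and Pick & Mix prescribes no implementation; the sketch simply assumes that a criterion's 1-6 score is the rounded mean of its sub-criterion scores.

    from dataclasses import dataclass, field

    @dataclass
    class Criterion:
        """A Pick & Mix criterion scored on the 6-point scale (1-6).

        Hypothetical model: the overall score is the whole-point mean
        of the sub-criterion ("drill-down") scores; 6 marks excellence.
        """
        name: str
        sub_scores: list[int] = field(default_factory=list)

        def score(self) -> int:
            if not self.sub_scores:
                raise ValueError("no sub-criterion scores recorded yet")
            if not all(1 <= s <= 6 for s in self.sub_scores):
                raise ValueError("every score must be on the 1-6 scale")
            # Additive scoring: combine the sub-criteria, then map the
            # result back onto the 1-6 scale by rounding the mean.
            return round(sum(self.sub_scores) / len(self.sub_scores))

    # Example: criterion 10 (Training) with three drilled-down scores
    training = Criterion("10 Training", sub_scores=[4, 5, 3])
    print(training.score())  # -> 4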

Pick & Mix
Two sample criteria

01 Adoption phase (Rogers)
1. Innovators only
2. Early adopters taking it up
3. Early adopters adopted; early majority taking it up
4. Early majority adopted; late majority taking it up
5. All taken up except laggards, who are now taking it up (or retiring or leaving)
6. First wave embedded, second wave under way (e.g. m-learning after e-learning)

01 Adoption – multiple
Note that there is a range of hills, receding into the distance… (illustrated below)
– Word
– Email
– VLE
– Streaming
– M-learning
– Podcasting
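
As a purely hypothetical illustration (not from the workshop materials), each “hill” can be read off the 01 Adoption rubric separately, giving one level per technology; the values below are invented:

    # Hypothetical snapshot: one 01 Adoption level per technology
    # (1 = innovators only ... 6 = second wave under way).
    adoption_levels = {
        "Word": 5,
        "Email": 5,
        "VLE": 4,
        "Streaming": 2,
        "M-learning": 1,
        "Podcasting": 1,
    }

    # One simple reading: the least-adopted technologies show where
    # the next wave of effort is needed.
    frontier = min(adoption_levels.values())
    lagging = [t for t, v in adoption_levels.items() if v == frontier]
    print(f"Next hills to climb (level {frontier}): {', '.join(lagging)}")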

10 Training
1. No systematic training for e-learning
2. Some systematic training, e.g. in some projects and departments
3. University-wide training programme but little monitoring of attendance or encouragement to go
4. University-wide training programme, monitored and incentivised
5. All staff trained in VLE use, training appropriate to job type – and retrained when needed
6. Staff increasingly keep themselves up to date in a “just in time, just for me” fashion, except in situations of discontinuous change

Training – multiple…
Different departments may organise it
Different rates of adoption for different staff:
– IT support
– Learning technologists
– Academics
– Admin staff (e.g. supporting DL)

Questions?

The core criteria (18) (1000-1045)

Criteria 01 to 06
1. Adoption phase
2. VLE stage (includes “productive” use)
3. Tools use
4. Usability
5. Accessibility
6. E-learning strategy

06 E-learning strategy
Depth: top level and Schools
Breadth: School level, all Schools
Frequency: how often updated
Integration up: do the top-level and School strategies talk to each other?
Integration out: does it link to other strategies?
– IT
– Learning and Teaching
– Physical space planning

Criteria 07 to 12
7. Decision-making
8. Instructional design and pedagogy
9. Learning material
10. Training
11. Academic workload planning in e-learning
12. Costs management

Criteria 13 to 18
13. Planning (annual)
14. Evaluation (and revalidation)
15. Organisation
16. Technical support
17. Quality and Excellence
18. Staff reward and recognition

Supplementary and local criteria (1045-1130)

Supplementary criteria
Based on further work and feedback from “my” HEIs, new criteria are under consideration for a “supplementary” set – and some may be added into the core criteria in Pick & Mix release 2.0

Suggested supplementary criteria #1
58: Market research
59: Competitor research
55: Foresight
68: Research outputs
99: Convergence of on- and off-campus support levels

Student supplementary criteria
(These have been composited)
93: Student understanding of the e-learning system
94: Student satisfaction with the e-learning system
63: Level of exploitation of student IT and information skills

Local criteria from the EoI, refined
I believe that they are all covered in the supplementary set

Issues with local criteria
Some are variants of benchmarks
Some are “drill-downs”
Some are taxonomic
BUT this doesn’t matter as long as benchmarking remains the prime focus

Local criteria – updates (to discuss)

Scoring, evidence and documentation (1130-1200)

Scoring
The 5-point scale is the norm; point 6 is exceptional
Focus not only on best practice but also on bad practice and real practice
Do not try to ensure that your HEI (or every HEI) scores 5

Evidence and documentation
Have a file for each criterion
Some narrative will be needed, but the aim is NOT to recreate the QAA-style SED or DATs
We are hoping to group the criteria into about 4 or 5 clumps (but this is not easy)

Thank you for listening
Any questions?
Professor Paul Bacsich
Email for this work: pbacsich@runbox.com