Benchmarking Pilot Launch Meeting, 20 January 2006, London

Benchmarking e-learning: The Pick & Mix approach
Professor Paul Bacsich, Matic Media Ltd
Pick & Mix: Overview and history
Pick & Mix overview
- Focussed purely on e-learning, but could be extended more widely
- Draws on several sources and methodologies
- Not linked to any particular style of e-learning (e.g. distance, on-campus or blended)
- Oriented to institutions past the "a few projects" stage
- Suitable for desk research as well as "in-depth" study
- Suitable for single- and multi-institution studies
Pick & Mix history
- Initial version developed in early 2005, in response to a request from Manchester Business School for a competitor study
- Since deepened by literature search, discussion, feedback, presentations (UK, Brussels, Berlin, Sydney, Melbourne) and workshops (ALT-C and Online Educa)
Pick & Mix: Criteria and metrics
How many criteria?
- Benchmarking is like activity-based costing: how many activities?
- Vague answer: not 5, not 500
- Better answer: well under 100
Pick & Mix: 18 core criteria
- Combined some criteria together
- Removed any not specific to e-learning
- Was careful about any that are not provably critical success factors
- Left out of the core some criteria where there was not (yet) UK consensus
- Institutions will wish to add specific criteria to monitor their own objectives and KPIs; this is allowed – the system is extensible
Pick & Mix metrics
- Uses a 6-point scale (1-6): 5 points from Likert plus 1 more for "excellence"
- Backed up by continuous metrics where possible
- Also contextualised by narrative
- Some criteria are really "criteria bundles"
- There are always issues in judging progress, especially "best practice"; judging "better practice" is easier – e.g. VLE convergence
Pick and Mix system: summary
- Based on a survey of "best of breed" ideas
- 6-point scale (Likert + excellence)
- Backed up by narrative and metrics
- 18 core criteria (e-learning specific); can easily add more in the same vein for local needs
- Output- and student-oriented aspects covered
- Focussed on critical success factors
- Methodology-agnostic, but uses underlying methodologies where useful
- Requires no long training course to understand – but users must know, and be undogmatic about, e-learning
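As an illustration only (not part of the original deck), the scoring model just summarised – criteria rated on a 1-6 scale, contextualised by narrative and backed by continuous metrics – could be represented roughly as follows; all names here are hypothetical assumptions, not part of the Pick & Mix methodology itself:

```python
from dataclasses import dataclass, field

@dataclass
class CriterionScore:
    """Hypothetical record for one Pick & Mix criterion score."""
    criterion: str           # e.g. "Adoption phase", "Training"
    score: int               # 6-point scale: 1 (lowest) to 6 ("excellence")
    narrative: str = ""      # contextualising narrative
    metrics: dict = field(default_factory=dict)  # continuous backing metrics

    def __post_init__(self):
        if not 1 <= self.score <= 6:
            raise ValueError("score must be on the 1-6 scale")

# An institution's profile is simply a list of such scores;
# locally added criteria sit alongside the 18 core ones.
profile = [
    CriterionScore("Adoption phase", 3, "Early majority taking it up"),
    CriterionScore("Training", 4, metrics={"staff_trained_pct": 62.0}),
]
```

The point of the sketch is that each score is never a bare number: it carries its narrative and metrics with it, matching the slide's insistence on contextualisation.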
Other aspects of Pick & Mix
- Core criteria are more output- and process-focussed than input-focussed; input criteria could be added
- Explicit (otherwise there is a danger of lying, not trying)
- Independent or collaborative
- Internal (done by you) or external (done to you)
- Horizontal: allows a focus on processes across the whole institution – but can look at individual projects, missions and departments to get a "range of scores"
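The "range of scores" idea – scoring departments separately and reporting the institutional spread on a criterion – can be sketched like this (a hypothetical illustration; the department names and function are assumptions):

```python
# Hypothetical per-department scores for one criterion, on the 1-6 scale.
dept_scores = {
    "Business School": 4,
    "Engineering": 2,
    "Humanities": 3,
}

def score_range(scores: dict) -> tuple:
    """Return the (min, max) spread of departmental scores."""
    values = scores.values()
    return (min(values), max(values))

# score_range(dept_scores) → (2, 4): the institution spans
# levels 2 to 4 on this criterion rather than a single score.
```

Reporting the spread, not just an institutional average, is what makes the "horizontal" view useful: a single number would hide the gap between leading and lagging departments.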
Pick & Mix: Three sample criteria
"Adoption phase" (Rogers)
1. Innovators only
2. Early adopters taking it up
3. Early adopters adopted; early majority taking it up
4. Early majority adopted; late majority taking it up
5. All taken up except laggards, who are now taking it up (or retiring or leaving)
6. First wave embedded, second wave under way (e.g. m-learning after e-learning)
"Training"
1. No systematic training for e-learning
2. Some systematic training, e.g. in some projects and departments
3. University-wide training programme, but little monitoring of attendance or encouragement to attend
4. University-wide training programme, monitored and incentivised
5. All staff trained in VLE use, with training appropriate to job type – and retrained when needed
6. Staff increasingly keep themselves up to date in a "just in time, just for me" fashion, except in situations of discontinuous change
"Accessibility"
1. e-learning material and services are not accessible
2. Much e-learning material and most services conform to minimum standards of accessibility
3. Almost all e-learning material and services conform to minimum standards of accessibility
4. All e-learning material and services conform to at least minimum standards of accessibility, much of it to higher standards
5. e-learning material and services are accessible, with key components validated by external agencies
6. Strong evidence of conformance with the letter and spirit of accessibility requirements in all countries where students study

Too aspirational, too international, too regulated?
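Each sample criterion above is a ladder of level descriptors keyed by score. As a hedged sketch (the dictionary structure and function are assumptions for illustration; the descriptor wording is abridged from the "Adoption phase" slide):

```python
# Level descriptors for the "Adoption phase" (Rogers) criterion,
# abridged from the slide; the encoding itself is hypothetical.
ADOPTION_PHASE = {
    1: "Innovators only",
    2: "Early adopters taking it up",
    3: "Early adopters adopted; early majority taking it up",
    4: "Early majority adopted; late majority taking it up",
    5: "All taken up except laggards, who are now taking it up",
    6: "First wave embedded, second wave under way",
}

def describe(levels: dict, score: int) -> str:
    """Map a 1-6 score to its narrative descriptor for one criterion."""
    if score not in levels:
        raise KeyError("score must be on the 1-6 scale")
    return levels[score]
```

Encoding each criterion this way keeps the score and its meaning together, so a benchmarking report can print the descriptor rather than a bare number.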
Thank you for listening
Any questions?
Professor Paul Bacsich