Benchmarking e-learning: Putting the Pick & Mix approach into action
Professor Paul Bacsich, Benchmarking Pilot Workshop 1, 17 March 2006, Leicester

Presentation transcript:

Slide 1: Benchmarking e-learning: Putting the Pick & Mix approach into action. Professor Paul Bacsich, Consultant to the Higher Education Academy (formerly at the OU, SHU and UKeU).

Slide 2: Timetable for the morning. This is the first half of a two-part workshop; its focus is to decide on the criteria.

Slide 3: Timetable
- Overview
- The core criteria
- Supplementary and local criteria
- Collecting the evidence

Slide 4: Benchmarking and Pick & Mix: overview (quick)

Slide 5: Benchmarking e-learning
- Outlined in the HEFCE e-learning strategy
- The Higher Education Academy is overseeing it
- Pilot phase: 12 HEIs chosen, of which Leicester is one (along with Chester and Staffordshire)
  - In three groups of 4 each (the above 3 plus Manchester)
  - Each with an "uncle" (in our case, Paul)
- Justifies entry to the Pathfinder programme
- Phase 1 in autumn: 42 more HEIs
- Phase 2 later, for the rest of the HEIs (Russell…)

Slide 6: Pick & Mix features
- Focussed purely on e-learning, but could be extended more widely (e.g. L&T, IT, QA)
- Draws on several sources and methodologies: pragmatic
- Not linked to any particular style of e-learning (distance, on-campus, blended)
- Orientation to outputs and process, but could add more on inputs

Slide 7: Pick & Mix history
- Initial version developed in early 2005, in response to a request from Manchester Business School for a competitor study
- Refined by literature search, discussion, feedback, presentations and workshops
- Being used in 3 of the 4 institutions in Paul's mini-club for the HEA benchmarking pilot: Chester, Leicester, Staffordshire

Slide 8: Pick & Mix: criteria and metrics

Slide 9: Pick & Mix: 18 core criteria
- Composited some previous criteria together, but sometimes split others, e.g. for legal reasons
- Removed any not specific to e-learning (including QAA territory)
- Tended to leave out ones which were not critical success factors
- Left out of the core some criteria where there was not (yet) UK consensus
- Leicester wishes to add specific ones to monitor its objectives and KPIs; this is encouraged

Slide 10: Pick & Mix metrics
- Use a 6-point scale (1-6): 5 normal points plus 1 more for "excellence"
- Backed up by continuous metrics where possible
- Also contextualised by narratives of "better practice"
- Many criteria are really groups of sub-criteria: "drill-down", additive scoring, … (see the sketch below)
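The deck does not prescribe an implementation, but a minimal sketch may make the "drill-down" idea concrete. Below, a hypothetical Criterion class averages sub-criterion scores onto the 1-6 scale. The class, the banding rule (round the mean and clamp) and the example sub-scores are illustrative assumptions, not part of Pick & Mix; only the 1-6 scale and the sub-criterion names for criterion 06 (taken from slide 19) come from the slides.

```python
from dataclasses import dataclass


@dataclass
class Criterion:
    """A Pick & Mix criterion scored on the 1-6 scale
    (5 normal points plus a 6th for "excellence")."""
    number: int
    name: str
    sub_scores: dict[str, int]  # sub-criterion -> score 1-6

    def score(self) -> int:
        """Hypothetical additive 'drill-down' scoring: average the
        sub-criterion scores, then clamp to the 1-6 scale."""
        mean = sum(self.sub_scores.values()) / len(self.sub_scores)
        return max(1, min(6, round(mean)))


# Criterion 06 (e-learning strategy) with invented sub-scores;
# the dimension names come from slide 19.
strategy = Criterion(6, "E-Learning Strategy", {
    "depth": 4, "breadth": 3, "frequency": 3,
    "integration up": 2, "integration out": 3,
})
print(strategy.score())  # 3
```

In practice the banding rules would be agreed per criterion; the point is only that a "drill-down" criterion aggregates several sub-scores into one 1-6 point.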

Slide 11: Pick & Mix: two sample criteria

Slide 12: Criterion 01: Adoption phase (Rogers)
1. Innovators only
2. Early adopters taking it up
3. Early adopters adopted; early majority taking it up
4. Early majority adopted; late majority taking it up
5. All taken up except laggards, who are now taking it up (or retiring or leaving)
6. First wave embedded, second wave under way (e.g. m-learning after e-learning)
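Read as data, this criterion is simply a lookup from score to narrative descriptor. A minimal sketch in the same illustrative vein (the descriptors are transcribed from the slide above; the constant and function names are my own):

```python
# Criterion 01 "Adoption phase (Rogers)": score -> descriptor,
# transcribed from slide 12.
ADOPTION_LEVELS = {
    1: "Innovators only",
    2: "Early adopters taking it up",
    3: "Early adopters adopted; early majority taking it up",
    4: "Early majority adopted; late majority taking it up",
    5: "All taken up except laggards, who are now taking it up",
    6: "First wave embedded, second wave under way",
}


def describe_adoption(score: int) -> str:
    """Return the narrative anchor for a 1-6 adoption score."""
    return ADOPTION_LEVELS[score]


print(describe_adoption(3))  # "Early adopters adopted; ..."
```

Criterion 10 (Training, slide 14 below) follows the same score-to-descriptor pattern.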

Slide 13: Criterion 01: Adoption is multiple
Note that there is a range of hills, receding into the distance…
- Word
- VLE
- Streaming
- M-learning
- Podcasting

Slide 14: Criterion 10: Training
1. No systematic training for e-learning
2. Some systematic training, e.g. in some projects and departments
3. University-wide training programme, but little monitoring of attendance or encouragement to go
4. University-wide training programme, monitored and incentivised
5. All staff trained in VLE use, training appropriate to job type, and retrained when needed
6. Staff increasingly keep themselves up to date in a "just in time, just for me" fashion, except in situations of discontinuous change

Slide 15: Training is multiple…
- Different departments may organise it
- Different rates of adoption for different staff:
  - IT support
  - Learning technologists
  - Academics
  - Admin staff (e.g. supporting DL)

Slide 16: Questions?

Slide 17: The core criteria (18)

Slide 18: Criteria 01 to 06
1. Adoption phase
2. VLE stage (includes "productive" use)
3. Tools use
4. Usability
5. Accessibility
6. E-Learning Strategy

Slide 19: Criterion 06: e-learning strategy
- Depth: top level and Schools
- Breadth: School level, all Schools
- Frequency: how often updated
- Integration up: do the levels talk to each other?
- Integration out: does it link to other strategies?
  - IT
  - Learning and Teaching
  - Physical space planning

Slide 20: Criteria 07 to 12
7. Decision-making
8. Instructional design and pedagogy
9. Learning material
10. Training
11. Academic workload planning in e-learning
12. Costs management

Slide 21: Criteria 13 to 18
13. Planning (annual)
14. Evaluation (and revalidation)
15. Organisation
16. Technical support
17. Quality and Excellence
18. Staff reward and recognition

Slide 22: Supplementary and local criteria

Slide 23: Supplementary criteria. Based on further work and feedback from "my" HEIs, new criteria are under consideration for a "supplementary" set, and some may be added into the core criteria in Pick & Mix release 2.0.

Slide 24: Suggested supplementary criteria #1
- 58: Market research
- 59: Competitor research
- 55: Foresight
- 68: Research outputs
- 99: Convergence of on- and off-campus support levels

Slide 25: Student supplementary criteria (these have been composited)
- 93: Student understanding of the e-learning system
- 94: Student satisfaction with the e-learning system
- 63: Level of exploitation of student IT and information skills

Slide 26: Local criteria from the EoI, refined. I believe that they are all covered in the supplementary set.

Slide 27: Issues with local criteria
- Some are variants of benchmarks
- Some are "drill-downs"
- Some are taxonomic
BUT this doesn't matter, as long as benchmarking remains the prime focus.

Slide 28: Local criteria: updates (to discuss)

Slide 29: Scoring, evidence and documentation

Slide 30: Scoring
- The 5-point scale is the norm; point 6 is exceptional
- Focus not only on best practice but also on bad practice and real practice
- Do not try to ensure that your HEI, and all HEIs, score 5

Slide 31: Evidence and documentation
- Have a file for each criterion
- Some narrative will be needed, but the aim is NOT to recreate the QAA-style SED or DATs
- We are hoping to group the criteria into about 4 or 5 clumps (but this is not easy)

Slide 32: Thank you for listening. Any questions? Professor Paul Bacsich