Benchmarking e-learning: The Pick & Mix approach
Professor Paul Bacsich, Matic Media Ltd.
Benchmarking Pilot Launch Meeting, 20 January 2006, London

Presentation transcript:

Slide 1: Benchmarking e-learning: The Pick & Mix approach
Professor Paul Bacsich, Matic Media Ltd
Benchmarking Pilot Launch Meeting, 20 January 2006, London

Slide 2: Pick & Mix – overview and history

Slide 3: Pick & Mix overview
– Focussed purely on e-learning, but could be extended more widely
– Draws on several sources and methodologies
– Not linked to any particular style of e-learning (e.g. distance, on-campus or blended)
– Oriented to institutions past the "a few projects" stage
– Suitable for desk research as well as "in-depth" study
– Suitable for single- and multi-institution studies

Slide 4: Pick & Mix history
– Initial version developed in early 2005, in response to a request from Manchester Business School for a competitor study
– Since deepened by literature search, discussion, feedback, presentations (UK, Brussels, Berlin, Sydney, Melbourne) and workshops (ALT-C and Online Educa)

Slide 5: Pick & Mix – criteria and metrics

Slide 6: How many criteria?
– Benchmarking (BM) is like Activity-Based Costing (ABC): how many activities?
– Vague answer: not 5, not 500
– Better answer: well under 100

Slide 7: Pick & Mix: 18 core criteria
– Composited some criteria together
– Removed any not specific to e-learning
– Was careful about any which are not provably critical success factors
– Left some criteria out of the core where there was not (yet) UK consensus
– Institutions will wish to add specific criteria to monitor their own objectives and KPIs; this is allowed – the system is extensible

Slide 8: Pick & Mix metrics
– Use a 6-point scale (1-6)
  – 5 levels from Likert, plus 1 more for "excellence"
– Backed up by continuous metrics where possible
– Also contextualised by narrative
– Some criteria are really "criteria bundles"
– There are always issues in judging progress, especially "best practice"; judging "better practice" is easier
  – e.g. VLE convergence
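As a purely illustrative sketch (not part of the original slides; the class and field names below are invented), a score against one criterion on this scheme might be captured as the 1-6 level plus the supporting continuous metric and contextual narrative:

```python
# Hypothetical sketch of recording one Pick & Mix criterion score:
# the 1-6 level, an optional continuous metric backing it up, and a narrative.
from dataclasses import dataclass
from typing import Optional

@dataclass
class CriterionScore:
    criterion: str                   # e.g. "Training" (one of the core criteria)
    level: int                       # 1-6: Likert-style 1-5 plus 6 for "excellence"
    metric: Optional[float] = None   # continuous metric, where available
    narrative: str = ""              # contextual commentary explaining the judgement

    def __post_init__(self) -> None:
        if not 1 <= self.level <= 6:
            raise ValueError("Pick & Mix levels run from 1 to 6")

# Illustrative (invented) example
score = CriterionScore(
    criterion="Training",
    level=4,
    metric=0.82,  # e.g. proportion of staff who have attended VLE training
    narrative="University-wide programme, attendance monitored and incentivised.",
)
```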

Slide 9: Pick & Mix system: summary
– Based on a survey of "best of breed" ideas
– 6-point scale (Likert + excellence)
– Backed up by narrative and metrics
– 18 core criteria (e-learning specific)
– Can easily add more in the same vein for local needs
– Output- and student-oriented aspects covered
– Focussed on critical success factors
– Methodology-agnostic, but uses underlying methodologies where useful
– Requires no long training course to understand
  – But users must know, and be undogmatic about, e-learning

Slide 10: Other aspects of Pick & Mix
– Core criteria are more output- and process-focussed than input-focussed; could add input criteria
– Explicit (otherwise danger of "lying not trying")
– Independent or collaborative
– Internal (done by you) or external (done to you)
– Horizontal
  – Allows a focus on processes across the whole institution – but can look at individual projects, missions and departments to get a "range of scores"

Slide 11: Pick & Mix – three sample criteria

Slide 12: "Adoption phase" (Rogers)
1. Innovators only
2. Early adopters taking it up
3. Early adopters adopted; early majority taking it up
4. Early majority adopted; late majority taking it up
5. All taken up except laggards, who are now taking it up (or retiring or leaving)
6. First wave embedded, second wave under way (e.g. m-learning after e-learning)
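These level descriptors lend themselves to a simple rubric lookup. As a hedged illustration (the dictionary and function below are invented for this sketch, not part of Pick & Mix itself), a self-assessment tool might encode them as:

```python
# Hypothetical sketch: encoding the "Adoption phase" (Rogers) descriptors so a
# reviewer can see what awarding each 1-6 level would mean.
ADOPTION_PHASE = {
    1: "Innovators only",
    2: "Early adopters taking it up",
    3: "Early adopters adopted; early majority taking it up",
    4: "Early majority adopted; late majority taking it up",
    5: "All taken up except laggards, who are now taking it up (or retiring or leaving)",
    6: "First wave embedded, second wave under way (e.g. m-learning after e-learning)",
}

def describe_level(level: int) -> str:
    """Return the descriptor a reviewer agrees to when awarding a level."""
    return ADOPTION_PHASE[level]

print(describe_level(3))
```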

Slide 13: "Training"
1. No systematic training for e-learning
2. Some systematic training, e.g. in some projects and departments
3. University-wide training programme, but little monitoring of attendance or encouragement to attend
4. University-wide training programme, monitored and incentivised
5. All staff trained in VLE use, with training appropriate to job type, and retrained when needed
6. Staff increasingly keep themselves up to date in a "just in time, just for me" fashion, except in situations of discontinuous change

Slide 14: "Accessibility"
1. e-learning material and services are not accessible
2. Much e-learning material and most services conform to minimum standards of accessibility
3. Almost all e-learning material and services conform to minimum standards of accessibility
4. All e-learning material and services conform to at least minimum standards of accessibility, and much of it to higher standards
5. e-learning material and services are accessible, with key components validated by external agencies
6. Strong evidence of conformance with the letter and spirit of accessibility requirements in all countries where students study
(Too aspirational, too international, too regulated?)

Slide 15: Thank you for listening. Any questions?
Professor Paul Bacsich