E-Learning Maturity Model
Mike Barker

Reference
Marshall, S. & Mitchell, G. (2004). Applying SPICE to e-learning: An e-learning maturity model? Sixth Australasian Computing Education Conference (ACE2004). Conferences in Research and Practice in Information Technology, 30.

Summary
- Proposes an e-learning process improvement model based on benchmarks proposed by the Institute for Higher Education Policy (2000).
- Tests the model by applying it to an e-learning module at a New Zealand university.
- Focuses more on improvement of existing e-learning than on initial implementation.
- The set of processes can be considered as guidelines for best practices.

Process Categories
- Learning: pedagogical aspects
- Development: creation and maintenance of e-learning resources
- Coordination: oversight and management
- Evaluation: evaluation and quality control throughout the lifecycle
- Organization: institutional planning and management

Levels
- 0: Not Performed: not done at all
- 1: Initial: ad hoc processes
- 2: Planned: clear, measurable objectives for projects
- 3: Defined: defined process for development and support
- 4: Managed: ensures quality of resources and student learning outcomes
- 5: Optimizing: continual improvement of all aspects
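
The paper describes these categories and levels only in prose. As a rough aid to discussion, here is a minimal sketch of how they could be recorded as data, with a small, entirely hypothetical set of assessed processes; the class names, ratings, and summary logic are assumptions for illustration, not part of the model itself.

```python
from dataclasses import dataclass
from enum import Enum, IntEnum


class Category(Enum):
    """The five eMM process categories from the slide above."""
    LEARNING = "Learning"
    DEVELOPMENT = "Development"
    COORDINATION = "Coordination"
    EVALUATION = "Evaluation"
    ORGANIZATION = "Organization"


class Level(IntEnum):
    """The six capability levels, 0 (Not Performed) to 5 (Optimizing)."""
    NOT_PERFORMED = 0
    INITIAL = 1
    PLANNED = 2
    DEFINED = 3
    MANAGED = 4
    OPTIMIZING = 5


@dataclass
class ProcessAssessment:
    """One assessed process: its category, a short description, and the level judged."""
    category: Category
    description: str
    level: Level


# Hypothetical ratings for a single e-learning module (not taken from the paper).
assessments = [
    ProcessAssessment(Category.LEARNING,
                      "Courses require analysis, synthesis, and evaluation", Level.PLANNED),
    ProcessAssessment(Category.DEVELOPMENT,
                      "Technical assistance available to faculty", Level.DEFINED),
    ProcessAssessment(Category.EVALUATION,
                      "Learning outcomes reviewed periodically", Level.INITIAL),
]

# Report the weakest process in each assessed category, one simple way to
# highlight where improvement effort might go first.
for cat in Category:
    levels = [a.level for a in assessments if a.category is cat]
    if levels:
        worst = min(levels)
        print(f"{cat.value}: weakest process at level {worst.value} ({worst.name})")
```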

Learning Examples
- Courses are designed to require students to engage in analysis, synthesis, and evaluation.
- Student interaction with faculty and other students is an essential characteristic.
- Learning outcomes for each course are summarized in a clearly written, straightforward statement.

Development Examples
- Reliability of the technology delivery system is as failsafe as possible.
- Learning outcomes, not availability of technology, determine the technology used.
- Technical assistance in course development is available to faculty.

Coordination Examples
- A centralized system provides support for building and maintaining the e-learning infrastructure.
- Students are able to practice with any technologies prior to commencing a course.
- Questions directed to student service personnel are answered accurately and quickly.

Evaluation Examples
- The programme's educational effectiveness is formatively and summatively assessed with multiple, standards-based, and independent evaluations.
- Success of the technology/innovation is used as a measure of effectiveness within courses/programmes.
- Intended learning outcomes are reviewed periodically to ensure clarity, utility, and appropriateness.

Organization Examples
- A documented technology plan is in place and operational to ensure quality of delivery standards.
- Students are provided with supplemental course information that outlines course objectives, concepts, and ideas.
- Students are provided with supplemental course information that outlines student support services.

Some Questions
- How do you measure these points? On a 5-point scale? Objectively?
- Do these cover the process well? Are there other areas? Do any of these need to be broken up?
- How does this apply to initial implementation?
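
One possible, purely illustrative answer to the measurement question: rate each practice on a small ordinal scale and count a capability level as achieved only when it, and every level below it, meets a chosen threshold, in the spirit of SPICE-style staged assessment. The threshold, ratings, and helper function below are assumptions for the sake of the example, not something the paper prescribes.

```python
THRESHOLD = 3  # ratings of 3 or 4 count as "adequately performed" (assumed cut-off)

# Hypothetical practice ratings (0-4) for one process, grouped by capability level.
ratings_by_level = {
    1: [4, 3],      # Initial-level practices
    2: [3, 3, 4],   # Planned-level practices
    3: [2, 4],      # Defined-level practices (one falls below the threshold)
    4: [1],         # Managed-level practices
    5: [0],         # Optimizing-level practices
}


def achieved_level(ratings_by_level, threshold=THRESHOLD):
    """Highest level such that it and all lower levels meet the threshold."""
    achieved = 0
    for level in sorted(ratings_by_level):
        if all(r >= threshold for r in ratings_by_level[level]):
            achieved = level
        else:
            break
    return achieved


print("Capability level achieved:", achieved_level(ratings_by_level))  # -> 2
```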