FREMA: e-Learning Framework Reference Model for Assessment (www.frema.ecs.soton.ac.uk) Hugh Davis, Yvonne Howard, David Millard, Gary Wills

FREMA : e-Learning Framework Reference Model for Assessment Hugh Davis, Yvonne Howard, David Millard, Gary Wills Learning Technologies University of Southampton, UK

Project Expertise
– Hugh Davis: University Director of Education, University of Southampton
– Niall Sclater: VLE Programme Director, OU
– Robert Sherratt: Head of Development for e-Services Integration, University of Hull
– Steve Jeyes: e-Learning Manager, Edexcel
– Rowin Young: CETIS Assessment SIG co-ordinator
– Lester Gilbert: Lecturer, General Manager (Learning Design)
– Gary Wills: Examinations Officer
– David Millard: Lecturer, Technical Manager (Knowledge Representation)
– Yvonne Howard: Project Manager, Software Engineer
– Iain Tulloch: Domain Researcher
– Pasha Jam: Researcher
– Swapna Chennupati: Researcher
– Joe Price: Graphic Design
– Christopher Bailey: Developer

Background
What is FREMA?
– A JISC-funded project between Southampton, Strathclyde and Hull
– Part of the e-Framework effort
– Aims to produce a Reference Model of the e-Learning Assessment Domain
– To aid interoperability and the creation of Assessment Services for the e-Framework
What is a Reference Model for Assessment?
– Assessment is a broad and complex domain
– It is not enough to describe and define a set of services
– A proper audit trail of decision making is needed
– Start by defining the domain
– Work up through use cases to services (and example implementations)

Anatomy of FREMA Reference Model
Domain Definition
– Overview of the domain, and how projects and standards fit within it
Identifying Common Usage Patterns
– Scoping the FREMA project
Developing Use Cases
– Formal descriptions of usage patterns
Gap Analysis
– Mapping of Use Cases to the Services in the ELF
Service Profiles
– Formal descriptions of those services
Example Implementation
– Of key/core services: examples, validation, resources

Domain Definition – Engaging the Community
An evolving, cross-referenced, searchable web site of resources in the assessment domain
Gathered and validated through engagement with the community: a number of workshops, interviews and questionnaires
Institutions: York, Kingston, Loughborough, Strathclyde, SQA, Edexcel, Heriot-Watt, Glasgow Caledonian, Warwick, University of Technology Sydney
Conferences:
– 9th CAA 2005, Loughborough
– E-Learn 2005
– Online Educa Berlin joint workshop
CETIS Assessment SIG:
– Southampton, April 2005
– Glasgow, July 2005
– Edinburgh, November 2005
– Glasgow, March 2006

Interconnected Website
A schema of resources and the relationships between them (an ontology)
Used to generate dynamic pages for each resource
Searching/navigating/analysis:
– User-defined smart search
– Dynamic gap analysis
– Concept maps
– Secondary data (such as how much information we hold on a resource)
(a minimal sketch of such a resource network is given below)
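
To make the resource schema concrete, here is an editorial code sketch in Java. It is not FREMA's actual data model: the class names, relationship predicates and example resources are assumptions, showing only how typed resources and typed relationships can drive a simple "smart search" of the kind described above.

```java
import java.util.ArrayList;
import java.util.List;

// Editorial sketch only: names are assumptions, not FREMA's actual schema.
public class ResourceNetwork {

    // A typed resource in the domain (e.g. a project, standard, or organisation).
    record Resource(String name, String type) {}

    // A typed, directed relationship between two resources.
    record Relationship(Resource from, String predicate, Resource to) {}

    private final List<Relationship> relationships = new ArrayList<>();

    void relate(Resource from, String predicate, Resource to) {
        relationships.add(new Relationship(from, predicate, to));
    }

    // A "user-defined smart search": resources reachable from a starting
    // resource via one relationship type.
    List<Resource> related(Resource start, String predicate) {
        return relationships.stream()
                .filter(r -> r.from().equals(start) && r.predicate().equals(predicate))
                .map(Relationship::to)
                .toList();
    }

    public static void main(String[] args) {
        var net = new ResourceNetwork();
        var frema = new Resource("FREMA", "Project");
        var qti = new Resource("IMS QTI", "Standard");
        net.relate(frema, "uses-standard", qti);
        System.out.println(net.related(frema, "uses-standard"));
    }
}
```

In FREMA itself the network was held in a database and rendered as dynamic web pages and concept maps; the sketch shows only the underlying shape.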

Validating the Domain Definition
We undertook a qualitative evaluation with domain experts to check that our concept map navigation tools were appropriate
We also analysed the contents of peer-reviewed published papers on assessment in the e-learning domain
– To check the coverage/validity of concepts
– For this evaluation, the forty-five papers from the 9th International Conference on Computer Assisted Assessment were analysed
Database contents were reflected back to the original experts
Other people are interested in using the FREMA database and concept map technology:
– The Loughborough CAA conference, perhaps using something similar to communicate with the assessment community
– The REAP project
– The LADiE extension bid

Use Cases
FREMA has drawn a number of scenarios from its Domain Definition and has modelled them as use cases
Scenarios are modelled as UML Use Case diagrams, with a number of actors taking part in high-level Use Cases
Use Cases are high level and implementation independent
They are structured, but relatively informal (one possible structured shape is sketched below)
Scenarios:
– Summative, Peer and Artefact Assessment
– Item Banking
– External Examiner
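
To illustrate "structured, but relatively informal", here is a hypothetical Java sketch of a use case as plain data. The field names (actors, steps) and the example steps are illustrative assumptions, not the project's published schema.

```java
import java.util.List;

// Hypothetical shape of a FREMA-style use case: named actors taking part in
// high-level, implementation-independent steps. Names are illustrative only.
public class UseCases {

    record UseCase(String name, List<String> actors, List<String> steps) {}

    public static void main(String[] args) {
        var peerAssessment = new UseCase(
                "Peer Assessment",
                List.of("Tutor", "Candidate", "Peer Reviewer"),
                List.of("Author assessment",
                        "Take assessment",
                        "Review peer submissions",
                        "Moderate and release feedback"));
        System.out.println(peerAssessment);
    }
}
```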

Use Cases – Provenance
The use cases were developed by eliciting practice from a number of members of the e-assessment community:
– CETIS (IBIS, Accessibility, Item Banking)
– Qualification agencies: SQA, Edexcel
– Universities: University of Hull, Spark at the University of Technology Sydney, Loughborough, University of Southampton, Kingston University

Peer Assessment

Summative Assessment

Analysis Tools: Gap Analysis

Validating the Use Cases
Peer review by reflection with the original domain experts, for:
– Coverage, accuracy and completeness
We undertook a qualitative evaluation with domain experts at the CETIS Assessment SIG in Glasgow to validate the use cases and their representation in the FREMA model, for:
– Appropriate granularity
– Understandability
– Clarity
– Choice of representation
– Description quality and level of detail

Service Profiles
Service Profiles are high-level descriptions of an atomic service
A profile defines the granularity of a service
It does not define a formal interface
– But it outlines the responsibilities of the service
It does not define how the services interact
– But it does describe potential collaborations that would help the service fulfil its responsibilities

Service Responsibilities and Collaboration Cards
SRC: Take Assessment
Collaborations:
– Authenticate
– Schedule
– Track
Responsibilities:
– Manage Assessment Session
– Choose Assessment
– Open Assessment
– Display Rubric
– Confirm Candidate has read Rubric
– Confirm Candidate has finished assessment
– Enforce assessment constraints (time, etc.)
– Close Assessment
– Present Items according to assessment instructions
– Render Item
– Render Hints
– Record Response
– Process Response
– Enforce item constraints (number of retries, etc.)
– Record usage data
(this card is rendered as data in the sketch below)
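
The card above can be read as plain data. The following Java sketch (names assumed, responsibilities condensed from the card) shows one way to hold an SRC card; it is an editorial illustration, not a FREMA artefact.

```java
import java.util.List;

// Sketch of an SRC card as data: the service's responsibilities plus the
// collaborating services it relies on, with no formal interface implied.
public class SrcCard {

    record Src(String service, List<String> collaborations, List<String> responsibilities) {}

    static final Src TAKE_ASSESSMENT = new Src(
            "Take Assessment",
            List.of("Authenticate", "Schedule", "Track"),
            List.of("Manage assessment session",
                    "Present items according to assessment instructions",
                    "Record and process responses",
                    "Enforce assessment and item constraints",
                    "Record usage data"));

    public static void main(String[] args) {
        System.out.println(TAKE_ASSESSMENT);
    }
}
```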

Factoring a Service (1)
SRC: Take Assessment
Collaborations: (none yet)
Responsibilities:
– Authenticate Users
– Choose Assessment
– Open Assessment
– Display Rubric
– Confirm Candidate has read Rubric
– Present Items according to assessment instructions
– Render Item
– Render Hints
– Record Response
– Process Response
– Confirm Candidate has finished assessment
– Enforce assessment constraints (time limits, etc.)
– Record usage data
– Enforce item constraints (number of retries, etc.)
– Close Assessment

Factoring a Service (2)
SRC: Take Assessment
Collaborations:
– Authenticate
– Schedule
– Track
Responsibilities:
– Authenticate Users
– Choose Assessment
– Open Assessment
– Display Rubric
– Confirm Candidate has read Rubric
– Present Items according to assessment instructions
– Render Item
– Render Hints
– Record Response
– Process Response
– Confirm Candidate has finished assessment
– Enforce assessment constraints (time limits, etc.)
– Record usage data
– Enforce item constraints (number of retries, etc.)
– Close Assessment
– Track Progress

Factoring a Service (3)
SRC: Take Assessment
Collaborations:
– Authenticate
– Schedule
– Track
Responsibilities:
– Choose Assessment
– Open Assessment
– Display Rubric
– Confirm Candidate has read Rubric
– Present Items according to assessment instructions
– Render Item
– Render Hints
– Record Response
– Process Response
– Confirm Candidate has finished assessment
– Enforce assessment constraints (time limits, etc.)
– Record usage data
– Enforce item constraints (number of retries, etc.)
– Close Assessment

Factoring a Service (4)
SRC: Take Assessment
Collaborations:
– Authenticate
– Schedule
– Track
Responsibilities:
– Manage Assessment Session
– Choose Assessment
– Open Assessment
– Display Rubric
– Confirm Candidate has read Rubric
– Confirm Candidate has finished assessment
– Enforce assessment constraints (time, etc.)
– Close Assessment
– Present Items according to assessment instructions
– Render Item
– Render Hints
– Record Response
– Process Response
– Enforce item constraints (number of retries, etc.)
– Record usage data
(a code sketch of the factored service follows)
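
As a hedged illustration of the end state of this factoring, the following Java sketch expresses the collaborations as interfaces that Take Assessment delegates to. The interfaces and method signatures are assumptions invented for illustration; FREMA deliberately stops short of defining formal interfaces at this level.

```java
// Illustrative only: interfaces and signatures are assumptions.
interface Authenticate { boolean verify(String candidateId, String credential); }
interface Schedule { boolean isOpen(String assessmentId); }
interface Track { void recordProgress(String candidateId, String itemId); }

class TakeAssessmentService {
    private final Authenticate auth;
    private final Schedule schedule;
    private final Track track;

    TakeAssessmentService(Authenticate auth, Schedule schedule, Track track) {
        this.auth = auth;
        this.schedule = schedule;
        this.track = track;
    }

    void takeAssessment(String candidateId, String credential, String assessmentId) {
        // Responsibilities factored out of Take Assessment are now collaborations.
        if (!auth.verify(candidateId, credential)) throw new IllegalStateException("not authenticated");
        if (!schedule.isOpen(assessmentId)) throw new IllegalStateException("assessment not open");

        // Core responsibilities retained by Take Assessment (session
        // management, item presentation, response recording) are elided here.
        track.recordProgress(candidateId, "item-1");
    }

    public static void main(String[] args) {
        var svc = new TakeAssessmentService(
                (id, cred) -> true,                                        // stub Authenticate
                assessmentId -> true,                                      // stub Schedule
                (id, item) -> System.out.println(id + " reached " + item)); // stub Track
        svc.takeAssessment("cand-42", "secret", "exam-1");
    }
}
```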

Service Workflow
Services that are otherwise independent may be orchestrated in a workflow to fulfil a particular use case
The workflow definition applies to a number of service profiles, but is independent of the individual services
The workflows are themselves a more coarse-grained service profile
– But they need an additional description of how the services within the workflow interact
Interaction is defined in the same abstract way as Service Profiles
(a minimal orchestration sketch follows)
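
A minimal orchestration sketch in Java, with assumed shapes: the workflow is itself a service that sequences otherwise independent services, matching the point above that workflows are coarser-grained service profiles. The step names follow the FREMA core services list; the Java wiring is illustrative, not a FREMA interface.

```java
import java.util.List;

// One shared, assumed shape for any assessment service in the sketch.
interface AssessmentService { void invoke(String assessmentId); }

// The workflow is itself a coarser-grained service composed of finer ones.
class SummativeAssessmentWorkflow implements AssessmentService {
    private final List<AssessmentService> steps;

    SummativeAssessmentWorkflow(List<AssessmentService> steps) { this.steps = steps; }

    @Override
    public void invoke(String assessmentId) {
        for (AssessmentService step : steps) step.invoke(assessmentId);
    }

    public static void main(String[] args) {
        // Step names follow the FREMA core services; the wiring is invented.
        var workflow = new SummativeAssessmentWorkflow(List.of(
                id -> System.out.println("Validate assessment " + id),
                id -> System.out.println("Take assessment " + id),
                id -> System.out.println("Mark assessment " + id),
                id -> System.out.println("Grade assessment " + id)));
        workflow.invoke("exam-1");
    }
}
```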

Service Interaction Diagrams

Service Patterns
Exemplar workflows:
– Described as patterns
– Showing how service interoperability can be achieved
– Solutions to common interoperability problems
– A behavioural intersection, mimicking the Semantic Web approach
Reference implementations of Service Patterns:
– WS-Interaction diagrams
– WSDL
– Java code to download and install
– Running services to try
(an illustrative service endpoint sketch follows)
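
Since the reference implementations are described as WSDL plus downloadable Java code, one plausible shape for such a service (an assumption, not FREMA's actual code) is a JAX-WS endpoint, where the annotations cause the WSDL to be generated and published automatically. The service name, operation and marking logic below are invented for the sketch; on JDK 11+ a JAX-WS implementation must be added as a dependency.

```java
import javax.jws.WebMethod;
import javax.jws.WebService;
import javax.xml.ws.Endpoint;

// Hedged sketch: JAX-WS generates and serves the WSDL for this endpoint.
@WebService
public class MarkAssessmentService {

    @WebMethod
    public double markResponse(String itemId, String response) {
        // Placeholder marking rule, just to make the sketch runnable.
        return (response == null || response.isEmpty()) ? 0.0 : 1.0;
    }

    public static void main(String[] args) {
        // The WSDL becomes available at http://localhost:8080/mark?wsdl
        Endpoint.publish("http://localhost:8080/mark", new MarkAssessmentService());
        System.out.println("Service published");
    }
}
```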

FREMA Core Services
Assessment Services:
– Assign
– Author Item
– Author Assessment
– Validate Assessment
– Take Assessment
– Mark Assessment
– Moderate Assessment
– Grade Assessment
– View Feedback
Support Services:
– Schedule
– Notify
– Track
– Authorise and Authenticate

Validating the Services
Working with Kingston:
– Starting now: peer review of services, from Use Cases to SRCs and WS-Interaction diagrams
– Continuing into the extension bid: using the FREMA model to design a toolkit project
Designed for an evolving view:
– Alternative services can be authored
– New Use Cases may be added
– Gap Analysis and other tools are dynamic
– Opportunities for refactoring as new scenarios are added

Using the FREMA Reference Model
Use the Reference Implementation:
– Build on some or all of the developed services
Use the Service Profiles:
– To develop your own services that will fit into the framework
Use the Use Cases:
– To help understand usage patterns within the domain
– To develop new Service Profiles, and thus Services
Use the Domain Definition:
– To develop a context for your own work
– To understand how existing work fits together
– To identify standards
– To locate experts

Scenario: Technical Developer
Will, Technical Developer:
– "I want to look up use cases and scenarios to help me design my application. This will help me to define my footprint in the assessment domain. I see there are some web services I could re-use, but some are missing. What standards can I use when writing my own web services to ensure that I can interoperate with the web services I've chosen?"

Overview: Proposed Extension

Our Proposed Extension
Provide for community curation, guardianship and sustainability:
– Harden the website; add moderation, a discussion forum and authoring facilities; and hand it over to the community
– Support the debate over service refactoring
Validate the reference model through real use:
– Assist one or more partners to develop toolkit demonstrations of assessment services
– Engage in a validation activity with a commercial vendor (QuestionMark)
– Coordinate with the eAssessment Roadmap project, and link to the eAssessment Glossary
Extend Service Workflows and Service Descriptions, other Use Cases and Scenarios
Help JISC map assessment services to the e-Framework