A programming exercise evaluation service for Mooshak


1 A programming exercise evaluation service for Mooshak
José Paulo Leal | Ricardo Queirós
CRACS & INESC-Porto LA, Faculdade de Ciências, Universidade do Porto
Rua do Campo Alegre, Porto, PORTUGAL

2 Outline
- Introduction: context, motivation, goal
- Architecture: eLearning frameworks, the E-Framework, the evaluation service (service genre, service expression and service usage model)
- Design
- Conclusion

3 1. Introduction: Context
Experience of projects with evaluation components:
- Mooshak: contest management system for ICPC contests
- EduJudge: use of the UVA programming exercise collections in LMSs
Emergence of eLearning frameworks:
- they advocate SOA approaches to facilitate technical interoperability
- based on a survey, the most prominent is the E-Framework (E-F)

4 1. Introduction: Motivation
Integration of systems for the automatic evaluation of programs:
- program evaluators are complex and difficult to integrate in eLearning systems (e.g. an LMS)
- program evaluators should be autonomous services
Modelling evaluation services:
- communication with heterogeneous systems: Learning Objects Repositories (LOR), Learning Management Systems (LMS), Integrated Development Environments (IDE)
- conformance to eLearning frameworks improves interoperability

5 1. Introduction: Motivation
Integration of the evaluation service in an eLearning network.

6 1. Introduction: Goal
Architecture:
- integrate the evaluation service in an eLearning network
- define an evaluation service in an eLearning framework
- formalise concepts related to program evaluation
Design:
- extend an existing contest management system
- expose evaluation functions as services
- reuse existing administration functions

7 2. Architecture: eLearning frameworks
Specialized software frameworks that advocate SOA to facilitate technical interoperability.
Types:
- Abstract: creation of specifications and best practices for eLearning systems (e.g. IEEE LTSA, OKI, IMS AF)
- Concrete: service designs and/or components that can be integrated in implementations of artifacts (e.g. SIF, E-F)
Survey: the E-F and SIF are the most promising frameworks; they are the most active projects, both with a large number of implementations worldwide.

8 2. Architecture: E-Framework
- initiative established by JISC, DEEWR, the NZ MoE and SURF
- aims to facilitate system interoperability via a SOA approach
- has a knowledge base to support its technical model

Component           | Description                                                                         | User role
Service Genre       | Collection of related behaviours that describe an abstract capability              | Non-technical expert (e.g. IT Manager)
Service Expression  | A specific way to realize a service genre with particular interfaces and standards | Technical expert (e.g. Developer)
Service Usage Model | The relationships among technical components (services) used for applications      | Domain expert (e.g. Business Analyst)

9 2. Architecture
- support of the online community (developers' wiki)
- contribution to the E-Framework: Service Genre (SG), Service Expression (SE), Service Usage Model (SUM)

10 2. Architecture - SG: Text File Evaluation Service Genre
- responsible for the assessment of a text file
- the text file holds an attempt to solve an exercise
- the exercise is described by a learning object
- supports three functions: ListCapabilities, EvaluateSubmission and GetReport (sketched below)
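A minimal sketch of this service genre as an abstract interface, in Python; the method names, parameter names and types are our own reading of the genre, not part of the E-Framework definition:

    from abc import ABC, abstractmethod

    class TextFileEvaluationService(ABC):
        """The three functions of the Text File Evaluation service genre."""

        @abstractmethod
        def list_capabilities(self) -> list[str]:
            """Return the capabilities supported by this evaluator."""

        @abstractmethod
        def evaluate_submission(self, exercise_ref: str, attempt: str,
                                capability: str) -> str:
            """Evaluate an attempt at an exercise; return a ticket or a report."""

        @abstractmethod
        def get_report(self, ticket: str) -> str:
            """Return the evaluation report for a previously issued ticket."""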

11 2. Architecture - SG: ListCapabilities function
- lists all the capabilities supported by a specific evaluator
- capabilities depend strongly on the evaluation domain:
  - computer programming evaluator: programming language compiler
  - electronic circuit simulator: collection of gates that are allowed on a circuit

12 2. Architecture - SG: EvaluateSubmission function
- requests an evaluation for a specific exercise
- the request includes:
  - a reference to an exercise, as a learning object held in a repository
  - a text file with an attempt to solve that exercise
  - the evaluator capability necessary for a proper evaluation of the attempt
- the response includes a ticket for a later report request, or a detailed evaluation report
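Purely as a sketch, the request and response just described could be modelled as two plain data records; the field names here are invented for illustration:

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class EvaluationRequest:
        exercise_ref: str  # reference to the exercise, a learning object in a repository
        attempt: str       # text file with an attempt to solve that exercise
        capability: str    # evaluator capability needed for a proper evaluation

    @dataclass
    class EvaluationResponse:
        ticket: Optional[str] = None  # ticket for a later GetReport request, or...
        report: Optional[str] = None  # ...an immediate detailed evaluation report (XML)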

13 2. Architecture - SG: GetReport function
- gets the report for a specific evaluation
- the report included in the response may be transformed on the client side:
  - based on an XML stylesheet
  - able to filter out parts of the report or to calculate a classification from its data
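The slides do not prescribe how the client-side transformation is implemented; one possible sketch, assuming Python with the third-party lxml library:

    from lxml import etree  # third-party: pip install lxml

    def transform_report(report_xml: str, stylesheet_path: str) -> str:
        """Apply a client-side XSLT to an evaluation report, e.g. to filter
        out parts of it or to compute a classification from its data."""
        report = etree.fromstring(report_xml.encode())
        transform = etree.XSLT(etree.parse(stylesheet_path))
        return str(transform(report))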

14 2. Architecture - SE: The Evaluate-Programming Exercise SE
- requests: the program source code and a reference to the programming exercise as a Learning Object (LO)
- resources: learning objects retrieved from a repository; LOs are archives with assets (test cases, description) and metadata
- responses: an XML document containing the evaluation report, with details of the test case evaluations
[Diagram: the Evaluation Engine takes source code plus an LO reference as input, fetches the LO as a resource, and outputs an evaluation report]

15 2. Architecture - SE
The E-Framework model contains 20 distinct elements to describe a service expression (SE). Major E-Framework elements:
- Behaviours & Requests
- Use & Interactions
- Applicable Standards
- Interface Definition
- Usage Scenarios

16 2. Architecture - SE: Behaviours & Requests
- details technical information about the functions of the SE
- the 3 types of request handled by the SE:
  - ListCapabilities: provides the client systems with the capabilities of a particular evaluator
  - EvaluateSubmission: allows the request of an evaluation for a specific programming exercise
  - GetReport: allows a requester to get a report for a specific evaluation using a ticket

17 2. Architecture - SE: Use & Interactions
Illustrates how the functions defined in the Behaviours & Requests section are combined to produce a workflow (a sketch of this workflow follows the list):
1. the Learning Management System sends an LO reference and an attempt to the Evaluation Engine
2. the Evaluation Engine requests the LO from the Learning Objects Repository
3. the repository returns the LO
4. the Evaluation Engine (correction and classification) returns the report to the LMS
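From the Evaluation Engine's side, the workflow amounts to one repository round trip between receiving the submission and returning the report. A sketch, with fetch_lo and run_evaluation as hypothetical placeholders for the repository client and the evaluator itself:

    def handle_evaluate_submission(lo_reference: str, attempt: str,
                                   fetch_lo, run_evaluation) -> str:
        # step 1: the LMS has sent an LO reference and an attempt
        learning_object = fetch_lo(lo_reference)  # steps 2-3: repository round trip
        # correction and classification against the learning object's test cases
        report = run_evaluation(learning_object, attempt)
        return report  # step 4: the report goes back to the LMS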

18 2. Architecture - SE: Applicable Standards
Enumerates the technical standards used in the SE: content (IMS CP, IEEE LOM, EJ MD) and interoperability (IMS DRI).

19 2. Architecture - SE: Interface Definition
- formalizes the interfaces of the service expression: the syntax of the requests and responses of the SE functions
- functions are exposed as both SOAP and REST web services (a hypothetical client sketch follows)

Function           | Web Service | Syntax
ListCapabilities   | SOAP        | ERL ListCapabilities()
                   | REST        | GET /evaluate/ > ERL
EvaluateSubmission | SOAP        | ERL Evaluate(Problem, Attempt, Capability)
                   | REST        | POST /evaluate/$CID?id=LOID < PROGRAM > ERL
GetReport          | SOAP        | ERL GetReport(Ticket)
                   | REST        | GET $Ticket > ERL

(In the REST syntax, < marks the request body and > the response.)
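For illustration only, a hypothetical client for the REST variant of the three functions, assuming Python with the requests library and an invented host name; the endpoint shapes follow the table above:

    import requests  # third-party: pip install requests

    BASE = "http://evaluator.example.org"  # hypothetical host

    def list_capabilities() -> str:
        return requests.get(f"{BASE}/evaluate/").text  # ERL document

    def evaluate_submission(cid: str, lo_id: str, program: str) -> str:
        # POST /evaluate/$CID?id=LOID with the program as the request body
        r = requests.post(f"{BASE}/evaluate/{cid}", params={"id": lo_id}, data=program)
        return r.text  # ERL document with a ticket or a full report

    def get_report(ticket: str) -> str:
        # GET $Ticket: the ticket is assumed here to be a dereferenceable URL
        return requests.get(ticket).text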

20 2. Architecture - SE: Interface Definition
Evaluation Response Language (ERL):
- covers the definition of the response messages of the 3 functions
- formalised in XML Schema
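The slides do not reproduce the ERL schema itself; purely to illustrate a client consuming an ERL report, with element and attribute names invented for the example:

    import xml.etree.ElementTree as ET

    # Element/attribute names below are invented for illustration; the real
    # vocabulary is fixed by the ERL XML Schema.
    SAMPLE = """<report exercise="lo-42">
      <test id="1" outcome="Accepted"/>
      <test id="2" outcome="Wrong Answer"/>
    </report>"""

    root = ET.fromstring(SAMPLE)
    accepted = sum(1 for t in root.iter("test") if t.get("outcome") == "Accepted")
    print(f"{accepted}/{len(root.findall('test'))} test cases accepted")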

21 2. Architecture - SE: Usage Scenarios

Learning               | Examples        | Issues/Features
Curricular (classes)   | Self-evaluation | feedback for wrong submissions
                       | Assignments     | feedback & evaluation
                       | Exams           | computes a grade
Competitive (contests) | IOI             | points for accepted test cases
                       | ICPC            | penalizations for wrong submissions
                       | IEEExtreme      | high number of participants

22 2. Architecture - SUM: Text File Evaluation SUM
- describes the workflows within a domain, composed of SGs or SEs
- template diagram from the E-F
- two business processes: Archive Learning Objects and Evaluate Learning Objects

23 2. Architecture - SUM: Business Processes

Business Process          | Role    | Description
Archive Learning Objects  | Teacher | searches for an exercise in a Learning Objects Repository (LOR) and links the most appropriate one in a Learning Management System (LMS)
Evaluate Learning Objects | Student | gets the exercise from the LMS, solves it in a specialized resolution environment, submits the resolution to an evaluation engine (EE) and receives a notification with an evaluation report

24 3. Design: Evaluation Service - Design principles & decisions
- support the E-Framework architecture
- extend an existing contest management system: Mooshak
- reuse existing functions rather than implement new ones
- create a front controller for the service (a minimal sketch of the pattern follows this list)
- maintain the administration web interface
- map service concepts to Mooshak concepts
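The slides name a front controller as the single service entry point, keeping the existing administration interface intact. A generic sketch of that pattern, not Mooshak's actual code:

    class ServiceFrontController:
        """Single entry point that routes service requests to existing
        Mooshak functions instead of reimplementing them."""

        def __init__(self):
            self._handlers = {}

        def register(self, function_name: str, handler) -> None:
            self._handlers[function_name] = handler

        def dispatch(self, function_name: str, **params) -> str:
            handler = self._handlers.get(function_name)
            if handler is None:
                raise ValueError(f"unknown service function: {function_name}")
            return handler(**params)

A request for, say, EvaluateSubmission would then be dispatched to whatever existing Mooshak function was registered under that name.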

25 3. Design: Evaluation Service - Mapping service concepts to Mooshak
Service -> Contest:
- only contests marked as serviceable
- several contests can be served simultaneously
- the same contest can be served and managed
Capability -> Contest + Language:
- a service request specifies the contest and the language (within the contest)
- controls the evaluation context
- produces the evaluation report (XML)

26 3. Design: Evaluation Service - Mapping service concepts to Mooshak
Service requester -> Team:
- IDs based on the remote IP address & port
- basis for authentication
- also useful for auditing
Learning Object -> Problem:
- LOs are downloaded from remote repositories and converted to Mooshak problems
- downloaded problems are used as a cache (sketched below)
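A sketch of that caching decision, with download and convert as hypothetical placeholders for the repository client and the LO-to-problem conversion:

    class ProblemCache:
        """Learning objects are downloaded once, converted to Mooshak
        problems, and reused on later requests for the same LO."""

        def __init__(self, download, convert):
            self._download = download  # fetches an LO archive from a repository
            self._convert = convert    # turns the LO into a Mooshak problem
            self._problems = {}

        def get_problem(self, lo_id: str):
            if lo_id not in self._problems:
                self._problems[lo_id] = self._convert(self._download(lo_id))
            return self._problems[lo_id]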

27 4. Conclusion
- definition of an evaluation service
- contribution to the E-Framework with a new Service Genre, Service Expression and Service Usage Model
- validation of the proposed model with an extension of the Mooshak contest management system
Current and future work:
- a first prototype is already available
- communication with repositories is still in development
- integration in a network of eLearning systems
- a full evaluation of this service is planned for next fall

28 Questions?
Authors: José Paulo Leal, Ricardo Queirós
Thanks!

