Slide 1: A Web-based Automatic Evaluation System
Chittaranjan Mandal (School of IT, IIT Kharagpur, India)
Chris Reade (Kingston Business School, Kingston University, UK)
Vijay Luxmi Sinha (School of IT, IIT Kharagpur, India)
Slide 2: Automated Evaluation (ECEL 2004 – Mandal, Reade, Sinha)
– Motivation
– Other systems and choices
– Our design and choices
– Details of assignment set-up
– Conclusions and 'The bigger picture'
Slide 3: Programming assignments (at IIT Kharagpur)
9,600 submissions to mark per semester:
– Setting the assignment (5 hrs?)
– Distribution and collection (automated by WBCM)
– Marking at 20 min each = 3,200 hrs = 20 people × 20 working days each
For sophisticated testing, maybe 5 times this.
Slide 4: Automated marking is a must
Additional benefits:
– automated rapid feedback (resubmission possibilities)
– consistency
– plagiarism checking
– rigorous testing (test suites)
Slide 5: Different from simple test-and-assessment
What can be automatically assessed?
Program behavior (execution):
– correctness testing
– performance testing
– whole programs and/or components
– data (randomising + special test cases)
– large numbers of runs
Analysis of structure and style (still very hard to do)
Slide 6: Other systems
– Full automation versus assistance: e.g. Try (1989) / ASSYST (1997)
– Feedback only versus full management: e.g. Submit (2003) / Ceilidh (1993)
– Evaluation of design / behavior / components
– Fixed versus multiple programming languages: Pascal, Ada, Scheme, C, C++, Java
– Web-based? e.g. GAME (2004), Submit (2003)
Slide 7: Our design choices and goals
Choose:
– component testing (performance and behavior)
– single language (for now)
– full automation AND rapid feedback
– integration with the full management system WBCM
Security of the process:
– potentially unsafe programs (malicious or accidental)
Marking issues:
– partly working programs
– feedback based on a schema
Address overheads for assignment set-up
Slide 8: Setting up an assignment
Design a programming problem:
– sub-problem approach (white box)
– marking scheme
Implementation involves (formally):
– expressing the assignment plan (components and strategy)
– writing components for the test harness
– describing the testing data and process
– describing the marking scheme with feedback
Example: mergesort…
Slide 9: Mergesort
[Diagram: the student submission (mergesort, merge, main) and the reference files (mergesort, merge, main, make1, make2) are combined to build the binary files to test.]
Slide 10: XML files in preference to web entry
The XML specification covers:
– source files
– marking scheme
– testing process
It is processed into the tester script and files.
Slide 11: Input generation (types)
Random integers: (array / single) (distinct / non-distinct) (unsorted / sorted ascending / sorted descending) (positive / negative / mixed / interval)
Random floats: (array / single) (distinct / epsilon-apart) (unsorted / sorted ascending / sorted descending) (positive / negative / mixed / interval)
Strings: (array / single) (distinct / non-distinct) (fixed length / variable length)
Slide 12: XML specification for input generation
<input inputVar="a1" type="integer" varType="array" sequence="ascend" range="positive" duplication="distinct">50</input>
<input inputVar="a2" type="integer" varType="array" sequence="ascend" range="positive" duplication="distinct">50</input>
The generated inputs a1 and a2 feed the test call: test_merge_to_k(a1, 50, a2, 50)
Slide 13: XML specification for a test
[Slide content elided in this transcript: the XML describing the evaluation of merge_function.]
Slide 14: Testing script flowchart
[Flowchart: the XML specification (test cases, source-file details, makefile, marking scheme, feedback details), the reference implementation, and the submitted files (File 1, File 2, …) feed the testing loop: start → get submission → get data → prepare makefile → build and execute test → award marks → if more test data, loop back to get data → generate report → stop.]
Slide 15: Conclusions
This system is designed to support complex assessments.
For automated assessment, the assessment set-up becomes the hardest part for designers.
– Our tool directly supports implementation of the assessment strategy, marking scheme, feedback and testing harness.
– Key features: XML specs to generate the testing script; re-use of assignment designs.
– Future work: generalising; assessing effectiveness with more examples.