Presentation transcript: "Improving assessment quality using electronic testing", Institute of Informatics (FNS), University "Ss. Cyril and Methodius", Skopje, Macedonia

1. Improving assessment quality using electronic testing
Institute of Informatics (FNS), University "Ss. Cyril and Methodius", Skopje, Macedonia
Software Engineering Education and Reverse Engineering
Goce Armenski, M.Sc.

2. Content
- Introduction
- Motivation and goals
- Basic concepts of the systems for eTesting
- eTest, a new system for electronic testing: architecture, concepts and functionality
- Application of eTest
- Results
- Conclusion

3. Introduction
Changes in the way people live, driven by:
- Globalization
- The growing importance of knowledge
- The information and communication revolution
"Industrial society" → "Information society" → "Knowledge society"
50% of working skills become outdated within 3-5 years.

4. Introduction
Changes in education: a shift from teacher-centered to student-centered education.

5. Introduction
What are the functions of assessment?
- Check how successfully the learning goals are achieved
- Provide feedback
- Improve the learning process
Keys reported a 14% improvement in student results when knowledge is assessed once a week instead of once a month. Pikunas and Mazzota also report improvements of about 10% when tests are delivered every week instead of every 6 weeks.
Changes in education (many more students) affect:
- The lecturing process
- The delivery of materials
- Knowledge assessment

6. Introduction
Institute of Informatics (FNS):
- 150+ students every year
- 4 questions (assignments) x 150 students = 600 answers to mark
- 5 minutes to mark one answer
- 3000 minutes, i.e. 50 hours of marking
How long will the marking take? Can feedback be delivered on time? Can it be personalized? Can it remain objective?
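As a quick illustration of the arithmetic behind this estimate (a sketch only; the function name and parameters are ours, not part of eTest):

```typescript
// Illustrative only: reproduces the slide's manual-marking estimate.
function manualMarkingHours(
  students: number,
  questionsPerStudent: number,
  minutesPerQuestion: number
): number {
  const answersToMark = students * questionsPerStudent;    // 150 * 4 = 600
  const totalMinutes = answersToMark * minutesPerQuestion; // 600 * 5 = 3000
  return totalMinutes / 60;                                // 3000 / 60 = 50 hours
}

console.log(manualMarkingHours(150, 4, 5)); // 50
```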

7. Motivation and goal
Develop a system for assessing a large number of students (more than 150) every month, usable in distance learning as well as in any other form of knowledge or skill assessment.
- Motivation for simple data entry: the possibility of bulk data entry
- Motivation for test construction: a different test for every student, with the same weight, lowering the possibility of memorizing questions
- Motivation for report creation: statistical analysis of the gathered information

8. Motivation and goal
Focus of the research: implement a system for computer-based assessment and evaluate the results of its use.
Global goal: identify the influence of such a system on teachers and students.
Main goal: is computer-based assessment more effective and more objective than traditional assessment?

9. eTest – concepts and functionality
Basic concepts of systems for eTesting:
- Question bank
- Algorithms for test creation
- Systems for data presentation
- Result reports
Existing systems: QuestionMark, BlackBoard, WEB CT, Top Class, EduSystem.

10. eTest – concepts and functionality
Question bank: a database of unique questions, each carrying the characteristics needed for simple selection during test construction; some of the parameters are created dynamically.
- Standards for question bank development
- Exchange of questions at university level
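A minimal sketch of what such a question-bank record and a simple selection rule could look like; the interface, its field names, and the pickByDifficulty helper are assumptions for illustration, not eTest's actual schema:

```typescript
// Hypothetical question-bank record; field names are illustrative, not eTest's schema.
interface BankQuestion {
  id: number;
  courseId: number;
  topic: string;                                // learning object the question belongs to
  type: "multichoice" | "shortEntry" | "essay";
  difficulty: number;                           // assumed 0-1 scale, higher = harder
  text: string;
}

// Select `count` questions on a topic whose difficulty is closest to a target value.
function pickByDifficulty(bank: BankQuestion[], topic: string, target: number, count: number): BankQuestion[] {
  return bank
    .filter(q => q.topic === topic)
    .sort((a, b) => Math.abs(a.difficulty - target) - Math.abs(b.difficulty - target))
    .slice(0, count);
}
```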

11. eTest – concepts and functionality
Types of questions (Fig. 2):
- Fixed-answer questions (objective): multichoice, short entry, questions with graphical selection (hotspot)
- Free-text answers: programming code as an answer, essay answer
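A minimal sketch of how these question types might be modelled as data; the type names and fields are illustrative assumptions, not taken from eTest:

```typescript
// Hypothetical model of the question types listed above.
type Question =
  | { kind: "multichoice"; text: string; options: string[]; correct: number[] } // one or many correct options
  | { kind: "shortEntry"; text: string; accepted: Array<string | number> }      // text or numerical answers
  | { kind: "hotspot"; text: string; imageUrl: string;
      region: { x: number; y: number; width: number; height: number } }         // graphical selection
  | { kind: "code"; text: string }                                              // programming code, marked by hand
  | { kind: "essay"; text: string };                                            // essay answer, marked by hand
```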

12. eTest – concepts and functionality
Algorithms for test creation (Fig. 3): the main difference between them is how much they adapt to the characteristics of the person whose knowledge is being assessed.
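To make the contrast concrete, here is a self-contained sketch of a non-adaptive (linear) selection rule next to one adaptive step; the Item type, the 0-1 difficulty scale, and the 0.1 step size are all assumptions, and this is not claimed to be eTest's algorithm:

```typescript
// Self-contained sketch: `Item` and `closestTo` are illustrative helpers, not eTest code.
interface Item { id: number; difficulty: number }    // difficulty on a 0-1 scale, higher = harder

function closestTo(pool: Item[], target: number, count: number): Item[] {
  return [...pool]
    .sort((a, b) => Math.abs(a.difficulty - target) - Math.abs(b.difficulty - target))
    .slice(0, count);
}

// Linear test: a fixed number of questions chosen up front, the same rule for everyone.
function linearTest(pool: Item[], size: number): Item[] {
  return closestTo(pool, 0.5, size);                 // aim for medium difficulty
}

// Adaptive step: the next question depends on how the previous one was answered.
function nextAdaptiveQuestion(pool: Item[], lastDifficulty: number, lastCorrect: boolean): Item {
  const target = lastCorrect
    ? Math.min(1, lastDifficulty + 0.1)              // correct answer: try a harder question
    : Math.max(0, lastDifficulty - 0.1);             // wrong answer: try an easier one
  return closestTo(pool, target, 1)[0];
}
```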

13. eTest – concepts and functionality
Systems for data presentation: content is adapted to the characteristics of the monitor and the network.
Marking and reporting: in which way does the system show the results?
- Does it show the correct answers?
- Is the result given in points or in percentages?
- Is negative marking applied?

14. eTest – concepts and functionality
eTest technology (web-based vs. desktop-based solutions):
- Web-based application
- Active Server Pages (ASP) and JavaScript
- SQL Server 2000
- NT Server and Windows 2000 compatible
- IIS 4.0 or newer
Client side: a web browser (Netscape 4.x or Internet Explorer 4.x and above). Server side: Windows 2000 Server, IIS 4.0 with ASP (JScript), SQL Server / Access, e-mail (SMTP).

15. eTest – concepts and functionality
eTest architecture (Fig. 4: Three-tier architecture of the eTesting system).

16. eTest – concepts and functionality
eTest concepts:
- Types of users
- Course organisation
- Types of questions
- Test creation algorithm
- System for data presentation
- Marking and reporting

17. eTest – concepts and functionality
Types of users (Fig. 5).

18. eTest – concepts and functionality
Course organisation (Fig. 6):
- Learning objects
- Tree structure

19. eTest – concepts and functionality
Types of questions:
- Multichoice questions (choose one of many, choose many of many, yes/no answers)
- Short entry answers (text or numerical)
- Essay answers
Questions can include pictures or graphs in the question text or in the offered answers.

20. eTest – concepts and functionality
Multichoice questions (Fig. 7: Choose one of many; Fig. 8: Choose many of many).

21. eTest – concepts and functionality
Short entry answers (Fig. 9: Short entry).

22. eTest – concepts and functionality
Questions with essay answers (Fig. 10):
- These answers are not evaluated by the system, which lowers objectivity.
- Existing automated essay-scoring systems: Project Essay Grade (PEG), Intelligent Essay Assessor (IEA), E-rater, Bayesian Essay Test Scoring System (BETSY).

23. eTest – concepts and functionality
Test creation algorithm: dynamic linear tests (a fixed number of questions).
System for data presentation: adjusted to web standards (800x600 resolution); pictures and graphs can be displayed.
Marking and reporting: results are shown at the end of the test; negative marking is applied (see the scoring sketch below).
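The slide does not spell out the marking rule, so the following sketch uses a common negative-marking scheme (plus one point for a correct answer, minus 1/(k-1) for a wrong answer among k options, zero for a blank); it is illustrative, not necessarily what eTest applies:

```typescript
// A common negative-marking rule (not necessarily the one eTest uses):
// +1 for a correct answer, -1/(k-1) for a wrong one among k options, 0 for a blank.
function scoreTest(answers: { correct: boolean | null; options: number }[]): number {
  return answers.reduce((total, a) => {
    if (a.correct === null) return total;                  // unanswered question
    return total + (a.correct ? 1 : -1 / (a.options - 1)); // penalise guessing
  }, 0);
}

// Example: 3 correct, 1 wrong (4 options), 1 blank -> 3 - 1/3 ≈ 2.67 points
console.log(scoreTest([
  { correct: true, options: 4 }, { correct: true, options: 4 }, { correct: true, options: 4 },
  { correct: false, options: 4 }, { correct: null, options: 4 },
]));
```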

24. eTest – concepts and functionality
Statistical data analysis enables:
- Identification of content that is not well presented
- Personalized feedback to students
- Identification of weak questions that need to be revised before being used again
- Identification of the individual weaknesses of students

25. eTest – concepts and functionality
Important statistical data about questions:
- Difficulty: how difficult a question is
- Discrimination: how well the question differentiates good from weak students
- Guessing: how often students give the correct answer without knowing it
- Differential achievement: how the results of different user groups compare
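A small sketch of how the first two indices are typically computed in classical item analysis (proportion correct for difficulty, upper-minus-lower 27% groups for discrimination); this is the textbook formulation, not necessarily eTest's exact computation, and the guessing parameter (usually estimated from an IRT model) is left out:

```typescript
// Classical item-analysis indices for one question, given each student's
// result on that question (1 = correct, 0 = wrong) and their total test score.
function itemStats(results: { correct: number; totalScore: number }[]) {
  const n = results.length;

  // Difficulty index: proportion of students answering correctly (higher = easier item).
  const difficulty = results.reduce((s, r) => s + r.correct, 0) / n;

  // Discrimination index: proportion correct in the top 27% of scorers
  // minus proportion correct in the bottom 27%.
  const sorted = [...results].sort((a, b) => b.totalScore - a.totalScore);
  const k = Math.max(1, Math.round(0.27 * n));
  const pCorrect = (group: { correct: number }[]) =>
    group.reduce((s, r) => s + r.correct, 0) / group.length;
  const discrimination = pCorrect(sorted.slice(0, k)) - pCorrect(sorted.slice(-k));

  return { difficulty, discrimination };
}
```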

26. Application of eTest
- Integration into the learning process
- Controlled learning (Fig. 12: how learning objects are passed)
Which passing strategy is successful?
- Answer all questions
- Answer N questions in a row
- Answer N questions correctly
- Answer 3 questions correctly in a row (see the sketch below)
Statistical analysis of user activities.
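A minimal sketch of the "N correct answers in a row" passing rule mentioned above; the function and parameter names are ours, not eTest's:

```typescript
// Hypothetical check for the "N correct answers in a row" passing rule (e.g. N = 3).
function passedLearningObject(answerLog: boolean[], requiredStreak = 3): boolean {
  let streak = 0;
  for (const correct of answerLog) {
    streak = correct ? streak + 1 : 0;   // a wrong answer resets the streak
    if (streak >= requiredStreak) return true;
  }
  return false;
}

console.log(passedLearningObject([true, false, true, true, true])); // true
```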

27. Results
The use of technology in education depends heavily on the organization:
- Logistics
- Social changes
- Synchronisation with other systems
- Technical infrastructure
Practical implementations:
- Institute of Informatics, FNS (2001)
- A.D. Mobimak (2002)
- UNDP (2003-2004)

28. Results
Data gathering at the Institute of Informatics, FNS (January 2001):
- 26 courses
- 12,391 questions
- 589 scheduled assessments
- 9,861 generated tests
Does eTesting provide more effective and more objective assessment than the traditional forms, and does it help the learning process?

29. Results
Costs:
- Technical infrastructure
- Software development and maintenance
- Training of administrators
- Technical personnel
Cost savings:
- Less paper used
- Less printed material used
- Less time needed to conduct the assessment

30. Results
Student perspective:
- Questionnaire with 10 questions
- Likert-type answers on a 5-point scale
- 236 students took part in the survey
Teacher perspective:
- Creating the question bank is time-consuming (more than 1,300 per course)
- Time is saved once the bank is created

31. Results – Question 1

32. Results – Question 2

33. Results – Question 3

34. Results – Question 4

35. Results – Question 5

36. Results – Question 6

37. Results – Question 7

38. Results – Questions 8 and 9

39. Results – Questions 8 and 9 (continued)

40. Results – Question 10

41. Results – Summary of the results (all items on a 1-5 Likert scale; N = number of responses, mean, standard deviation)
- The use of the system for electronic testing is: (N = 235, mean 3.7149, std 0.8865)
- The electronic testing is __________ compared to the traditional one (N = 233, mean 3.7296, std 0.7368)
- I prefer assessment using the system for electronic testing to the traditional one (N = 236, mean 4.3686, std 0.9154)
- Marking on the system for electronic testing is objective (the same for all) (N = 233, mean 4.3305, std 0.8748)
- The use of the module for online learning with frequent knowledge assessment helps with the material (N = 236, mean 4.3814, std 0.7428)
- Using the module for online learning with frequent knowledge assessment helps me achieve more knowledge (N = 235, mean 4.0128, std 0.8986)
- I recommend the use of the system for electronic testing for assessment in other courses (N = 233, mean 4.4378, std 0.8288)
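For reference, a small sketch of how the mean and standard-deviation columns are obtained from the raw 1-5 Likert responses; the sample (n-1) standard deviation is assumed, and the function name is ours:

```typescript
// Summarise raw 1-5 Likert responses into the table's N, mean and std columns.
function likertSummary(responses: number[]): { n: number; mean: number; std: number } {
  const n = responses.length;
  const mean = responses.reduce((s, x) => s + x, 0) / n;
  const variance = responses.reduce((s, x) => s + (x - mean) ** 2, 0) / (n - 1);
  return { n, mean, std: Math.sqrt(variance) };
}

// e.g. likertSummary([5, 4, 4, 3, 5]) -> { n: 5, mean: 4.2, std: 0.837 }
```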

42. Conclusion
System implementation:
- Depends on the institution in which it is implemented
Improvements in the assessment process:
- Possibility to assess the knowledge of more than 150 students
- Immediate feedback
- Lower subjectivity
- Possibility for self-testing before the official assessment
- Analysis of the gathered data
- Positive effects on the learning process
- Increased security?
Combining this method with other methods of assessment can give very good results.

