1 Digital testing: What do students think of it?
E-merge, November 2014
Ir. Meta Keijzer-de Ruijter, Delft University of Technology
2 Introduction: the Delft situation
- Type A tests: use a test system (MapleTA, Bb, COZ, Mastering Physics, etc.)
- Type B tests: the assignment is delivered on paper; students use professional software to show their skills, and the file they create is assessed.
- Both formative and summative tests (Gibbs, G., Using Assessment to Support Student Learning, ISBN 978-1-907240-06-5)
3 Growth: homework assignments in MapleTA
[Chart: number of classes and students taking a test]
4 Growth: digital exams (total)
[Chart: number of students taking a test]
[1] The prognosis is based on the model of growth in acceptance of Blackboard (Zanden, 2009).
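The slide does not state which growth model was used for the prognosis; purely as an illustration, the sketch below assumes a logistic (S-shaped) adoption curve, which is a common choice for technology-acceptance prognoses. The yearly counts in `years` and `students` are made-up values, not TU Delft data.

```python
# Illustrative only: fit a logistic adoption curve to hypothetical yearly
# counts of students tested digitally and extrapolate a prognosis.
# The actual model (Zanden, 2009) referenced on the slide is not specified here.
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, K, r, t0):
    """Logistic growth: capacity K, growth rate r, midpoint t0."""
    return K / (1.0 + np.exp(-r * (t - t0)))

# Hypothetical observations: years since introduction vs. students tested digitally
years = np.array([0, 1, 2, 3, 4], dtype=float)
students = np.array([120, 350, 900, 2100, 3800], dtype=float)

(K, r, t0), _ = curve_fit(logistic, years, students, p0=[10000.0, 1.0, 5.0], maxfev=10000)

# Prognosis for the next three years
for t in (5, 6, 7):
    print(f"Year {t} prognosis: ~{logistic(t, K, r, t0):.0f} students")
```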
5 Purpose of the questionnaire: what is the students' view on digital testing?
- Expectations towards digital testing
- Experiences and issues
- Suitability of question types: multiple choice, simple and more complex numerical questions, essay questions, programming and simulations
- Midterms, homework and exams
6 Respondents to the questionnaire
- Number of respondents: 1341 (53% completed it fully)
- Response from all faculties; most response from the faculties with the most digital exams: mechanical & maritime engineering (25%), civil engineering (22%) and aerospace engineering (16%)
- Student 'type': Bachelor (67%), Master (31%), other (2%)
- Years of study: 1 year (28%), 1-3 years (35%), more than 3 years (37%)
- Familiar with digital testing: no (A: 23%, B: 44%), 1 course (A: 23%, B: 24%), more than 1 course this year (A: 40%, B: 19%), earlier or elsewhere (A: 15%, B: 13%)
7 Expectations: do you expect digital tests at TU Delft?
Students feel that digital testing belongs to this age: innovation, efficiency, inevitable. But...
- Apply it in situations that are suitable for it: testing programming skills (type B); formative situations, providing enough feedback and quick results ("but don't expect us not to work together"); multiple-choice tests for knowledge (not for higher-level skills); very simple calculations.
- Make sure it is valid, reliable and transparent: assessing only the numerical answer is not fair, the reasoning behind the solution should also count.
8 Expectations: advantages for students
- Quick feedback / an indication of the grade (63%), though this does not always mean faster grading, and immediate feedback or a preliminary score during or right after the test is not always appreciated
- Readability of essay tests: no cramps from handwriting, better-structured responses
- More frequent testing
- More suitable for programming and simulations (type B)
Seen more as a disadvantage: fraud; validity (multiple choice and numerical questions)
9 Experience: technical issues
Type A tests: login problems (individual and group), expired passwords, being thrown out of the test.
Type B tests: software unavailable (on the computer or at the location), software crashes, problems saving files, differences in security settings, licences and access to file location(s).
10 Experience: validity, reliability and susceptibility to fraud
- Students find multiple choice and more complex calculations not valid (70%): they cannot elaborate, cannot show their approach or reasoning, and there is no partial grading.
- Marking is reliable but too strict: a decimal point instead of a decimal comma, the margin of error, significance.
- More susceptible to fraud? (no: 27%, no difference: 37%, yes: 35%) Looking at someone else's screen, multiple sessions, an item bank that is too small.
- Essay tests are OK, but not for calculations or sketches.
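The strict-marking complaints (decimal comma versus decimal point, margin of error) can be made concrete with a small answer-checking sketch. This is not MapleTA's actual grading logic; the function name check_numeric_answer and the 1% tolerance are assumptions for illustration.

```python
# Illustrative sketch of tolerant numerical grading (not MapleTA's actual logic).
# A decimal comma is accepted alongside a decimal point, and an answer counts
# as correct when it lies within a relative margin of error.

def check_numeric_answer(response: str, correct: float, rel_tol: float = 0.01) -> bool:
    """Return True when the response matches `correct` within `rel_tol` (1% by default)."""
    normalised = response.strip().replace(",", ".")  # "9,81" -> "9.81"
    try:
        value = float(normalised)
    except ValueError:
        return False  # not a number at all
    if correct == 0.0:
        return abs(value) <= rel_tol
    return abs(value - correct) / abs(correct) <= rel_tol

# A 1% margin accepts small rounding differences and the comma notation.
print(check_numeric_answer("9,81", 9.80665))  # True
print(check_numeric_answer("9.8", 9.80665))   # True
print(check_numeric_answer("9.5", 9.80665))   # False
```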
11 Suitability of question types, and under which conditions
- Multiple choice: when quick feedback is essential; for knowledge
- Numerical question: simple calculations
- Adaptive question: more complex calculations
- Essay: only for writing, not for sketching, calculations or derivations
12 Numerical and adaptive questions: suitability
- Example of a simple numerical question: a single formula, correct calculation
- Example of a more complex numerical question: a combination of formulas, multiple steps, correct calculation
13 Numerical and adaptive questions: suitability
Complex calculations through adaptive questions?
- YES: partial grading is possible; it helps students with a blackout
- NO: takes too much time, too complex; still vulnerable to small mistakes
- OK, but rather not: grading is too strict, there is no opportunity for corrections, stressful
Simple calculations through a numerical question?
- YES: a TU student should be able to do that correctly
- NO: does not occur (too simple); still vulnerable to small mistakes; impacts the level of the exam; grading is too strict
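To make the partial-grading argument for adaptive questions concrete, here is a minimal sketch of credit accumulated over the steps of a multi-step question. The step structure and weights are hypothetical and do not describe how MapleTA actually scores adaptive questions.

```python
# Hypothetical sketch of partial grading over the steps of an adaptive question.
# Each step carries its own weight, so a slip on one step does not wipe out
# the credit earned on the other steps (unlike a single all-or-nothing answer).
from dataclasses import dataclass

@dataclass
class Step:
    description: str
    weight: float   # share of the total score
    correct: bool   # whether the student's answer to this step was accepted

def partial_grade(steps: list[Step]) -> float:
    """Return the fraction of the total score earned over all steps."""
    total = sum(s.weight for s in steps)
    earned = sum(s.weight for s in steps if s.correct)
    return earned / total if total > 0 else 0.0

# Example: three steps of a multi-step calculation, with one small mistake.
steps = [
    Step("Select the right formula", 0.3, True),
    Step("Substitute the given values", 0.3, True),
    Step("Compute the final numerical answer", 0.4, False),
]
print(f"Partial grade: {partial_grade(steps):.0%}")  # 60%
```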
14 Attitude: 'study effort is less for digital tests'
- Less effort (+): easier when it is multiple choice; the item bank is too small
- More effort (-): the answer has to be fully right (no gathering of 'easy points')
15 Attitude: 'digital exams give me more stress'
- More stress (+): strict grading, no partial grading, technical issues, the timer on screen, direct feedback
- Less stress or not applicable (-): easier; re-use of questions
16 Preference: multiple choice, delivery moved from paper to computer
- On computer: fast grading (+), easy to check answers (+), on-screen reading (-)
- NOTE: preferably no multiple choice at all, since calculations are not taken into account when grading
17 Preference, midterms: credit [chart]
18 Preference, midterms: result and grade [chart]
19 Preference, midterms: reflect exam level [chart]
20 ... and now: reflections and actions
- Our view: adaptive testing can be valid.
  - Transparency: students should know what to expect; better presentation of the question
  - Good quality: review and testing, quality control
  - Develop guidelines for adaptive testing and validate them
  - Communication to students
- Reduce technical issues & fraud: simplify login & security; a new exam room (dedicated computers, one exam session)
- Expand the functionality of MapleTA: on-screen grading of essay questions; add an attachment to a question (scanned notes)
- Investigate: replace Sonate with MapleTA