
iCMAs for Computing Courses – Centre for Open Learning of Mathematics, Science, Computing and Technology (COLMSCT), Centre for Excellence in Teaching and Learning



Presentation transcript:

1 iCMAs for Computing Courses
Centre for Open Learning of Mathematics, Science, Computing and Technology (COLMSCT), Centre for Excellence in Teaching and Learning
Michael Isherwood, Associate Teaching Fellow, COLMSCT
The Open University is incorporated by Royal Charter (RC 000391), an exempt charity in England & Wales and a charity registered in Scotland (SC 038302).

2 Presentation Summary
– Project Aims
– Background
– The Quizzes (including demonstration)
– Initial Analyses
– Student Views
– Developing Questions and Quizzes
– A Way Forward
– A Vision

3 Project Aims – summary
– Support students in their understanding of new and difficult parts of M150 (in effect, Block 2)
– Confirm understanding prior to TMA feedback
– Investigate extension to other computing courses
– Through better understanding, aid retention

4 Background
– Students find getting to grips with programming difficult
– This leads to frustration and dropping out of M150 (and therefore of further computing courses)

5 Background – 2006J TMAs

6 Background – TMA02/03 Marks

7 On-Line Quizzes
Six quizzes on the 2008J and 2009B presentations:
– Structured English
– Conditions, truth and trace tables
– Basic JavaScript
– Selection – the if statement
– Repetition – the while statement
– Repetition – the for statement
Question types:
– Selecting tick boxes, radio buttons
– Free text and expression entry
– Entering values
– Drag and drop
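The three constructs the quizzes cover can be illustrated with small JavaScript functions. This is a minimal sketch for orientation, not the course's own material; the function names and the pass mark of 40 are invented for the example.

```javascript
// Selection - the if statement
function grade(score) {
    if (score >= 40) {        // illustrative pass mark, not from the course
        return "pass";
    } else {
        return "fail";
    }
}

// Repetition - the while statement: sum 1..n
function sumTo(n) {
    var total = 0;
    var i = 1;
    while (i <= n) {
        total = total + i;
        i = i + 1;
    }
    return total;
}

// Repetition - the for statement: the same sum, written with for
function sumToFor(n) {
    var total = 0;
    for (var i = 1; i <= n; i = i + 1) {
        total = total + i;
    }
    return total;
}
```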

8 if constructs – Q1
– 67% correct at 1st attempt; a further 21% at 2nd attempt; 3% correct at 3rd attempt; 10% remained wrong
– Most errors were = instead of !=, or = rather than ==
– Feedback for this question is to indicate wrong answers; as always, the correct answer is shown at the end
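The error pattern reported for Q1 is easy to reproduce. The snippet below is a hypothetical illustration (the variable names are invented) of why = in place of == goes wrong in JavaScript: the assignment both overwrites the variable and yields the assigned value, which is then treated as the truth value of the condition.

```javascript
var x = 5;

// Intended test: "is x equal to 5?"
var correct = (x == 5);   // comparison: evaluates to true, x unchanged

// Common Q1 error: assignment used where comparison was intended.
// (y = 0) assigns 0 to y, and the whole expression evaluates to 0,
// which JavaScript treats as false in a condition.
var y = 5;
var buggy = (y = 0);      // y is now 0; buggy holds 0 (falsy)
```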

9 if constructs – Q2
– 83% correct at 1st attempt; only 4% wrong after 3rd attempt
– Note the feedback after the 1st attempt
– After the 2nd attempt the correct answers remain, so the student concentrates on the incorrect answer(s)

10 if constructs – Q2 (continued)

11 if constructs – Q3
– 75% of responses correct at 1st attempt; 3% wrong after 3rd attempt
– After a 2nd attempt, the correct answers were retained, as shown, for the final attempt
– The majority of the errors were at the boundary, as here, though several students reversed the comparison
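Both error types reported for Q3 can be sketched in a few lines. This is a hypothetical reconstruction, not the actual quiz question: the specification (a mark of 40 or more passes) and the function names are assumptions made for the example.

```javascript
// Assumed specification: a mark of 40 or more passes.
function passesCorrect(mark) {
    return mark >= 40;    // boundary value 40 is included
}

// Boundary error: > instead of >=, so exactly 40 wrongly fails
function passesBoundaryError(mark) {
    return mark > 40;
}

// Reversed comparison: the condition is the wrong way round entirely
function passesReversed(mark) {
    return mark <= 40;
}
```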

12 if constructs – Q6
– On the 1st attempt the number of wrong answers is given, along with a reminder of the OR (||) operator's effect
– The 2nd attempt has feedback as above, and the 3rd attempt starts with all the correct responses retained
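The || behaviour that Q6's feedback reminds students of is simply that a compound condition is true when either operand is true. A small illustrative function (invented for this sketch, not taken from the quiz):

```javascript
// True when n lies outside the range [low, high]:
// either operand of || being true makes the whole condition true.
function outsideRange(n, low, high) {
    return (n < low) || (n > high);
}
```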

13 Conclusions on Questions
– Most students complete quizzes in 10–15 minutes
– Most/many students get questions right first time, and few fail to be correct after a third attempt – there are exceptions
– Most students find the quizzes valuable and the feedback helpful
– Only one question "didn't work" on the 1st pilot – it was discarded
– Variants of a question generally seemed equivalent

14 iCMAs on Courses
– The philosophy is generally to use them formatively
– However, students need "encouragement" – hence they must form part of assessment
– M150 has 10–20% attempting when not part of assessment
– S104 has a similar number attempting as attempt the TMA (effectively 100% in my tutor group)
– Students take them seriously if part of assessment
– Courses using iCMAs (some in pilot form) include MU120, M150, MST121, M256, S104, S342

15 Building a Quiz for Assessment
– Scope of quiz
– Each question needs 5 variants – and they must be equivalent
– 3 attempts allowed for each question
– Each question requires feedback after each attempt
– Feedback is targeted to the answer given, so wrong answers must be anticipated
– Free text (or symbol) entry is much harder to anticipate
– Correct answers may not be apparent, particularly with free text entry
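The structure the slide implies (equivalent variants, three attempts, feedback targeted to anticipated wrong answers) can be sketched as data plus a marking function. Everything here is hypothetical: the names and shape are invented for illustration and are not OpenMark's or Moodle's API.

```javascript
// Hypothetical question record: equivalent variants, 3 attempts,
// per-variant map of anticipated wrong answers to targeted feedback.
function makeQuestion(variants, maxAttempts) {
    return { variants: variants, maxAttempts: maxAttempts };
}

function mark(question, variantIndex, answer, attempt) {
    var v = question.variants[variantIndex];
    if (answer === v.correct) {
        return { correct: true, feedback: "Correct." };
    }
    if (attempt >= question.maxAttempts) {
        // As on the quizzes: the correct answer is shown at the end.
        return { correct: false, feedback: "The answer is " + v.correct + "." };
    }
    // Targeted feedback for anticipated wrong answers, else generic.
    var f = v.anticipated[answer] || "Try again.";
    return { correct: false, feedback: f };
}

var q = makeQuestion([
    { correct: "==", anticipated: { "=": "= assigns; use == to compare." } }
    // ...four more equivalent variants would follow in a real quiz
], 3);
```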

16 Validating Questions
– Initial testing by the setter
– Further testing by others
– Reviewing student responses:
– for variant equivalence
– for errors – which are then zero-rated

17 Learning from Questions
– What is understood
– What is misunderstood
– How course material might be amended

18 Moodle and/or OpenMark
– OpenMark questions are very versatile
– It is intended to run OpenMark questions through Moodle – work is proceeding
– OpenMark feedback is highly tailored
– OpenMark requires specialist programming
– Moodle has question types
– Moodle feedback is available, though fairly rigid
– Moodle questions can be set up by anyone who is reasonably logical – but there is a learning curve
– Within Moodle, a mixture of OpenMark and Moodle questions should soon be possible

19 A Way Forward (i)
– Identify Question Setter(s) for each iCMA
– Appoint a Coordinator / iCMA expert
– Workshop(s) on question and feedback setting
– Set questions with feedback
– Review questions
– Program questions (OpenMark and Moodle questions separately)
– Test quiz and make corrections (iterative)
– Review quiz and make further changes (iterative)

20 A Way Forward (ii)
– The Coordinator could run workshops
– Reviewers may be others associated with the Block
– Coordinator reviewing could facilitate uniformity of approach
– Perhaps the Coordinator could interface with developers
– An OpenMark developer would be needed
– A Moodle developer is probably desirable (could be the Coordinator)

21 A Vision for Student Studies
(Flow diagram, flattened in this transcript.) For each section: the student studies the section, does the quiz, then moves to the next section; support routes shown include a "Pod" + PPT (IC), tutor advice, and a special session.

