1
Multiple-answer questionnaires can help students self-evaluate their preparation for exams.
Fabrizio Fontana and Matteo Martini Università degli Studi G. Marconi, Italy
2
DL growth
In the last decade we have witnessed a large increase in university and professional courses delivered online. This is due to the advantages of distance learning (DL) in terms of time, availability, etc., but also to the ever-increasing quality of these courses, driven by the evolution of the web and by collaborative learning.
3
An important problem: University Drop-Out Rate
4
Drop-out rate
The student drop-out rate is too high, especially in certain countries and courses. The average value for OECD countries is around 30% (too high). Italy and the US have a drop-out rate greater than 50% (a real, unsolved problem).
5
Drop-out rate: Italy
The drop-out rate reaches 55% considering all courses and cycles; almost 50% drop out in the first year! Data from the Italian ministry for the three-year first-cycle bachelor degree: only 23% graduate in 3 years, and almost 6% are still studying after 9 years!
6
What can counter this trend?
From a survey conducted at Bari University*:
Better quality of learning objects
Specific solutions for working students
Greater availability of teachers
Personalized tutoring
* Suggestions from students of a traditional university
7
University first year
According to Italian statistics, about 50% of students drop out of university during the first year. Students can be afraid when moving from secondary school to university, i.e. when passing from continuous teacher supervision to autonomous study.
8
University first year
Before starting, our students must learn how to study, i.e.:
Develop a personal study method
Organize their work in complete autonomy
Gaining this autonomy is certainly part of students' growth, but if not properly helped, students can drop out of university.
9
First-year drop-out
Starting from these assumptions, we want to build a technology-based self-assessment activity to:
Permit self-evaluation during study
Drive students towards better-organized work
Permit self-evaluation before the final exam
Indirectly measure course quality and the clarity of single topics
10
Assessment requirements
The initial requirements can be summarized as:
Activity with different questions every time
Possibility to choose topics
Self-evaluation by students
Direct teacher control
11
Proposed assessment
Our proposal? An old, well-known and clear approach: the multiple-answer questionnaire… but technologically evolved!
12
Proposed assessment
Adopted features:
15 questions with 4 possible answers, only 1 correct
The test is considered “passed” if at least 11 answers are correct (>70%)
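As an illustration of this pass rule, here is a minimal Python sketch; the function and variable names are assumptions for illustration only, not the system's actual code.

```python
# Hedged sketch of the adopted scoring rule (illustrative, not the actual implementation):
# 15 questions, 4 possible answers each, exactly 1 correct; the test is "passed"
# with at least 11 correct answers (11/15 ~ 73% > 70%).
N_QUESTIONS = 15
PASS_THRESHOLD = 11

def score_test(given_answers, correct_answers):
    """Count correct answers and decide whether the test is passed."""
    n_correct = sum(1 for given, correct in zip(given_answers, correct_answers)
                    if given == correct)
    return n_correct, n_correct >= PASS_THRESHOLD

# Example: a student gets 12 of the 15 questions right.
n_correct, passed = score_test(["A"] * 12 + ["B"] * 3, ["A"] * 15)
print(n_correct, passed)  # 12 True
```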
13
Assessment requirements
Let’s go through the initial requirements…
Activity with different questions every time
Possibility to choose topics
Self-evaluation by students
Direct teacher control
14
Assessment requirements
Let’s go through the initial requirements…
Activity with different questions every time
Possibility to choose topics
Self-evaluation by students
Direct teacher control
15
Max time per question
This value has been chosen taking into account that some questions are not definitions but simple exercises requiring reasoning and facility with formulas and theorems. Moreover, the time limit has been set to give students the chance to think the questions over without time pressure, while not leaving enough time to consult the course materials.
16
Assessment requirements
Let’s go through the initial requirements…
Activity with different questions every time
Possibility to choose topics
Self-evaluation by students
Direct teacher control
17
Self-Evaluation
When the questionnaire is completed, a detailed report is shown on screen with:
Details of correct and wrong answers and, where relevant, the right answers
The possibility to send the report to the student's private e-mail address (for later review) and to the teacher
To encourage students to contact teachers, 3 feedback thresholds have been identified:
Above 70% correct: test passed
Between 30% and 70%: test not passed, with the possibility to send the report to the teacher
Below 30%: a pop-up appears strongly suggesting to send the report and contact the teacher **
** Statistically, answering at random, 25% of the questions would be correct.
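The three feedback thresholds can be summarized in a short sketch; this is a hedged illustration assuming a simple score-to-message mapping, not the actual report and pop-up logic of the system.

```python
# Illustrative mapping of the three feedback thresholds described above
# (>70% passed; 30-70% not passed, report optional; <30% strong warning).
def feedback(n_correct, n_questions=15):
    score = n_correct / n_questions
    if score > 0.70:
        return "Test passed. You may send the report to your teacher."
    elif score >= 0.30:
        return "Test not passed. Consider sending the report to your teacher."
    else:
        # Below 30% is close to the ~25% expected when answering at random,
        # so the system strongly suggests contacting the teacher.
        return "Test not passed. Please send the report and contact your teacher."

print(feedback(12))  # above 70%: passed
print(feedback(7))   # between 30% and 70%
print(feedback(3))   # below 30%
```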
18
Assessment requirements
Let’s go through the initial requirements…
Activity with different questions every time
Possibility to choose topics
Self-evaluation by students
Direct teacher control
19
Teacher control
The teacher's task is to contact students to explain their errors, clarify topics not fully understood and/or suggest additional readings. This creates a direct and efficient contact between teachers and students, with personalized tutoring during study.
20
Quality implication
This questionnaire represents:
A help to students, improving study efficiency and therefore the pass rate
A help to teachers, to measure the quality of single topics
21
Quality implication
By storing the entire set of completed questionnaires, important considerations can be made about both the student community and course quality. Analyzing the reports, instructors can evaluate to what extent the learning materials are unfriendly to the student community and possibly decide to insert new learning objects.
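As a hedged sketch of this kind of analysis (the slides do not show the real data format), stored answers could be aggregated per topic and topics far below the average right-answer rate flagged for review:

```python
# Illustrative per-topic analysis of stored questionnaire results.
# 'records' is assumed to be an iterable of (question_id, topic, was_correct) tuples;
# the real storage format of the system is not described in the slides.
from collections import defaultdict
from statistics import mean, stdev

def flag_hard_topics(records, n_sigma=2.0):
    per_topic = defaultdict(list)
    for _question_id, topic, was_correct in records:
        per_topic[topic].append(1 if was_correct else 0)

    # Fraction of right answers per topic.
    rates = {topic: mean(values) for topic, values in per_topic.items()}
    mu, sigma = mean(rates.values()), stdev(rates.values())

    # Flag topics well below the average, like the ~10% group found in the
    # General Physics case study (about 2.5 sigma below 46% +/- 16%).
    flagged = [topic for topic, rate in rates.items() if mu - rate > n_sigma * sigma]
    return rates, flagged
```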
22
Case Study: General Physics
[Chart: right-answer rate (%) per question, for a sub-sample of 50 questions out of 500]
23
Case Study: General Physics
[Chart: right-answer rate (%) per question, sub-sample of 50 questions out of 500. Overall average: 46% +/- 16%, with most questions averaging around 50% right answers. One group of questions, all relating to a single topic, averages only about 10% right answers, roughly 2.5 sigma below the overall average.]
24
Case Study: General Physics
Even though we still have a small data sample of completed questionnaires (less than 1 academic year, 500 tests), we identified a hard-to-understand topic (Entropy). Then… a new didactical unit has already been prepared and inserted into the LMS to better explain this topic. Over the next weeks we will re-analyze the plot to measure the students' response.
25
Future development
To help instructors who are not skilled or comfortable with computer technology and/or statistical analysis, a specific web interface has been created. The system is hosted on a dedicated server, and each teacher has their own credentials to access it.
26
Future development
A user-friendly page permits teachers to:
Create different tests
Indicate one or more correct answers
Tag every question with an ID for topic selection
Customize the max time per question
Statistically analyze a single student's responses or the results of the entire student community
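A hypothetical data model for such an authoring page might look as follows; every field name here is an assumption for illustration, not the actual schema of the web interface.

```python
# Hypothetical question/test model for the authoring interface (assumed, not the real schema).
from dataclasses import dataclass, field
from typing import List, Set

@dataclass
class Question:
    question_id: str            # ID used to tag the question for topic selection
    topic: str                  # topic the question belongs to
    text: str
    answers: List[str]          # possible answers (4 in the General Physics tests)
    correct: Set[int] = field(default_factory=set)  # indices of the correct answer(s)

@dataclass
class Test:
    title: str
    questions: List[Question]
    max_time_per_question: int = 120  # seconds; customizable by the teacher (value assumed)
```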
27
Conclusion
Starting from the simple idea of a multiple-answer questionnaire, we developed a powerful technology-based learning tool. The didactical utility of this product has been demonstrated. To allow the use of this architecture in every course, a web interface has been realized that simply permits test creation and student/course analysis.