Fabrizio Fontana and Matteo Martini


Multiple-answer questionnaires can help students self-evaluate their exam preparation. Fabrizio Fontana and Matteo Martini, Università degli Studi G. Marconi, Italy

DL growth In the last decade we have witnessed a large increase in university and professional courses delivered online. This is due to the advantages of distance learning (DL) in terms of time, availability, etc., but also to the steadily increasing quality of these courses, driven by the evolution of the Web and by collaborative learning.

An important problem: University Drop-Out Rate

Drop-out rate The student drop-out rate is too high, especially in certain countries and courses. The average for OECD countries is around 30% (already too high); Italy and the US have drop-out rates greater than 50% (a real, unsolved problem).

Drop-out rate: Italy The drop-out rate reaches 55% considering all courses and cycles, and almost 50% of students drop out in the first year! Data from the Italian ministry on three-year first-cycle bachelor degrees: only 23% of students graduate in 3 years, and almost 6% are still studying after 9 years!

What can counter this trend? From a survey conducted at the University of Bari*, suggestions from students of a traditional university: better quality of learning objects; specific solutions for working students; greater availability of teachers; personalized tutoring. *http://www.dip-statistica.uniba.it/html/annali/2010/ANNALI9-19.pdf

University first year According to Italian statistics, about 50% of students drop out of university during the first year. Students can be intimidated by the transition from secondary school to university, i.e. by the passage from continuous teacher supervision to autonomous study.

University first year Before starting, our students must learn how to study, i.e.: develop a personal study method; organize their work in complete autonomy. Gaining self-direction is certainly part of a student's growth, but if not properly helped, our students can drop out of university.

First-year drop-out Starting from these assumptions, we want to build a technology-based self-assessment activity to: permit self-evaluation during study; drive students towards better organized work; permit self-evaluation before the final exam; indirectly measure course quality and the clarity of individual topics.

Assessment requirements The initial requirements can be summarized as: an activity with different questions every time; the possibility to choose topics; self-evaluation by students; direct teacher control.

Proposed assessment Our proposal? An old, well-known and clear approach: the Multiple Answer Questionnaire … but technologically evolved!

Proposed assessment Adopted features: 15 questions, each with 4 possible answers, only 1 of them correct. The test is considered “passed” if at least 11 answers are correct (>70%).
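The pass rule above (at least 11 of 15 correct, i.e. more than 70%) can be sketched in a few lines of Python; the names (`is_passed`, `PASS_THRESHOLD`) are illustrative and not taken from the actual system:

```python
# Pass rule from the slides: 15 questions, 4 options each,
# test passed with at least 11 correct answers (11/15 ≈ 73% > 70%).
NUM_QUESTIONS = 15
PASS_THRESHOLD = 11

def is_passed(num_correct: int) -> bool:
    """Return True if the questionnaire counts as passed."""
    return num_correct >= PASS_THRESHOLD

print(is_passed(11))  # True: 11/15 ≈ 73%
print(is_passed(10))  # False: 10/15 ≈ 67%
```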

Assessment requirements Let's go through the initial requirements: an activity with different questions every time; the possibility to choose topics; self-evaluation by students; direct teacher control.

Max time per question This value was chosen taking into account that some questions are not definitions but simple exercises requiring reasoning and familiarity with formulas and theorems. Moreover, the limit was fixed to give students the possibility to reflect on each question without time pressure, while not leaving enough time to consult the course materials.

Self-evaluation When the questionnaire is completed, a detailed report is shown on screen with: the details of correct and wrong answers and, where applicable, the right answers; the possibility to send the report to the student's private e-mail address (for reflection) and to the teacher. To push students to contact their teachers, 3 feedback thresholds have been identified: above 70% correct, test passed; between 30% and 70%, test not passed, with the possibility to send the report to the teacher; below 30%, a pop-up appears, strongly suggesting to send the report and contact the teacher.** ** Statistically, answering at random, 25% of the questions would be correct.
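The three feedback bands can be sketched as a simple mapping from score to message; the function name and message strings are illustrative, not the system's actual wording:

```python
def feedback(num_correct: int, num_questions: int = 15) -> str:
    """Map a score to one of the three feedback bands described above.

    Note: a purely random guesser would average 25% correct (1 in 4),
    which is why the lowest band is placed below 30%.
    """
    score = num_correct / num_questions
    if score > 0.70:
        return "passed"
    if score >= 0.30:
        return "not passed: you may send the report to your teacher"
    return "not passed: please send the report and contact your teacher"
```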

Teacher control The teacher's task is to contact students to explain their errors, clarify topics not fully understood, and/or suggest additional readings. This creates a direct and efficient contact between teachers and students, with personalized tutoring during study.

Quality implications This questionnaire provides: help to students, improving study efficiency and thus the pass rate; help to teachers in measuring the quality of individual topics.

Quality implications By storing the entire set of completed questionnaires, important conclusions can be drawn about both the student community and course quality. By analyzing the reports, instructors can evaluate to what extent learning materials are unfriendly to the student community and, if needed, decide to insert new learning objects.

Case Study: General Physics [Plot: right-answer percentage per question, for a sub-sample of 50 questions out of 500]

Case Study: General Physics [Plot: right-answer percentage per question, for a sub-sample of 50 questions out of 500; overall average 46% +/- 16%. One group of questions, all relative to a single topic, averages about 10% right answers against about 50% for the rest (for the statistically minded: about 2.5 sigma from the average).]
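The per-question analysis behind this plot (flagging topics whose right-answer rate falls several sigma below the average) can be sketched as follows; the exact statistical procedure used in the case study is not specified in the slides, so this is an assumption, and `flag_hard_questions` is a hypothetical name:

```python
from statistics import mean, stdev

def flag_hard_questions(right_pct, n_sigma=2.5):
    """Flag questions whose right-answer rate lies n_sigma below the average.

    right_pct: per-question percentage of correct answers across all tests.
    Returns the indices of the flagged questions.
    """
    avg = mean(right_pct)
    sd = stdev(right_pct)
    return [i for i, p in enumerate(right_pct) if p < avg - n_sigma * sd]
```

For example, in a sample where most questions sit near 50% right answers, a question down at 10% would be flagged as a candidate hard topic.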

Case Study: General Physics Even though we still have a small sample of completed questionnaires (less than 1 academic year, 500 tests), we identified a hard-to-understand topic (entropy). Then … a new didactical unit has already been prepared and inserted into the LMS to better explain this topic. Over the next weeks we will re-analyze the plot to measure the students' response.

Future development To help instructors who are not skilled in or comfortable with computer technology and/or statistical analysis, a specific web interface has been created. The system is hosted on a dedicated server, and each teacher has their own credentials to access it.

Future development A user-friendly page makes it possible to: create different tests; indicate one or more correct answers per question; tag every question with an ID for topic selection; customize the max time per question; statistically analyze a single student's responses or the results of the entire student community.

Conclusion Starting from the simple idea of a multiple-answer questionnaire, we have developed a powerful technological learning tool. The didactical utility of this product has been demonstrated. To allow this architecture to be used in any course, a web interface has been realized that makes test creation and student/course analysis simple.