
1 TAO - Open Assessment Technologies: The Dewis e-Assessment System

2 We believe the future of education is an open digital ecosystem that unites technologies and accelerates innovation.

3 The Leading Open Source, QTI- and LTI-Compliant Assessment Platform. QTI - Question and Test Interoperability; LTI - Learning Tools Interoperability.

4 The IMS Learning Tools Interoperability (LTI) standard prescribes a way to easily and securely connect learning applications and tools with platforms such as learning management systems (LMS), portals, and learning object repositories, on premises or in the cloud, without the need for expensive custom programming. Using LTI, an interactive assessment application or a virtual chemistry lab can be securely connected to your LMS with a few clicks. LTI comprises a central core plus optional extensions that add further features and functions.
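To make the "secure connection" concrete: an LTI 1.1 launch is an HTML form POST whose parameters are signed with OAuth 1.0a (HMAC-SHA1), so the LMS and the tool can verify each other without custom integration code. The sketch below shows the signing step only; the URL, key, secret, and `resource_link_id` are hypothetical placeholders, and a real deployment would normally use an OAuth/LTI library rather than hand-rolling this.

```python
import base64, hashlib, hmac
from urllib.parse import quote

def sign_lti_launch(url, params, consumer_key, consumer_secret,
                    timestamp, nonce):
    """Return the OAuth 1.0a HMAC-SHA1 signature for an LTI 1.1 launch POST."""
    oauth = {
        "oauth_consumer_key": consumer_key,
        "oauth_signature_method": "HMAC-SHA1",
        "oauth_timestamp": str(timestamp),
        "oauth_nonce": nonce,
        "oauth_version": "1.0",
    }
    all_params = {**params, **oauth}
    # Percent-encode each key/value pair (RFC 3986), sort, and join.
    pairs = sorted((quote(k, safe=""), quote(v, safe=""))
                   for k, v in all_params.items())
    param_str = "&".join(f"{k}={v}" for k, v in pairs)
    # Signature base string: METHOD & encoded-URL & encoded-params.
    base = "&".join(["POST", quote(url, safe=""), quote(param_str, safe="")])
    # LTI 1.1 has no token secret, so the key ends with a bare "&".
    key = quote(consumer_secret, safe="") + "&"
    digest = hmac.new(key.encode(), base.encode(), hashlib.sha1).digest()
    return base64.b64encode(digest).decode()

# Hypothetical launch parameters for illustration only.
launch = {
    "lti_message_type": "basic-lti-launch-request",
    "lti_version": "LTI-1p0",
    "resource_link_id": "quiz-42",
}
sig = sign_lti_launch("https://tao.example.com/lti/launch", launch,
                      "demo-key", "demo-secret", 1700000000, "abc123")
```

The tool provider recomputes the same signature from the received parameters and its copy of the shared secret; a match proves the launch came from a trusted LMS.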

5
 TAO (Test Assisté par Ordinateur) started in late 2002 as a joint project between the Henri Tudor Research Center, now the Luxembourg Institute of Science and Technology (LIST), and the University of Luxembourg.
 In 2013, following years of international adoption of the TAO platform, LIST partnered with Cito International to form Open Assessment Technologies (OAT).
 Powered by the OAT open-source philosophy and driven by a commitment to the common good in education, TAO has evolved into the leading open-source assessment solution for education and employment, with users in 194 countries and 30 languages. To date, TAO has delivered over 30 million tests worldwide.

6
 2007 - PISA selects TAO for its 2009 survey across 20 countries (consortium led by ACER).
 2008 - OECD selects TAO to conduct its 2012 PIAAC (Programme for the International Assessment of Adult Competencies) study of adult skills across 20+ countries in 30+ languages (consortium led by ETS).
 2009 - First public release of TAO 1.0 under the GPL 2 license. OECD selects TAO to conduct its 2012 PISA study across 60+ countries in 80+ languages (consortium led by ACER).
 2011 - OECD selects TAO as its strategic platform to conduct the PISA 2015 study across 70+ countries (consortium led by ETS).
 2012 - OECD selects TAO to expand its PIAAC study to 10 countries (consortium led by ETS).
 2013 - The Swiss EDK selects TAO for monitoring student progress throughout the country.
 2017 - NCSE selects TAO to assess students with learning disabilities across 24 partner and affiliated states for K-12 schools in the U.S.
 2019 - Open Assessment Technologies Inc. opens its Boston, Massachusetts office.


8 Explore the Possibilities of Assessment Creation
TAO Community Edition puts control over your digital assessment platform back in your hands. Own every unique aspect of your assessment experience, from customization to content, and collaborate openly on new solutions with the TAO community.
 100% open source and free to download; simply download and have your internal or external teams install and customize
 Prepackaged Windows installer
 Use TAO as is, or customize it to your needs
 Enhance TAO with your own functionality
 Use your creations in-house, or share them with the TAO community

9 A Deeply Integrated, Custom Assessment Platform
Create a deeply integrated, custom assessment environment that delivers high performance and reduced costs for your organization. TAO Enterprise Edition gives you the flexibility to connect with every technology system you need while enjoying the platform's off-the-shelf features and functionality, enhanced for you. You'll gain a rapid competitive advantage by creating a unified assessment architecture that is secure, powerful, cost-effective, and still completely open.
 A bespoke, end-to-end assessment solution offering the best of both worlds
 Value-added extensions, augmented with features specific to you
 Single-tenant, personalized deployment
 Custom dedicated training and support

10 Create Portable, Standards-Based TEIs
You can now create tests incorporating technology-enhanced items (TEIs) built on the QTI Portable Custom Interaction (PCI) standard, and deliver them through the TAO test driver. Because PCI is an IMS industry standard, these items are even portable to third-party QTI test drivers.


13
 TAO User Guide - Link
 TAO Product Demo - Link
 How to Create a Multiple-Choice Item Using TAO - Link
 TAO Datasheet - Link


16  TAO User Guide - Link


18
 Why TAO - Link
 TAO Product Demo - Link
 How to Create a Multiple-Choice Item Using TAO - Link
 TAO Datasheet - Link
 Tutorial Videos - Link
 TAO User Guide - Link
 7 Best Open Source Exam Software for Online Assessment - Link

19 7 Best Open Source Exam Software for Online Assessment - Link

20 Dr Rhys Gwynllyw and Dr Karen Henderson, Senior Lecturers in Mathematics and UWE Learning and Teaching Fellows, Department of Engineering Design & Mathematics, UWE, Bristol.

21 Intelligent Marking
Made possible by the algorithmic approach to marking and feedback together with loss-less data collection. Fairer to the student and much more representative of how a human would mark (essential for larger compound questions with multiple coupled inputs).
 Common Student Errors (CSE) / Partial Credit: The marking recognises common student errors. In some cases credit may be given if the answer is 'partially' correct, e.g. the student supplies an angle in radians rather than the required degrees. Whether or not the CSE is credited with marks, the triggering of this error should be fed into the feedback supplied to the student.
 Continuation/Follow-on Marking: Where subsequent answers depend on previous ones.
 Verification Marking: For questions with non-unique solutions. Student answers are verified against a necessary and sufficient condition for correctness, e.g. obtain a vector orthogonal to another vector, or select a line on a graph. Also applies to algebraic inputs and many other scenarios.
 Retrospective Marking: Particularly useful in new and/or large assessments. Facilitates analysis of the students' performance well beyond the study of 'marks scored' and metadata. Analysis can highlight new common student errors, which can be fed into existing assessments to alter feedback. Uses DEWIS' loss-less data feature and the extensive Reporter.
 Staged Assessments: The assessment is delivered in stages, e.g. 'progression to' or 'difficulty of' the next stage depends on performance in previous stage(s). (The marking communicates with the assessment generator.)
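Two of the marking ideas above can be sketched in a few lines. This is an illustrative sketch, not the actual DEWIS implementation (the function names and mark weights are invented): verification marking tests the orthogonality condition itself rather than comparing to one model vector, and the CSE rule awards partial credit plus targeted feedback when a radians answer is given instead of degrees.

```python
import math

def mark_orthogonal(answer, target, tol=1e-9):
    """Verification marking: any non-zero vector orthogonal to `target` is
    correct, so test the necessary-and-sufficient condition (zero dot
    product) instead of matching a single model answer."""
    dot = sum(a * t for a, t in zip(answer, target))
    norm = math.sqrt(sum(a * a for a in answer))
    return norm > tol and abs(dot) <= tol * max(1.0, norm)

def mark_angle_degrees(answer, correct_deg, tol=1e-6):
    """CSE / partial credit: full marks for the angle in degrees; partial
    marks and specific feedback if the student answered in radians."""
    if abs(answer - correct_deg) <= tol:
        return 1.0, "correct"
    if abs(answer - math.radians(correct_deg)) <= tol:
        return 0.5, "common error: angle given in radians, not degrees"
    return 0.0, "incorrect"
```

For example, `mark_orthogonal([1, -1, 0], [2, 2, 5])` accepts a vector no model answer anticipated, and `mark_angle_degrees(math.pi / 3, 60.0)` triggers the radians CSE; either way, the triggered error can be logged loss-lessly for retrospective analysis.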

22 Open Source
 The DEWIS system has been designed and developed by a team of mathematicians, statisticians and software engineers at the University of the West of England, Bristol (UWE).
 It is an e-assessment system initially designed for the assessment of mathematics and statistics, but it can also be used in other subject fields.
 At UWE, DEWIS has been used for both formative and summative e-assessments across a number of modules, delivered to students on awards in the fields of Business, Computer Science, Nursing, Software Engineering, Engineering, and Mathematics and Statistics.

23 DEWIS e-Assessment system
 Designed and developed at UWE, first implemented in 2007, and supported by the university. A completely stand-alone, web-based system for both summative and formative assessments.
 Motivated by problems encountered with other e-assessment systems (licence and version problems, lack of support, fragile, inflexible, stressful, a 'have another go' culture).
 Primarily designed for numerate e-assessments; current usage in the fields of Business, Computer Science, Nursing, Engineering, Mathematics and Statistics.
 Users: UWE; satellite colleges in the south-west of England, Sri Lanka, Malaysia, Nepal; Leeds University (Mathematics); mathcentre.
 Open source.
 For more information, see www.cems.uwe.ac.uk/dewis

24
1. Completely stand-alone. Independent of commercial software owing to previous difficulties with licences, support and version updates. However, it includes student access using institution access details.
2. Fully algorithmic in all of the following:
 Question parameter generation, including reverse engineering and (constrained) random variables.
 Marking (allows for 'intelligent marking').
 Feedback.
 Assessment performance analysis.
 Interaction with other computer systems/languages (R, Python).
3. Loss-less data collection, with easy and direct access to this data through a comprehensive management system.
4. Academics are able to design, develop and own their own questions.
5. Question types: numerical, algebraic expressions, multiple choice, multiple selection, text, matrix, graphical; compound questions.
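The "reverse engineering" idea in point 2 can be sketched as follows. This is an illustrative sketch, not DEWIS code (the function name and parameter ranges are invented): rather than randomising the question and hoping the answer is clean, choose the intended answer first and derive the remaining parameters from it.

```python
import random

def generate_linear_question(rng):
    """Generate 'solve a*x + b = c' so the solution x is a small integer:
    pick x and a first, then reverse-engineer c so the numbers work out."""
    x = rng.randint(-9, 9)                                  # intended answer
    a = rng.choice([n for n in range(-9, 10) if n not in (0, 1)])
    b = rng.randint(-20, 20)
    c = a * x + b                                           # derived, so x is exact
    # Question text formatting is deliberately simplified for the sketch.
    return {"a": a, "b": b, "c": c,
            "question": f"Solve {a}x + {b} = {c}", "answer": x}

rng = random.Random(2024)    # in practice, seeded per student for unique papers
q = generate_linear_question(rng)
```

Seeding the generator per student gives every student a different but equally difficult question, while the marker can regenerate the exact parameters from the seed, which is what makes loss-less retrospective re-marking possible.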

25 Here we supply a brief summary of what we consider to be the key features of DEWIS:
 Algorithmic question generation, marking and feedback
 Different question input types
 Academics' management
 Loss-less data collection
 Student-friendly features
 Independence from commercial software
 Robust and efficient
Demonstrations:
 View DEWIS at mathcentre, to see examples of assessments in anonymous formative mode (opens in a new window).
 HEA STEM presentation, part of the 'E-assessment in the Mathematical Sciences' workshop (Middlesex University, 2014).
 Showcase questions, to see examples of individual questions that may be included in assessments.
 View the public question bank (password protected; contact us to obtain the password).
 View a sample Reporter.
 SCORM package example.


29
 The Dewis e-Assessment System - Link
 The Dewis Examples - Link
 Demo 1 - Link
 Demo 2 - Link
 Introduction to DEWIS: motivation and key aims (7m 44s)
 Example Assessment 1: illustrating the algorithmic features (15m 41s)
 Example Assessment 2: large coupled assessment using DEWIS' communication with the 'R' statistical package (4m 44s)
 Example Assessment 3: analysing the students' performance using the Reporter and using retrospective marking (5m 38s)
 Summary (2m 46s)
 The PowerPoint presentation associated with this talk is available here.

30 Work in progress:
 Engineering diagnostic tests.
 More statistics questions using 'R'.
 Documentation refresh.
If you are interested in using DEWIS then please contact:
 rhys.gwynllyw@uwe.ac.uk (system development and deployment)
 karen.henderson@uwe.ac.uk (question bank development and assessment deployment)
or visit the welcome page: www.cems.uwe.ac.uk/dewis

