Metadata in the Educational Sciences

Metadata in the Educational Sciences
Dr. Ingo Barkow, Associate Professor for Operational Data Management

Agenda
Introduction
Metadata in the educational sciences
The Swiss National School Monitoring (ÜGK / COFO)
An example process for metadata management in ÜGK
Where DDI can help
Where DDI (currently) cannot help
Q&A

Obligatory Tourism Slide
University of Applied Sciences HTW Chur
1,600 students
3 departments (Tourism, Management, Information Science)
Swiss Institute for Information Science
Master of Science (MSc) in Information Science and Data Management
Project Team for Research Data Management

Metadata in the educational sciences
The research process in the educational sciences is very similar to that in the social sciences
Models like the Generic Longitudinal Business Process Model (GLBPM) can also be applied here
Most large-scale studies (e.g. PISA, PIAAC, PIRLS, TIMSS) use a mix of questionnaires and cognitive items
There is a shift from paper-based assessment towards computer-based assessment
Metadata standards (e.g. QTI, SCORM) exist, but only for very specific parts of the research process

Metadata in the educational sciences
Specific challenges in computer-based assessment:
Layout and screen size must be fixed to avoid changes in item difficulty
Scoring rules have to be implemented (and documented)
Very complex item types (e.g. simulations)
Mode effects (e.g. paper to computer, computer to tablet)
Use of logfile data analysis, e.g. for diagnostic assessment
Statistical parameters (e.g. Cronbach's alpha) have to be stored to support adaptive testing (a minimal computation sketch follows below)
Items are usually highly confidential (e.g. SAT, GMAT)
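To illustrate the kind of statistical parameter that has to travel with the item metadata, here is a minimal, self-contained sketch of computing Cronbach's alpha from a respondents-by-items score matrix. The function name and the example data are invented for illustration and are not part of the ÜGK toolset.

```python
import numpy as np

def cronbachs_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha for a respondents x items score matrix.

    alpha = k / (k - 1) * (1 - sum of item variances / variance of total scores)
    """
    k = scores.shape[1]                          # number of items in the scale
    item_vars = scores.var(axis=0, ddof=1)       # variance of each item
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of the total score
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Invented example: 5 respondents, 4 dichotomously scored items
responses = np.array([
    [1, 1, 1, 0],
    [1, 0, 1, 0],
    [0, 0, 1, 0],
    [1, 1, 1, 1],
    [0, 0, 0, 0],
])
print(round(cronbachs_alpha(responses), 3))  # statistic to be stored with the item metadata
```

For this example matrix the script prints 0.8, the value that would then be stored alongside the scale's items to support adaptive testing.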

Other metadata standards in ÜGK
Some interactions from IMS Question and Test Interoperability (QTI); a minimal ChoiceInteraction example is sketched after this list:
ChoiceInteraction (multiple-choice items with one or more correct answers)
OrderInteraction (sorting answers in a list)
AssociateInteraction (building pairs of answers)
MatchInteraction (assigning correct responses to a matrix)
GapMatchInteraction (matrix with open answers)
InlineChoiceInteraction (choice of different answers within a text)
TextEntryInteraction (open text boxes within different media)
ExtendedTextInteraction (longer text box for free-text passages)
HotTextInteraction (choices of several text answers within a text)
HotSpotInteraction (clickable areas in a graphic)
SelectPointInteraction (marking an area in a graphic)
GraphicOrderInteraction (marking hotspots with numbers)
GraphicAssociateInteraction (building relations between hotspots in graphics)
GraphicGapMatchInteraction (graphics with text boxes to enter data)
PositionObjectInteraction (similar to HotSpot, but a predefined graphic can be used)
SliderInteraction (offers a slider bar for a numeric answer)
DrawingInteraction (offers a sketchboard for drawing)
UploadInteraction (data created by another tool can be attached as an answer)
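To give a feel for what these interactions look like in practice, the following sketch serialises a single QTI 2.1 item containing a ChoiceInteraction, using only the Python standard library. The item identifier, prompt and answer options are invented; the element and attribute names follow the QTI 2.1 information model, but this is an illustration, not output of the ÜGK toolchain.

```python
import xml.etree.ElementTree as ET

QTI_NS = "http://www.imsglobal.org/xsd/imsqti_v2p1"  # QTI 2.1 namespace
ET.register_namespace("", QTI_NS)

def q(tag: str) -> str:
    """Qualify a tag name with the QTI namespace."""
    return f"{{{QTI_NS}}}{tag}"

# Root element of a single assessment item (identifier and title are invented)
item = ET.Element(q("assessmentItem"), {
    "identifier": "uegk_math_demo_01", "title": "Demo item",
    "adaptive": "false", "timeDependent": "false",
})

# Declare the response variable and its correct value
resp = ET.SubElement(item, q("responseDeclaration"), {
    "identifier": "RESPONSE", "cardinality": "single", "baseType": "identifier",
})
ET.SubElement(ET.SubElement(resp, q("correctResponse")), q("value")).text = "A"

# Item body with a ChoiceInteraction (one correct answer out of three)
body = ET.SubElement(item, q("itemBody"))
choice = ET.SubElement(body, q("choiceInteraction"), {
    "responseIdentifier": "RESPONSE", "shuffle": "false", "maxChoices": "1",
})
ET.SubElement(choice, q("prompt")).text = "Which of these numbers is prime?"
for ident, label in [("A", "7"), ("B", "8"), ("C", "9")]:
    ET.SubElement(choice, q("simpleChoice"), {"identifier": ident}).text = label

print(ET.tostring(item, encoding="unicode"))
```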

ÜGK Mathematics Item (screenshot; DIPF)

PIAAC Problem Solving Item (screenshot; DIPF)

CBA ItemBuilder MicroFIN Item (screenshot; DIPF)

Background: «EDK» and «HarmoS»
In Switzerland, the main responsibility for education and culture lies with the cantons. They coordinate their work at the national level. The 26 cantonal ministers of education (called “directors of education”) together form a political body to carry out this work: the Swiss Conference of Cantonal Ministers of Education (EDK). Legally binding intercantonal agreements (known as concordats) form the foundation for the work of the EDK. One of these agreements is the HarmoS concordat, the Intercantonal Agreement on the Harmonization of Compulsory Education. It came into effect in 2009 and aims at harmonizing some cornerstones of the educational system nationwide.

The Swiss National Objectives in Education and «ÜGK»
The Swiss National Objectives in Education are some of these cornerstones of the HarmoS concordat. They were developed by 2011 and describe which core skills students in all cantons should have obtained after 2, 6 and 9 years of school in languages, math and natural sciences. They are now part of the curricula in all cantons.
In 2013 it was decided how to assess these objectives and whether the harmonization has been successful. To this end the Swiss National Core Skills Assessment Programme (in German: ÜGK – Überprüfung der Grundkompetenzen) was initiated.
First survey in spring 2016: math, 9th school year
Second survey in spring 2017: languages, 6th school year
The cantons are thinking about institutionalizing these assessments and coordinating them with the PISA assessments. 15 partners are conducting these surveys.

Tasks of HTW Chur in «ÜGK»
Leading the work package “IT support and data management”
IT support:
Coordinating and organizing the hardware with the service provider
Coordinating the use and set-up of software and servers with DIPF
Optimizing and developing the tools for survey management
Data management:
Introducing and optimizing the data management
Introducing metadata standards for educational data
Consulting for FORS regarding the preservation of educational data and metadata

A Tool for Survey Management
Will be developed for PISA and ÜGK; development started in spring 2017
The tool should support the fieldwork and data management in an adequate manner:
Support the administration and field monitoring during the data collection process
Optimize the data documentation process by generating documentation and metadata as a byproduct, without additional effort during the data collection process (see the sketch below)
The tool is based on a modified version of Rogatus Survey 2.0 (specifically the case management system)
The current version only supports CAPI – it will be modified for the use case “data collections in schools”
Should be able to use different metadata standards for preservation
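The following sketch is purely illustrative of the “documentation as a byproduct” idea: it does not use the Rogatus or ÜGK APIs (all class and function names are hypothetical), but shows how variable-level documentation can be derived from the instrument definition and the responses that the case management system collects anyway.

```python
from dataclasses import dataclass, asdict
import json

# Hypothetical structures: neither Rogatus nor the ÜGK tool exposes this API;
# the sketch only illustrates deriving documentation from collected data.

@dataclass
class CollectedResponse:
    case_id: str
    variable: str
    value: str
    timestamp: str          # paradata captured anyway during fieldwork

@dataclass
class VariableDocumentation:
    name: str
    label: str
    observed_values: list

def document(instrument: dict, responses: list) -> list:
    """Derive variable-level documentation from the instrument definition
    and the responses that were collected anyway."""
    docs = []
    for name, label in instrument.items():
        values = sorted({r.value for r in responses if r.variable == name})
        docs.append(VariableDocumentation(name, label, values))
    return docs

instrument = {"M101": "Mathematics item 1", "SEX": "Gender of student"}
responses = [
    CollectedResponse("case-001", "M101", "B", "2017-05-02T09:14:03"),
    CollectedResponse("case-001", "SEX", "female", "2017-05-02T09:14:10"),
    CollectedResponse("case-002", "M101", "A", "2017-05-02T09:15:41"),
]
print(json.dumps([asdict(d) for d in document(instrument, responses)], indent=2))
```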

Metadata-driven research process
Research project information is stored in the survey management system
Items are enriched with metadata during the design process and stored in an item bank
Data collection uses the survey / case management system, thus creating paradata / log files
All metadata, paradata and data are stored in the research data facility
Metadata and data are linked to the existing publication (enhanced publication)
All items and results are available for re-use in the research data facility
The research data facility ensures long-term preservation through rollover processes for format changes
The following slides show examples for all processes, using predecessors of the upcoming toolset

Metadata-driven Survey Management
Software: IAB Metadata Portal (TBA21)
Metadata Standard: DDI Lifecycle 3.2

Metadata-driven Item Design & Item Banking
Software: TIPO Item Portal (TBA21/DIPF) / TAO (o.a.t. S.A.)
Metadata Standard: QTI v2.1

Metadata-driven Data Collection Process (Mobile)
Software: Rogatus Survey / Aitema Mobile Client (TBA21)
Metadata Standard: DDI Lifecycle 3.2 (not for paradata)

Metadata-driven Data Dissemination
Software: Rogatus Survey / Administrator Panel (TBA21)
Metadata Standard: DDI Lifecycle 3.2

Metadata-driven Publication Process
Software: Invenio 0.99 (CERN)
Metadata Standards: MARC21 / DataCite
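As a rough illustration of the publication-level metadata involved, the sketch below assembles a DataCite-style record reduced to the required properties. The DOI, names and field values are invented, and the JSON layout is a loose rendering of the DataCite kernel, not the exact payload format of the DataCite or Invenio APIs.

```python
import json

# Invented example values; only DataCite's required properties are shown.
datacite_record = {
    "identifier": {"identifier": "10.9999/example.uegk.2016", "identifierType": "DOI"},
    "creators": [{"creatorName": "Swiss National Core Skills Assessment Programme (ÜGK)"}],
    "titles": [{"title": "ÜGK 2016: Mathematics, 9th school year (demo record)"}],
    "publisher": "FORS",
    "publicationYear": "2018",
    "resourceType": {"resourceTypeGeneral": "Dataset", "value": "Survey and assessment data"},
}
print(json.dumps(datacite_record, indent=2, ensure_ascii=False))
```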

Metadata-driven Secondary Data Analysis / Retrospective
Software: FORSbase (FORS)
Metadata Standard: DDI Lifecycle 3.2

Where DDI can help
DDI Lifecycle v3.2 can be used in «ÜGK» to store survey information and questionnaire design, including routing
It can be the standard for forwarding these metadata and data from data collection towards the data archive
Currently it can be used for the questionnaires as well as very simple cognitive items (items with a simple stimulus like a picture plus a multiple-choice battery); see the sketch below
The upcoming survey tool is DDI-based, as is the repository at FORS
A new project between HTW and FORS will create a common DDI Profile based on the former IAB metadata system (meaning it also contains common elements from previous versions of Questasy and Colectica)
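The sketch below illustrates the kind of read access this enables: it extracts question texts from a small DDI-like fragment. The element names (QuestionItem, QuestionText, LiteralText, Text) follow DDI Lifecycle, but the fragment is simplified and uses a placeholder namespace, so it is not a valid DDI 3.2 instance.

```python
import xml.etree.ElementTree as ET

# Illustrative fragment only: simplified structure and a placeholder namespace,
# so the snippet does not depend on the exact DDI 3.2 namespace URIs.
DDI_FRAGMENT = """
<QuestionScheme xmlns="urn:example:ddi-like">
  <QuestionItem id="q1">
    <QuestionText><LiteralText><Text>How many books are in your home?</Text></LiteralText></QuestionText>
  </QuestionItem>
  <QuestionItem id="q2">
    <QuestionText><LiteralText><Text>How often do you read in your spare time?</Text></LiteralText></QuestionText>
  </QuestionItem>
</QuestionScheme>
"""

def local(tag: str) -> str:
    """Strip the namespace from an ElementTree tag ('{ns}Name' -> 'Name')."""
    return tag.rsplit("}", 1)[-1]

root = ET.fromstring(DDI_FRAGMENT)
for elem in root.iter():
    if local(elem.tag) == "QuestionItem":
        texts = [t.text for t in elem.iter() if local(t.tag) == "Text"]
        print(elem.get("id"), "->", texts[0] if texts else "(no question text)")
```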

Where DDI (currently) cannot help
DDI Lifecycle v3.2 lacks the following features for computer-based assessment:
Layout and screen size (to avoid changes in item difficulty)
Scoring rules
Response domains like “point-and-click”, “hotspot” or “highlighting”
Use of paradata / logfile data analysis, e.g. for diagnostic assessment
Statistical parameters (e.g. Cronbach's alpha) that have to be stored with the cognitive item to support adaptive testing
Support for more complex item types like simulations
(See the sketch below for the kind of item information that currently has to live outside DDI.)
Also very important are the modifications due to the shift to DDI Lifecycle 4.0: does DDI 4 contain the same routing capabilities as DDI 3.2, so that a questionnaire or test can be rendered directly? Otherwise DDI will be useless to survey organisations.
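As an illustration of what currently has to be stored outside DDI, here is a hypothetical “sidecar” record for a single cognitive item. None of these fields exist in DDI Lifecycle 3.2 and all values are invented; the point is only to show the layout, scoring, IRT and paradata information a study like ÜGK must keep elsewhere.

```python
from dataclasses import dataclass, field
import json

# Hypothetical sidecar record: fields and values are invented for illustration
# and do not correspond to any DDI Lifecycle 3.2 element.

@dataclass
class AssessmentItemSidecar:
    item_id: str                     # reference to the item documented in DDI / QTI
    screen_width_px: int             # fixed layout to keep item difficulty stable
    screen_height_px: int
    scoring_rule: str                # e.g. a short expression evaluated by the delivery system
    response_domain: str             # e.g. "hotspot", "point-and-click", "highlighting"
    irt_difficulty: float            # item parameters needed for adaptive testing
    irt_discrimination: float
    cronbachs_alpha_scale: float     # reliability of the scale the item belongs to
    log_events: list = field(default_factory=list)  # paradata event types to capture

item = AssessmentItemSidecar(
    item_id="uegk_math_demo_01",
    screen_width_px=1280, screen_height_px=800,
    scoring_rule="score = 1 if response == 'A' else 0",
    response_domain="multiple-choice",
    irt_difficulty=-0.42, irt_discrimination=1.1,
    cronbachs_alpha_scale=0.87,
    log_events=["item_shown", "option_clicked", "item_submitted"],
)
print(json.dumps(item.__dict__, indent=2))
```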

Thanks for your attention.