Assessment of the Swiss National Objectives in Education

Assessment of the Swiss National Objectives in Education
Ingo Barkow, Catharina Wasner
University of Applied Sciences HTW Chur, Switzerland
EDDI16 – 8th Annual European DDI User Conference, December 6, 2016, Cologne

Overview
- Surveys about the national objectives in education in Switzerland
- Tasks of HTW Chur: IT support and data management
- A tool for survey management
- Where DDI helps
- Where DDI does not help
- Other metadata standards to be considered

Background: «EDK» and «HarmoS»
In Switzerland, the main responsibility for education and culture lies with the cantons, which coordinate their work at the national level. The 26 cantonal ministers of education (called "directors of education") together form a political body to carry out this work: the Swiss Conference of Cantonal Ministers of Education (EDK). Legally binding intercantonal agreements (known as concordats) form the foundation of the EDK's work. One of these agreements is the HarmoS concordat, the Intercantonal Agreement on the Harmonization of Compulsory Education. It came into effect in 2009 and aims to harmonize key cornerstones of the educational system nationwide.

The Swiss National Objectives in Education and «ÜGK»
The Swiss National Objectives in Education are some of these cornerstones of the HarmoS concordat. Developed by 2011, they describe which core skills students in all cantons should have acquired after 2, 6 and 9 years of school in languages, mathematics and the natural sciences, and they are now part of the curricula of all cantons. In 2013 it was decided how to assess these objectives and whether the harmonization has been successful. To this end, the Swiss National Core Skills Assessment Programme (in German: ÜGK – Überprüfung der Grundkompetenzen) was initiated.
- Samples in each canton: 25,000 students in total
- First survey in spring 2016: mathematics, 9th school year
- Second survey in spring 2017: languages, 6th school year
- 15 partners conduct these surveys
- The cantons are considering institutionalizing these assessments and coordinating them with the PISA assessments

Tasks of HTW Chur in «ÜGK»
Leading the work package "IT support and data management".
IT support:
- Coordinating and organizing the hardware with the service provider
- Coordinating the use and setup of software and servers with DIPF
- Optimizing and developing the tools for survey management
Data management:
- Introducing and optimizing the data management
- Introducing metadata standards for educational data
- Consulting for FORS regarding the preservation of educational data and metadata

A Tool for Survey Management
- Will be developed for PISA and ÜGK; development will probably start in spring 2017
- The tool should support fieldwork and data management adequately:
  - Support administration and field monitoring during the data collection process
  - Optimize the data documentation process by generating documentation and metadata as a byproduct, without additional effort during data collection
- Based on a modified version of Rogatus Survey 2.0 (specifically the case management system)
- The current version only supports CAPI and will be modified for the use case "data collections in schools"
- Should be able to use different metadata standards for preservation

Metadata-driven research process
- Research project information is stored in the survey management system
- Items are enriched with metadata during the design process and stored in an item bank
- Data collection uses the survey / case management system, thus creating paradata / log files
- All metadata, paradata and data are stored in the research data facility
- Metadata and data are linked to existing publications (enhanced publications)
- All items and results are available for re-use in the research data facility
- The research data facility ensures long-term preservation through rollover processes for format changes
Examples for all of these processes follow on the next slides; a small sketch of how the pieces stay linked is given below.
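To illustrate the hand-over points in this chain, here is a minimal Python sketch. It assumes nothing about the actual Rogatus, TIPO or FORSbase data models; all class and field names are hypothetical and only show how study, item and paradata records could stay linked through shared identifiers from design to archiving.

```python
# Hypothetical sketch of the hand-over objects in the metadata-driven process.
# Names are illustrative only; they do not mirror any Rogatus or FORSbase API.
from dataclasses import dataclass, field
from datetime import datetime
from typing import List


@dataclass
class Item:
    item_id: str            # reused in the item bank, delivery and archive
    qti_interaction: str    # e.g. "choiceInteraction"
    ddi_question_urn: str   # link back to the DDI QuestionItem


@dataclass
class ParadataEvent:
    item_id: str
    event: str              # e.g. "item_shown", "answer_changed"
    timestamp: datetime


@dataclass
class Study:
    study_id: str           # carried over from the survey management system
    title: str
    items: List[Item] = field(default_factory=list)
    paradata: List[ParadataEvent] = field(default_factory=list)


def archive_bundle(study: Study) -> dict:
    """Assemble one deposit package: data, metadata and paradata stay linked
    through the shared study and item identifiers."""
    return {
        "study": study.study_id,
        "items": [i.item_id for i in study.items],
        "paradata_events": len(study.paradata),
    }
```

In this picture, archive_bundle would be the shape of what the data collection hands over to the research data facility; because the identifiers are shared, the archive can later resolve every response and log entry back to its item and questionnaire metadata.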

Metadata-driven Survey Management
Software: IAB Metadata Portal (TBA21)
Metadata Standard: DDI Lifecycle 3.2

Metadata-driven Item Design & Item Banking
Software: TIPO Item Portal (TBA21/DIPF) / TAO (Open Assessment Technologies S.A.)
Metadata Standard: QTI v2.1

Metadata-driven Data Collection Process (Mobile)
Software: Rogatus Survey / Aitema Mobile Client (TBA21)
Metadata Standard: DDI Lifecycle 3.2 (not for paradata)

Metadata-driven Data Dissemination
Software: Rogatus Survey / Administrator Panel (TBA21)
Metadata Standard: DDI Lifecycle 3.2

Metadata-driven Publication Process
Software: Invenio 0.99 (CERN)
Metadata Standards: MARC21 / DataCite

Metadata-driven Secondary Data Analysis / Retrospective
Software: FORSbase (FORS)
Metadata Standard: DDI Lifecycle 3.2

Where DDI can help
- The research process in the educational sciences is very similar to that in the social sciences
- Most large-scale studies (e.g. PISA, PIAAC, PIRLS, TIMSS) use a mix of questionnaires and cognitive items
- DDI Lifecycle v3.2 can be used in "ÜGK" to store survey information and questionnaire design, including routing
- It can be the standard for forwarding these metadata and data from the data collection to the data archive
- Currently it can be used for the questionnaires as well as for very simple cognitive items (items with a simple stimulus, such as a picture, plus a multiple-choice battery); a sketch of such an item follows below
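As an illustration of the last point, a simple cognitive item can already be expressed with ordinary DDI questionnaire constructs. The sketch below puts an abridged, DDI-3.2-style QuestionItem into a Python string and reads it back with the standard library. The URNs and texts are invented and the fragment is deliberately incomplete (the picture stimulus and the code list are omitted), so it would not validate against the DDI 3.2 schema; it only shows the general shape.

```python
# Abridged, DDI-Lifecycle-3.2-style fragment for a simple cognitive item:
# a question text plus a multiple-choice battery whose answer options live in
# a referenced code list. URNs and texts are invented; not schema-valid.
import xml.etree.ElementTree as ET

DDI_FRAGMENT = """
<d:QuestionItem xmlns:d="ddi:datacollection:3_2" xmlns:r="ddi:reusable:3_2">
  <r:URN>urn:ddi:example.uegk:QI_math_001:1.0.0</r:URN>
  <d:QuestionText>
    <d:LiteralText>
      <d:Text>Which of the four diagrams shows the same function?</d:Text>
    </d:LiteralText>
  </d:QuestionText>
  <!-- the picture stimulus and the referenced code list are omitted here -->
  <d:CodeDomain>
    <r:CodeListReference>
      <r:URN>urn:ddi:example.uegk:CL_answers_001:1.0.0</r:URN>
    </r:CodeListReference>
  </d:CodeDomain>
</d:QuestionItem>
"""

ns = {"d": "ddi:datacollection:3_2", "r": "ddi:reusable:3_2"}
question = ET.fromstring(DDI_FRAGMENT)
print(question.find("d:QuestionText/d:LiteralText/d:Text", ns).text)
```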

Where DDI currently cannot help
DDI Lifecycle v3.2 lacks the following features for computer-based assessment:
- Layout and screen size, to avoid changes in item difficulty
- Scoring rules
- Response domains such as "point-and-click", "hotspot" or "highlighting"
- Use of paradata / log file data analysis, e.g. for diagnostic assessment
- Statistical parameters (e.g. Cronbach's alpha) have to be stored with the cognitive item to support adaptive testing (see the sketch below)
- Support for more complex item types such as simulations
Also very important are the modifications due to the shift to DDI Lifecycle 4.0: does DDI 4 contain the same routing capabilities as DDI 3.2, so that a questionnaire or test can be rendered directly? Otherwise DDI will be useless to survey organisations.
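To make the adaptive-testing point concrete, the sketch below shows the kind of side-car record a delivery engine needs next to each cognitive item. DDI 3.2 has no dedicated element for these statistics, so both the record structure and the toy selection rule are entirely hypothetical, project-specific constructs.

```python
# Hypothetical side-car record for the statistics an adaptive test needs per
# item; DDI Lifecycle 3.2 has no dedicated place for them, so in practice they
# would travel next to the item metadata in some project-specific form.
from dataclasses import dataclass
from typing import List


@dataclass
class ItemStatistics:
    ddi_question_urn: str    # link to the DDI QuestionItem
    difficulty_b: float      # IRT difficulty ("b" parameter)
    discrimination_a: float  # IRT discrimination ("a" parameter)
    scale_alpha: float       # Cronbach's alpha of the item's scale


def next_item(candidates: List[ItemStatistics],
              ability_theta: float) -> ItemStatistics:
    """Toy item selection: pick the item whose difficulty is closest to the
    current ability estimate (a stand-in for maximum-information selection)."""
    return min(candidates, key=lambda s: abs(s.difficulty_b - ability_theta))
```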

PIAAC Problem Solving Item (DIPF) [example screenshot]

Other metadata standards in ÜGK
Some interactions from IMS Question and Test Interoperability (QTI) (a minimal example follows after the list):
- ChoiceInteraction (multiple-choice items with one or more correct answers)
- OrderInteraction (sorting answers in a list)
- AssociateInteraction (building pairs of answers)
- MatchInteraction (assigning correct responses to a matrix)
- GapMatchInteraction (matrix with open answers)
- InlineChoiceInteraction (choice of different answers within a text)
- TextEntryInteraction (open text boxes within different media)
- ExtendedTextInteraction (longer text box for free text passages)
- HotTextInteraction (choices of several text answers within a text)
- HotSpotInteraction (clickable areas in a graphic)
- SelectPointInteraction (marking an area in a graphic)
- GraphicOrderInteraction (marking hotspots with numbers)
- GraphicAssociateInteraction (building relations between hotspots in graphics)
- GraphicGapMatchInteraction (graphics with text boxes to enter data)
- PositionObjectInteraction (similar to HotSpot, but a predefined graphic can be used)
- SliderInteraction (offers a slider bar for a numeric answer)
- DrawingInteraction (offers a sketchboard for drawing)
- UploadInteraction (data created by another tool can be attached as an answer)
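To make the list concrete, a minimal item using the first of these interactions (ChoiceInteraction) might look as follows. The identifiers, prompt and choices are invented, but the element structure follows the QTI 2.1 information model; the few lines of Python only demonstrate that the fragment parses and which choices it declares.

```python
# Minimal QTI 2.1 item with a choiceInteraction (one correct answer).
# Identifiers and texts are invented; element structure follows QTI 2.1.
import xml.etree.ElementTree as ET

QTI_ITEM = """
<assessmentItem xmlns="http://www.imsglobal.org/xsd/imsqti_v2p1"
                identifier="uegk_demo_item" title="Demo item"
                adaptive="false" timeDependent="false">
  <responseDeclaration identifier="RESPONSE" cardinality="single" baseType="identifier">
    <correctResponse><value>B</value></correctResponse>
  </responseDeclaration>
  <outcomeDeclaration identifier="SCORE" cardinality="single" baseType="float"/>
  <itemBody>
    <choiceInteraction responseIdentifier="RESPONSE" shuffle="true" maxChoices="1">
      <prompt>Which value is largest?</prompt>
      <simpleChoice identifier="A">1/3</simpleChoice>
      <simpleChoice identifier="B">0.5</simpleChoice>
      <simpleChoice identifier="C">0.25</simpleChoice>
    </choiceInteraction>
  </itemBody>
  <responseProcessing
      template="http://www.imsglobal.org/question/qti_v2p1/rptemplates/match_correct"/>
</assessmentItem>
"""

ns = {"qti": "http://www.imsglobal.org/xsd/imsqti_v2p1"}
item = ET.fromstring(QTI_ITEM)
print([c.get("identifier")
       for c in item.findall(".//qti:simpleChoice", ns)])   # ['A', 'B', 'C']
```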

Conclusion
- DDI 3.2 is well suited for questionnaires, but lacks some features needed to be the leading standard for the educational sciences
- More work on paradata and response domains is needed if this domain is to be covered as well
- Nevertheless, it is the standard of choice for many parts of the ÜGK survey

Thank you for your attention!