CORE – COmmon Reference Environment: How it works. Jean-Pierre Kent, 11 January 2012.


CORE – COmmon Reference Environment: How it works. Jean-Pierre Kent, 11 January 2012

Contents: Introduction – Design ≠ implementation; Overview – a model of the user's experience; Presentation of the information model

Design ≠ Implementation. Goal of design: deliver a concept that can contribute to the industrialisation of official statistics (ref: HLG-BAS Vision). Result: a model that exceeds the capacity of a one-year project.

Design ≠ Implementation. Goal of implementation: deliver a proof of concept that is platform-independent and model-driven. Result: an implementation of a subset of the model. (Diagram: design environment vs. execution environment.)

What is Platform Independence? Once implemented, a service can run on any platform (e.g. .NET, Java, ...). A process engine running on one platform (e.g. Java) can control services running on another platform (e.g. .NET). A process can be distributed: manage microdata at Statistics Netherlands, do aggregation at ISTAT, produce the SDMX output at Eurostat, all under the control of a process engine running at INSEE.
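The distribution idea above can be sketched in a few lines: the process engine sees only a uniform service interface, so which platform (or which institute) actually runs each service is irrelevant to the flow. All class, function, and platform labels below are illustrative stand-ins, not part of CORE.

```python
# Sketch: a process engine driving services through one uniform interface.
# The engine never sees the platform behind a service; the platform labels
# below merely mark where each service would notionally run.

class Service:
    """Platform-neutral wrapper around a service implementation (illustrative)."""
    def __init__(self, name, platform, fn):
        self.name, self.platform, self.fn = name, platform, fn

    def execute(self, data):
        return self.fn(data)

def manage_microdata(data):   # e.g. runs on .NET at Statistics Netherlands
    return [r for r in data if r > 0]

def aggregate(data):          # e.g. runs on Java at ISTAT
    return sum(data)

class ProcessEngine:
    """Process engine (e.g. running at INSEE) chaining services in order."""
    def run(self, services, data):
        for s in services:
            data = s.execute(data)
        return data

steps = [Service("microdata", ".NET", manage_microdata),
         Service("aggregate", "Java", aggregate)]
result = ProcessEngine().run(steps, [3, -1, 4])
print(result)  # 7
```

The engine only ever calls `execute`, so swapping a Java service for a .NET one changes nothing in the process definition.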

Model-Driven: cutting costs. Traditionally: 1. Designer makes models. 2. Developer creates the system. 3. System runs. In a model-driven environment: 1. Designer makes models. 2. System runs.
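The contrast can be made concrete with a toy sketch: in the model-driven case the designer's model is plain data, and a generic engine interprets it directly, so no developer step sits between the model and the running system. The operation names and model structure below are invented for illustration.

```python
# Toy model-driven sketch: the "model" is pure data produced by a designer;
# a generic engine interprets it directly -- no hand-written system code
# per process.

OPERATIONS = {
    "filter_positive": lambda xs: [x for x in xs if x > 0],
    "total":           lambda xs: sum(xs),
}

def run_model(model, data):
    """Generic engine: executes whatever steps the model declares."""
    for step in model["steps"]:
        data = OPERATIONS[step](data)
    return data

# 1. Designer makes the model...
model = {"process": "demo", "steps": ["filter_positive", "total"]}
# 2. ...system runs. (No developer step in between.)
print(run_model(model, [5, -2, 7]))  # 12
```

Changing the process means editing the model, not the code, which is where the cost reduction comes from.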

Model-Driven: effectiveness. (Quadrant chart plotting flexibility against standardisation, positioning tailor-made systems, standard packages (ERP, CRM, DMS, ...), spaghetti systems, and model-driven systems — with model-driven combining high flexibility and high standardisation.)

Model-driven: benefits. Cost reduction: less manual work. Reliability: manual work is error-prone. Time to market: less manual work. Standardisation: the system enforces standards. Flexibility: incremental development, agile maintenance. Reliability: processes are built from well-designed and well-tested services. Strict separation of design and execution. Focus on process quality as a source of product quality. ... and some more.

Questions (Part 1)?

Overview. What users see and use – this is not the information model, nor the technical model, but a model of the user's experience.

CORE Run Time

CORE Design Time – this is where GSIM comes in

CORE: the whole picture

How do services interact? You can use different tools for different services (e.g. SPSS, SAS, R, ...). Different tools expect different data formats, so conversions are inevitable.

Conversions are expensive! Between 2 formats (A, B): 2 conversions. Between 3 formats (A, B, C): 6 conversions. Between N formats: N² − N conversions.
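The count follows from ordered pairs: each of the N formats needs its own conversion to each of the other N − 1, giving N(N − 1) = N² − N converters. A quick check:

```python
# Every ordered pair of distinct formats needs its own converter,
# so the pairwise approach needs N * (N - 1) = N^2 - N converters.
from itertools import permutations

def pairwise_converters(formats):
    """All ordered (source, target) pairs of distinct formats."""
    return list(permutations(formats, 2))

for fmts in (["A", "B"], ["A", "B", "C"]):
    n = len(fmts)
    pairs = pairwise_converters(fmts)
    assert len(pairs) == n * n - n
    print(n, "formats ->", len(pairs), "conversions")
# 2 formats -> 2 conversions
# 3 formats -> 6 conversions
```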

CORE reduces N² − N to 2N: there is a standard CORE data format, and each tool converts only to and from it. (Diagram: Tool X, with its own Input (X), Output (X) and Model (X), connected through convertors to Input (CORE), Output (CORE) and Model (CORE).) Convertors are format-specific (tool-specific), not service-specific.
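The hub-and-spoke idea can be sketched as two small converter registries: each format registers one conversion to the CORE representation and one from it, so any-to-any conversion is always two hops. The format names and the record-list "CORE format" below are invented stand-ins for illustration.

```python
# Hub-and-spoke conversion sketch: every tool format converts only
# to and from a central CORE representation (here: a list of dicts).
# Adding format N+1 costs 2 new convertors, not 2*N.

to_core = {
    "csvish": lambda s: [dict(zip(("id", "v"), row.split(",")))
                         for row in s.splitlines()],
}
from_core = {
    "csvish": lambda recs: "\n".join(f'{r["id"]},{r["v"]}' for r in recs),
    "keyval": lambda recs: "; ".join(f'{r["id"]}={r["v"]}' for r in recs),
}

def convert(data, src, dst):
    """Any-to-any conversion via the CORE hub: always exactly 2 hops."""
    return from_core[dst](to_core[src](data))

print(convert("1,10\n2,20", "csvish", "keyval"))  # 1=10; 2=20
```

Because the convertors are keyed by format, not by service, every service using format X reuses the same pair.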

Questions (Part 2)?

CORE: the whole picture

The information model. What the designer sees: data set description and data set kind; service and data set kind.

Data set definition package

Communication channels manage communication between a service and the execution environment, support plug-and-play coupling, and implement messaging.

5 types of channels

4 types of messages: service signature message, service configuration message, service execution message, service output message.

Service signature message: the service communicates to its environment the channels that it supports. Channels constrain the kinds of information expected during execution or configuration (e.g. data set kind "microdata") and the kinds of information to be produced during execution (e.g. data set kind "aggregate").

Service configuration message: the environment communicates details about the data sets that will be offered to the service during execution, e.g. the number of columns of a data set and the value types of each of these columns. This information must fit the channels specified in the service signature message, e.g. a data set description must match the expected data set kind.
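The signature/configuration handshake amounts to a validation step: the signature declares which data set kind a channel accepts, and the configuration must supply a description of a matching kind. The message structures and field names below are invented for illustration; they are not CORE's actual wire format.

```python
# Sketch of the signature/configuration check (all field names invented).

signature = {                      # sent by the service
    "channels": {
        "in":  {"expects": "microdata"},
        "out": {"produces": "aggregate"},
    }
}

configuration = {                  # sent by the environment
    "channel": "in",
    "data_set_description": {"kind": "microdata",
                             "columns": [{"name": "age", "type": "int"}]},
}

def configuration_matches(sig, conf):
    """The offered data set description must match the channel's expected kind."""
    channel = sig["channels"][conf["channel"]]
    return conf["data_set_description"]["kind"] == channel["expects"]

print(configuration_matches(signature, configuration))  # True
```

A configuration offering, say, an "aggregate" description on the "in" channel would be rejected before execution ever starts, which is the point of doing the check at configuration time.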

Service execution message: the service is requested to execute itself and is offered a number of data sets and business objects as input. This information must comply with the service's signature message and be consistent with the data set and business object details the service was configured with through the service configuration message, e.g. a data set must match the expected data set description.

Service output message: a service ends its execution by sending a service output message. The result of the service execution is documented by data sets that match the service's configuration message, e.g. the output data sets are consistent with their data set descriptions.
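Putting the four messages together, one round trip looks roughly like this: configuration first, then execution with actual data sets checked against the configured description, then an output message whose data sets are consistent with it. All structures below are invented sketches, not CORE's real message format.

```python
# Sketch of one configuration -> execution -> output round trip
# (all message structures invented for illustration).

def run_service(config, execution):
    """Check the input against the configuration, then return an output message."""
    desc = config["data_set_description"]
    for row in execution["data_set"]:
        # Each row must match the configured description (here: column count).
        assert len(row) == len(desc["columns"])
    total = sum(row[1] for row in execution["data_set"])
    return {"message": "service_output",
            "data_sets": [{"kind": "aggregate", "value": total}]}

configuration = {"message": "service_configuration",
                 "data_set_description": {"columns": ["id", "value"]}}
execution = {"message": "service_execution",
             "data_set": [(1, 10), (2, 32)]}

output = run_service(configuration, execution)
print(output["data_sets"][0]["value"])  # 42
```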

Questions (Part 3)?