Reportnet standards and next steps
Søren Roug, Information and Data Services (IDS)
Use of Standards
- Historically, Reportnet has been targeted towards the web browser. This approach follows a standard called REST.
- You can upload any file to CDR.
- Communication between sites is done with XML-RPC (see the sketch below).
- Transfer of metadata uses RDF (Semantic Web).
- Reportnet does not use:
  - Service Oriented Architecture (SOA)
  - SOAP
  - INSPIRE
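As a minimal sketch of the XML-RPC point above, the snippet below shows what a site-to-site call could look like from Python. The endpoint URL and the method name are hypothetical placeholders, not the actual Reportnet API.

    # Hedged sketch: the URL and the method name below are invented placeholders.
    import xmlrpc.client

    # Hypothetical XML-RPC endpoint of a Reportnet/SEIS node
    node = xmlrpc.client.ServerProxy("https://cdr.example.eu/RPC2")

    # Hypothetical method asking the node which deliveries changed since a date
    for delivery in node.getChanges("2008-01-01"):
        print(delivery)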
Introduction of XML (a standard for file formats)
- In 2004 Reportnet started to give preferential treatment to XML.
- One single requirement: the XML file must have a schema identifier (the sketch below shows how it can be picked up).
- From this we can:
  - Run QA scripts using the XQuery language
  - Convert to other formats using XSL-T
  - Edit the XML content using XForms for webforms
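A minimal sketch, assuming only the convention stated above: the delivered XML carries a schema identifier in its xsi:noNamespaceSchemaLocation (or xsi:schemaLocation) attribute, which can then be used to look up the applicable QA and conversion services. The lookup table here is a hypothetical illustration, not the real Reportnet registry.

    import xml.etree.ElementTree as ET

    XSI = "http://www.w3.org/2001/XMLSchema-instance"

    def schema_identifier(path):
        """Return the schema identifier of an XML delivery, or None."""
        root = ET.parse(path).getroot()
        loc = root.get("{%s}noNamespaceSchemaLocation" % XSI)
        if loc is None:
            # xsi:schemaLocation pairs a namespace with a schema URL
            pairs = (root.get("{%s}schemaLocation" % XSI) or "").split()
            loc = pairs[1] if len(pairs) > 1 else None
        return loc

    # Hypothetical mapping from schema identifier to QA scripts
    QA_SCRIPTS = {
        "http://water.eionet.europa.eu/stations.xsd": ["check_stations.xquery"],
    }

    schema = schema_identifier("austria.xml")
    print("Schema:", schema)
    print("QA scripts to run:", QA_SCRIPTS.get(schema, []))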
2008 focus
- Integration of national repositories into Reportnet
- Guidelines on how to implement a Reportnet/SEIS node
- Use of the QA service from a national node
- Use of the conversion service from a national node
- Registration of datasets:
  - Via a manifest file
  - Via manual registration at the website
2009 focus (next steps)
- How to register the datasets
- How to search for the datasets
- How to track updates to the datasets
- How to bookmark found datasets
- How to merge datasets
- How to trust the dataset
- How to trust the trust
Registering a SEIS dataset
- Discovered via manifest files and manual registration
Adding metadata
Bookmarking and searching the dataset
Working with files vs. records
- Now we know where the files are in the SEIS universe.
- But we can do more: we can read the content of XML files (see the sketch below).
- Example of an XML snippet:

    <stations xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
              xsi:noNamespaceSchemaLocation="http://water.eionet.europa.eu/stations.xsd">
      32301 St. Pölten 15.63166 48.21139 270 Industrial urban...
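A minimal sketch of reading such a delivery, assuming a local copy named austria.xml. The names of the child elements inside <stations> are defined by stations.xsd and are not shown above, so the sketch walks the records generically instead of assuming specific tags.

    import xml.etree.ElementTree as ET

    tree = ET.parse("austria.xml")           # local copy of the delivery
    for station in tree.getroot():           # each record under <stations>
        values = {child.tag: (child.text or "").strip() for child in station}
        print(values)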
Merging principles
Station structure as a table (austria.xml):

    Identifier | local_code | name       | ...
    #32301     | 32301      | St. Pölten | ...
    #32302     | 32302      | Linz       | ...

Quadruple structure (a sketch of the conversion follows below):

    Subject | Predicate  | Object        | Source
    #32301  | type       | River Station | austria.xml
    #32301  | local_code | 32301         | austria.xml
    #32301  | name       | St. Pölten    | austria.xml
    #32302  | type       | River Station | austria.xml
    #32302  | local_code | 32302         | austria.xml
    #32302  | name       | Linz          | austria.xml
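A minimal sketch of this table-to-quadruple conversion. The slides say the real pipeline does this with an XSL transformation; plain Python is used here purely to illustrate the data model.

    def to_quadruples(rows, source):
        """Turn table rows into (subject, predicate, object, source) tuples."""
        quads = []
        for row in rows:
            subject = row["Identifier"]
            for predicate, obj in row.items():
                if predicate != "Identifier":
                    quads.append((subject, predicate, obj, source))
        return quads

    rows = [
        {"Identifier": "#32301", "type": "River Station",
         "local_code": "32301", "name": "St. Pölten"},
        {"Identifier": "#32302", "type": "River Station",
         "local_code": "32302", "name": "Linz"},
    ]
    for quad in to_quadruples(rows, "austria.xml"):
        print(quad)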
Merging the datasets
- Austria Stations.xml, Belgium Stations.xml and Germany Stations.xml each go through an XSL transformation to quadruples and are loaded into the aggregation database:

    Subject | Predicate | Object     | Source
    #32301  | name      | St. Pölten | Au..xml
    #30299  | name      | Gent       | Be..xml
    #42882  | name      | Köln       | Ge..xml
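A minimal sketch of the aggregation step itself: quadruples produced from each country file are simply accumulated in one store, each keeping its source. File names are abbreviated as on the slide.

    aggregation_db = []

    deliveries = {
        "Au..xml": [("#32301", "name", "St. Pölten")],
        "Be..xml": [("#30299", "name", "Gent")],
        "Ge..xml": [("#42882", "name", "Köln")],
    }
    for source, triples in deliveries.items():
        aggregation_db.extend((s, p, o, source) for s, p, o in triples)

    for quad in aggregation_db:
        print(quad)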
Merging the datasets (with later updates)
- Austria Stations.xml and a later Austria update1.xml both go through the XSL transformation into the aggregation database, so the store holds the original and the updated quadruples side by side:

    Subject | Predicate | Object     | Source
    #32301  | name      | St. Pölten | Au..xml
    #32301  | date      | 2005-10-8  | Au..xml
    #32301  | name      | Spratzern  | Au..update1.xml
    #32301  | date      | 2008-6-18  | Au..update1.xml
Searching
- To find all river stations in Europe you search for subjects with type = "River Station".
- The query will format the result as a table for you (see the sketch below).
- Obviously you get duplicates, because 32301 has been updated:

    Identifier | Local_code | Name       | Date       | Longitude
    #32301     | 32301      | St. Pölten | 2005-10-8  | 15.63166
    #32301     |            | Spratzern  | 2008-6-18  |
    #30299     | 30299      | Gent       | 2004-11-12 | 3.733333
    #42882     | 42882      | Köln       | 2001-4-14  | 6.966667
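A minimal sketch of such a search over the aggregated quadruples: pick every subject typed as "River Station" and pivot its statements into one row per (subject, source), which is exactly why an updated identifier shows up twice.

    aggregation_db = [
        ("#32301", "type", "River Station", "Au..xml"),
        ("#32301", "local_code", "32301", "Au..xml"),
        ("#32301", "name", "St. Pölten", "Au..xml"),
        ("#32301", "date", "2005-10-8", "Au..xml"),
        ("#32301", "name", "Spratzern", "Au..update1.xml"),
        ("#32301", "date", "2008-6-18", "Au..update1.xml"),
        ("#30299", "type", "River Station", "Be..xml"),
        ("#30299", "name", "Gent", "Be..xml"),
    ]

    stations = {s for s, p, o, src in aggregation_db
                if p == "type" and o == "River Station"}

    rows = {}  # one table row per (subject, source)
    for s, p, o, src in aggregation_db:
        if s in stations and p != "type":
            rows.setdefault((s, src), {"Identifier": s})[p] = o

    for row in rows.values():
        print(row)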
QA work
- Let's first colour the cells by their source:

    Identifier | Local_code | Name       | Date       | Longitude
    #32301     | 32301      | St. Pölten | 2005-10-8  | 15.63166
    #32301     |            | Spratzern  | 2008-6-18  |
    #30299     | 30299      | Gent       | 2004-11-12 | 3.733333
    #42882     | 42882      | Köln       | 2001-4-14  | 6.966667
QA work
- Then we merge by letting the newer sources overwrite the older (a sketch of the merge rule follows the table):

    Identifier | Local_code | Name      | Date       | Longitude
    #32301     | 32301      | Spratzern | 2008-6-18  | 15.63166
    #30299     | 30299      | Gent      | 2004-11-12 | 3.733333
    #42882     | 42882      | Köln      | 2001-4-14  | 6.966667
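A minimal sketch of the merge rule as the tables suggest it: for each (subject, predicate) pair the value from the newest source wins, and values the newer source does not supply are kept from the older one. The explicit source ordering and the excluded parameter are illustrative assumptions; distrusting a source (next slides) simply means excluding it before merging.

    def merge(quads, source_order, excluded=()):
        """Merge quadruples; later sources in source_order overwrite earlier ones."""
        rank = {src: i for i, src in enumerate(source_order)}
        merged = {}  # (subject, predicate) -> (rank, object)
        for s, p, o, src in quads:
            if src in excluded:
                continue
            if (s, p) not in merged or rank[src] > merged[(s, p)][0]:
                merged[(s, p)] = (rank[src], o)
        return {key: obj for key, (r, obj) in merged.items()}

    quads = [
        ("#32301", "name", "St. Pölten", "Au..xml"),
        ("#32301", "date", "2005-10-8", "Au..xml"),
        ("#32301", "longitude", "15.63166", "Au..xml"),
        ("#32301", "name", "Spratzern", "Au..update1.xml"),
        ("#32301", "date", "2008-6-18", "Au..update1.xml"),
    ]
    print(merge(quads, ["Au..xml", "Au..update1.xml"]))
    # -> Spratzern, 2008-6-18, and longitude 15.63166 kept from the older file

    # Turning off a distrusted source keeps the original values:
    print(merge(quads, ["Au..xml", "Au..update1.xml"], excluded={"Au..update1.xml"}))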
QA work
- Don't trust one source? Turn it off before you merge:

    Identifier | Local_code | Name       | Date       | Longitude
    #32301     | 32301      | St. Pölten | 2005-10-8  | 15.63166
    #32301     |            | Spratzern  | 2008-6-18  |
    #30299     | 30299      | Gent       | 2004-11-12 | 3.733333
    #42882     | 42882      | Köln       | 2001-4-14  | 6.966667
QA work
- Then we merge:

    Identifier | Local_code | Name       | Date       | Longitude
    #32301     | 32301      | St. Pölten | 2005-10-8  | 15.63166
    #30299     | 30299      | Gent       | 2004-11-12 | 3.733333
    #42882     | 42882      | Köln       | 2001-4-14  | 6.966667
QA work
- Gapfilling? Add your own source as a layer (see the sketch below).
- The layer is stored on QAW.

    Identifier | Local_code | Name       | Date       | Longitude
    #32301     | 32301      | St. Pölten | 2005-10-8  | 15.63166
    #32301     |            | Spratzern  | 2008-6-18  |
    #30299     | 30299      | Gent       | 2004-11-12 | 3.733333
    #42882     | 42882      | Köln       | 2001-4-14  | 6.966667
    #32301     |            |            | 2008-11-27 | 15.65000

The last row comes from Hermann's gapfilling layer, created 2008-11-27.
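A minimal sketch of the gapfilling idea as the slides describe it: the personal layer is just one more source, placed last so that its values win where it supplies them, while the gaps it leaves keep their previously merged values. The layer names below are the abbreviated ones used on the slides; "gapfilling-layer" stands in for the layer stored on QAW.

    layers = [
        ("Au..xml",          {("#32301", "name"): "St. Pölten",
                              ("#32301", "date"): "2005-10-8",
                              ("#32301", "longitude"): "15.63166"}),
        ("Au..update1.xml",  {("#32301", "name"): "Spratzern",
                              ("#32301", "date"): "2008-6-18"}),
        ("gapfilling-layer", {("#32301", "date"): "2008-11-27",
                              ("#32301", "longitude"): "15.65000"}),
    ]

    merged = {}
    for source, values in layers:   # later layers overwrite earlier ones
        merged.update(values)

    print(merged)  # name Spratzern, date 2008-11-27, longitude 15.65000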
QA work
- Then we merge:

    Identifier | Local_code | Name      | Date       | Longitude
    #32301     | 32301      | Spratzern | 2008-11-27 | 15.65000
    #30299     | 30299      | Gent      | 2004-11-12 | 3.733333
    #42882     | 42882      | Köln      | 2001-4-14  | 6.966667

- And we export to our working database for production...
Trusting the dataset and trusting trust
- Datasets and values can be evaluated by looking at the source:
  - Is the source URL from a reliable organisation/person?
  - Is the methodology described?
  - Are there reviews on QAW? Who wrote the reviews?
  - Are there others who have used the data? Who are they?
Summary
- These new tools are intended to support the use of Reportnet deliveries:
  - Aggregation/merging
  - Manual QA and gap-filling
  - Traceability to the sources
  - Noticing when the source has been updated/deleted
  - Review of the source for inclusion
- That was no problem before, because only authorised parties could upload to CDR.
- With SEIS, now anyone can participate.