Data flow ACTRIS-2.


Data flow ACTRIS-2

Data flow ACTRIS-2 for NOx

Data levels (all levels in EBAS format, with optional columns containing metadata to check the corrections; the final format will be developed by Rohrer and EBAS, see http://ebas-submit.nilu.no/):
- Level 0: raw data
- Level 1: corrected data (corrections according to the GAW Measurement Guidelines)
- Level 2: 1-hourly corrected data

Workflow:
1. First submission to EBAS via http://ebas-submit-tool.nilu.no/
2. Data processing and consistency checks by WCC-NOx.
3. Feedback to the stations: open issues are raised in the issue tracker, http://ebas-feedback.nilu.no/my_view_page.php. The QA process is led by WCC-NOx; files are exchanged between the stations and WCC via WCC-Cloud (https://fz-juelich.sciebo.de/).
4. Problems and questions have to be addressed by the data provider; each issue is tracked until the decision maker (WCC-NOx) closes it as resolved.
5. Re-submission, followed by a second revision or by approval.
6. Final submission to EBAS as "ACTRIS" data via http://ebas-submit-tool.nilu.no/

Data meetings to discuss problems and progress (to be held regularly?).

NOx submission levels and flags
- Level 0: mixing ratios at native time resolution
- Level 1: corrected data, native time resolution
- Level 2: corrected data, 1-hour averages

Confirmation of flags??? Are the templates OK? Yes, all information is captured, but there were some technical problems during submission (discussed offline with EBAS). Status of WCC-NOx: first priority is the side-by-side intercomparison (preparation and participation); corrections and data checks are ongoing.
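The step from level 1 to level 2 is a plain aggregation of the corrected native-resolution series to 1-hour means. A minimal sketch with pandas, assuming 1-minute native resolution and an illustrative column name (not the EBAS template header):

```python
import numpy as np
import pandas as pd

# Hypothetical level-1 data: corrected NO mixing ratios at 1-min resolution.
idx = pd.date_range("2017-06-01 00:00", periods=120, freq="1min")
level1 = pd.DataFrame(
    {"no_ppb": np.random.default_rng(0).normal(1.0, 0.1, 120)}, index=idx
)

# Level 2: 1-hour averages; keep the sample count per hour so sparse
# hours can be flagged before submission.
level2 = level1["no_ppb"].resample("1h").agg(["mean", "count"])
```

Keeping the per-hour sample count alongside the mean is an assumption on our part, but it makes it easy to flag hours with too little valid level-1 data.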

NOx correction (F. Rohrer, WCC-NOx)
1) Dark reaction with ambient O3 in the inlet line: NO + O3 → NO2 (hν = 0); O3 measurements required. The dark reaction depends on the residence time in the inlet line, temperature, and pressure. The example was taken from a station with a long inlet line, i.e. the most severe effect. If possible, use a short line with a large diameter to reduce the pressure gradient. At 30 ppb O3, a residence time of 5 s corresponds to a loss of about 5 %.
2) H2O quenching: NO2* + H2O → NO2 + H2O; H2O measurements required (or, alternatively, adequate drying).
3) Offset correction: nighttime data when O3 > 20 ppb.
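The magnitude of the dark-reaction loss can be estimated from the first-order decay of NO in the inlet line. A sketch, assuming the JPL-recommended rate constant k(T) = 3.0e-12 · exp(-1500/T) cm³ molecule⁻¹ s⁻¹ for NO + O3 and example conditions (298 K, 1013 hPa); these numbers are illustrative, not station data:

```python
import math

def no_loss_fraction(o3_ppb, residence_s, temp_k=298.0, press_pa=101325.0):
    """Fraction of NO converted by NO + O3 -> NO2 during inlet transit."""
    k_boltz = 1.380649e-23                          # Boltzmann constant, J/K
    k_rate = 3.0e-12 * math.exp(-1500.0 / temp_k)   # cm3 molecule-1 s-1 (JPL)
    n_air = press_pa / (k_boltz * temp_k) * 1e-6    # air number density, cm-3
    o3_conc = o3_ppb * 1e-9 * n_air                 # O3 number density, cm-3
    # First-order loss of NO over the residence time in the line.
    return 1.0 - math.exp(-k_rate * o3_conc * residence_s)

# At 30 ppb O3 and 5 s residence time the loss comes out in the
# few-percent range, consistent with the ~5 % quoted on the slide.
loss = no_loss_fraction(30.0, 5.0)
```

This also makes the slide's advice concrete: the loss scales with residence time, so a shorter, wider inlet line directly reduces the correction.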

Data quality objectives (DQOs) for NO & NO2

Open questions NOx
- Rapid data delivery for NOx? → Who will act as pilot?

Data flow ACTRIS-2 for VOCs

1. Issue tracker: request to submit data. EMPA/NILU (who?) will open an issue to start the data submission (~1st March).
2. Submission of level 0 (incl. target and zero data) and/or level 2 to EBAS via http://ebas-submit-tool.nilu.no/ by March 31st. It is mandatory to use the submission tool.
3. Consistency checks: ACTRIS QA by EMPA (Stefan and Matz) and EMEP QA by NILU (Sverre). Feedback to the stations by April 30th: problems and issues are defined in the issue tracker and assigned to the respective stations.
4. Issue tracker: stations answer the open issues.
5. Data workshop, discussion of the data, ~2nd half of May.
6. If approved: issues are closed. If no revision is necessary, the data set from the first submission is published; otherwise a new issue is opened to initiate the re-submission of revised data. If not approved, the discussion continues and the data will not be published (????) until the issues are resolved.
7. Re-submission to EBAS via the submission tool (issue tracker: re-submit data), deadline 30th June.
8. Publication of the final data at EBAS as "ACTRIS" data.

Yellow boxes (in the slide) require action from the data submitter!

VOC submission levels and flags
- Level 0: with target gas and zero data (→ HPB pilot station for target gas)
- Level 2: same as level 0, but without the target gas and zero gas information

NEW submission template for VOCs (July 2016): http://ebas-submit.nilu.no/Submit-Data/Regular-Annual-Data-Reporting/VOC/NMHC-VOC-OVOC-on-line
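Since level 2 is level 0 minus the target and zero records, the derivation is essentially a filter. A sketch, where the `sample_type` column and its values are made-up stand-ins for however the submission template actually marks these records:

```python
import pandas as pd

# Hypothetical level-0 VOC records: ambient samples interleaved with
# zero-gas and target-gas runs (column names are illustrative only).
level0 = pd.DataFrame({
    "sample_type": ["ambient", "zero", "ambient", "target", "ambient"],
    "ethane_ppb": [1.2, 0.0, 1.4, 2.0, 1.3],
})

# Level 2: keep only the ambient records; the QA records stay in level 0.
level2 = level0[level0["sample_type"] == "ambient"].drop(columns="sample_type")
```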

First VOC target gas data submitted for 2015. Target gas measurements monitor the reproducibility of the instrument, the uncertainty of the measurement (deviation from the expected values), and the quality of the calibration procedure on a monthly and long-term scale. Target gas for NOx (pilot station HPB) postponed → technical issues.

Open questions VOCs
- Rapid data delivery for VOCs? → Who will act as pilot?
- Automated QA measures → task 3.3
- RR cylinders → are stations interested in an intercomparison?

Data quality objectives (DQOs) for VOCs

Automated QA measures: open questions VOCs
EMEP QA checks (Sverre Solberg): plot of the target gas data with the average standard deviation and a comparison to previous years; x/y plots and ratios → workshop for discussion. Date? Location?
28.11.2018, DWD Met. Observatorium Hohenpeißenberg

NILU consistency checks of new VOC data (colored) received through EBAS-submit, compared to the existing data in EBAS (grey), by S. Solberg. Red: data rejected/outlier; blue: data OK; grey: historical data.
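A check in this spirit (not NILU's actual implementation) compares newly submitted values against the historical distribution in EBAS and flags points far outside it. The threshold and the data below are made up for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
historical = rng.normal(1.0, 0.2, 500)   # grey: existing data in EBAS, ppb
new = np.array([0.9, 1.1, 3.5, 1.0])     # colored: newly submitted values

# Flag new values lying far outside the historical distribution
# (assumed 4-sigma threshold; NILU's actual criterion may differ).
mu, sigma = historical.mean(), historical.std()
flags = np.where(np.abs(new - mu) > 4 * sigma, "rejected/outlier", "ok")
```

The flagged points are what would show up in red on the comparison plots; everything within the band stays blue.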