Reporting QA Measures to EBAS – and a Word on Flagging


Markus Fiebig, Paul Eckhardt, Cathrine Lund Myhre, Ann Mari Fjæraa, Richard Rud, Anne Hjellbrekke and Wenche Aas
NILU - Norwegian Institute for Air Research

What Is Meant by a Quality Assurance (QA) Measure?
The nature of a QA measure depends on the type of measurement:
- Round-robin: several laboratories analyse the same sample, and a reference value ("true value") is calculated from all results. Used for offline methods on filter samples.
- Off-site comparison: the instrument is sent to an inter-comparison workshop where the lab reference is traceable to primary metrological standards. Used for online gas- and particle-phase measurements.
- On-site comparison: a travelling standard instrument is calibrated against a primary standard and sent around to the stations for comparison while the instrument under test keeps running. Used for online gas- and particle-phase measurements.
- …

How Is Quality Assurance Documented With EBAS Data?
Two sets of metadata items:
- one describing the QA measure as a whole (off-site comparison, on-site comparison, round-robin, …)
- one describing each individual execution of the measure (individual instrument level)
One QA measure has several instances (one per station, instrument, …). The documentation is provided by the responsible calibration centre.

QA measure                                QA instance
ID                                        date
Description                               outcome (pass / no pass / not participated)
title                                     document name (describing result)
Type (off-site comp., on-site comp., …)   document URL (describing result)
responsible instance                      numeric outcome: bias
URL (description of whole measure)        numeric outcome: variability

(A sketch of these two sets of metadata follows below.)
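To make the split between measure-level and instance-level metadata concrete, here is a minimal sketch in Python. The class and field names are illustrative only and do not correspond to the actual EBAS metadata keys.

    from dataclasses import dataclass

    @dataclass
    class QAMeasure:
        """One record per QA measure, maintained by the responsible calibration centre."""
        measure_id: str            # ID of the measure
        measure_type: str          # "off-site comparison", "on-site comparison", "round-robin", ...
        title: str
        description: str
        responsible_instance: str  # e.g. the calibration centre running the measure
        url: str                   # description of the whole measure

    @dataclass
    class QAInstance:
        """One record per execution of a measure, i.e. per participating station/instrument."""
        measure_id: str            # links the instance to its QAMeasure
        date: str                  # date of the individual execution
        outcome: str               # "pass" / "no pass" / "not participated"
        document_name: str         # document describing the result
        document_url: str
        bias: float                # numeric outcome: bias
        variability: float         # numeric outcome: variability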

What Does This Look Like in Practice? Example from the ozone template (see the sketch below):
- Several relevant QA measures can be reported at the same time.
- Both the numerical outcomes (bias / variability) and the non-numerical outcome (pass / no pass / not participated) are reported.
- By default, data are not corrected for a bias found in a QA measure.
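To illustrate how several measures could be reported for one instrument at the same time, a sketch reusing the classes above; the identifiers, documents and numbers are invented placeholders, not values from the real ozone template.

    # Hypothetical QA records attached to one ozone data submission.
    qa_reports = [
        QAInstance(measure_id="O3-round-robin-2016", date="2016-06-15",
                   outcome="pass", document_name="rr_o3_2016_report.pdf",
                   document_url="https://example.org/rr_o3_2016_report.pdf",
                   bias=0.8, variability=1.2),
        QAInstance(measure_id="O3-onsite-audit-2017", date="2017-03-02",
                   outcome="pass", document_name="audit_station_xy_2017.pdf",
                   document_url="https://example.org/audit_station_xy_2017.pdf",
                   bias=-0.5, variability=0.9),
    ]
    # The reported bias documents the QA outcome; by default the submitted
    # concentrations themselves are not corrected for it.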

A Few Words on Flagging
General:
- Data are supposed to serve several types of applications ("use cases"); points invalid in one use case may be valid for another.
- Flag data invalid only for instrument malfunctions or breaches of the SOP ("smoking below the inlet").
- Other conditions (regional influence, etc.) are flagged, but not invalidated.
State today:
- The application of flags is very heterogeneous, even within frameworks.
- This makes data more difficult to use: users cannot rely on the presence or absence of flags, e.g. when filtering data.
- Being homogeneous in flagging is an end in itself!
Approach:
- Have data providers actively QA their data manually before submission.
- Implement automated checks during the data upload process (planned; see the sketch after this list).
- Simplify the set of flags recommended for primary use.
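As an illustration of the planned automated checks, a minimal sketch of a flag-consistency test at upload time. It assumes flags arrive as a list of numeric codes per data point; ALLOWED_FLAGS is a placeholder that would in practice be filled from the recommended short-list (next slide) or the full EBAS flag list.

    # Placeholder set of accepted flags; in practice taken from the short-list / full flag list.
    ALLOWED_FLAGS = {0, 390, 392, 394, 559, 683, 684, 999}

    def unexpected_flags(flag_series):
        """Return the flags used in a submission that are not on the accepted list."""
        used = {flag for point_flags in flag_series for flag in point_flags}
        return used - ALLOWED_FLAGS

    # Example: the second data point carries an unknown flag 456 and would be reported
    # back to the data provider instead of being silently accepted.
    print(sorted(unexpected_flags([[0], [456, 390]])))   # -> [456]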

Short-List of Flags
Flags on the short-list should be the first ones to choose from.

Flag  V/I  Description
000   V    Valid measurement
390   V    Data completeness less than 50%
392   V    Data completeness less than 75%
394   V    Data completeness less than 90%
559   V    Unspecified contamination or local influence, but considered valid
683   I    Invalid due to calibration
684   I    Invalid due to zero/span check
999   M    Missing measurement, unspecified reason

- The short-list is modified depending on the type of instrument and observation.
- The full list of EBAS flags (http://www.nilu.no/projects/ccc/flags/flags.html) can still be used if the reporter has a special condition not covered by the short-list; note that many flags on the full list are targeted towards offline, sampling-based observations.
- Flag 110 is needed to implement the outlier check service.
Open questions:
- How detailed should the information conveyed to the user be? Indicate a possible reason for contamination?
- Use the detailed data completeness flags, or just a few?
- Allow flags that invalidate outliers for an unspecified reason?
(A sketch of use-case-dependent filtering on these flags follows below.)
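A sketch of how a data user might filter on these flags for different use cases; the validity letters mirror the table above, and which flags a given use case accepts is the user's choice, not prescribed by EBAS.

    # Short-list flags with their validity class (V = valid, I = invalid, M = missing).
    SHORT_LIST = {
        0:   ("V", "Valid measurement"),
        390: ("V", "Data completeness less than 50%"),
        392: ("V", "Data completeness less than 75%"),
        394: ("V", "Data completeness less than 90%"),
        559: ("V", "Unspecified contamination or local influence, but considered valid"),
        683: ("I", "Invalid due to calibration"),
        684: ("I", "Invalid due to zero/span check"),
        999: ("M", "Missing measurement, unspecified reason"),
    }

    def keep_point(point_flags, accept_local_influence=False):
        """Keep a data point only if all its flags are valid; optionally drop flag 559."""
        if any(SHORT_LIST.get(f, ("I", ""))[0] != "V" for f in point_flags):
            return False                       # invalid, missing or unknown flag
        if 559 in point_flags and not accept_local_influence:
            return False                       # regional-background use case: drop local influence
        return True

    # A background-trend analysis might drop locally influenced points,
    # while an urban exposure study might keep them:
    print(keep_point([559]))                               # -> False
    print(keep_point([559], accept_local_influence=True))  # -> True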