Operational Quality Control in Helsinki Testbed Mesoscale Atmospheric Network Workshop University of Helsinki, 13 February 2007 Hannu Lahtela & Heikki Turtiainen


©Vaisala
What is Quality?
“The degree to which a system, component, or process meets (1) specified requirements, and (2) customer or user needs or expectations.” – IEEE
“Data are of good quality when they satisfy stated and implied needs... [such as] required accuracy, resolution and representativeness.” – WMO Guide to Meteorological Instruments and Methods of Observation

Quality Management and Quality Control (QC)
“The purpose of quality management is to ensure that data meet requirements (for uncertainty, resolution, continuity, homogeneity, representativeness, timeliness, format, etc.) for the intended application, at a minimum practicable cost. Good data are not necessarily excellent, but it is essential that their quality is known and demonstrable.”
“Quality control is the best known component of quality management systems, and it is the irreducible minimum of any system. It consists of examination of data at stations and at data centres to detect errors so that the data may be either corrected or deleted*.” – WMO Guide to Meteorological Instruments and Methods of Observation
*) “Deleted” must be understood here in the sense that erroneous data are not used for applications; however, they should remain stored in the database, only flagged as faulty.

Other Quality Management Functions
In addition to QC, quality management includes:
- equipment specification and selection
- station siting and sensor exposure planning
- maintenance and calibration procedures
- data acquisition and processing (sampling, averaging, filtering...)
- personnel training and education
- metadata management
- etc.

Levels of QC

Quality Flags
Information about suspicious or certainly wrong data values detected in the QC process should be passed on together with an information label, or flag, in order to:
- indicate the quality level
- indicate which control methods and control levels the data have passed
- indicate the error type if an erroneous or suspicious value was found
Such flagging information is useful both in the quality control phases (technical flags) and for users of meteorological information (end-user flags).

HTB uses the FMI end-user flagging system
Four-digit code, one digit for each QC level: HQC QC2 QC1 QC0
The value of each digit defines the quality of the data:
0 = no check
1 = OK
2 = suspicious, small difference
3 = suspicious, large difference
4 = calculated
5 = interpolated (spatial)
8 = missing
9 = deleted
Examples:
1 = QC0 at the site is OK
30 = QC1 found a big difference (e.g. monthly limit exceeded)
500 = QC2 interpolated the value using neighbour station data
1000 = HQC accepted the interpolated value
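The four-digit code above can be unpacked mechanically, one digit per QC level. A minimal sketch (the function and dictionary names are illustrative, not part of any FMI software):

```python
# Decode the FMI four-digit end-user flag: digits are HQC, QC2, QC1, QC0
# from most to least significant. Names here are hypothetical.

MEANINGS = {
    0: "no check",
    1: "OK",
    2: "suspicious, small difference",
    3: "suspicious, large difference",
    4: "calculated",
    5: "interpolated (spatial)",
    8: "missing",
    9: "deleted",
}

def decode_flag(code: int) -> dict:
    """Split a 0..9999 flag code into one meaning per QC level."""
    digits = f"{code:04d}"                 # zero-pad, e.g. 30 -> "0030"
    levels = ("HQC", "QC2", "QC1", "QC0")  # most to least significant digit
    return {level: MEANINGS[int(d)] for level, d in zip(levels, digits)}

print(decode_flag(30))    # QC1 digit is 3: suspicious, large difference
print(decode_flag(1000))  # HQC digit is 1: the value was accepted
```

Zero-padding makes the short forms in the examples (1, 30, 500, 1000) line up with the four QC levels.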

Proposal for the HTB QC process (by Jani Poutiainen) - so far implemented only partially and with some modifications.


Metman - QC1 Quality Control
Quality control of weather observations is based on real-time quality control, comprising the following tests:
- range test
- step tests (1 hr and 3 hrs)
- persistence test
- spatial test
At present the following observations are tested:
- wind speed (10 min average)
- barometric pressure
- air temperature
The best-fit quality control algorithms and recommendations of NORDKLIM (KLIMA report no. 8/2002) and the Oklahoma Mesonet QC are incorporated into the Metman quality control process.

Metman QC: Control Domains
Each weather station must be part of a quality control domain. Each domain contains predetermined suspicious and erroneous limits for each parameter needed in each test. The values can be configured based on seasonal climate extremes. Meteorologically representative and non-representative weather stations should also be placed in different quality control domains; the spatial test can be performed only between stations in the same representative domain. In the Helsinki Testbed project all weather stations belong to one and the same quality control domain. However, some special sites should belong to a different domain: for example, air temperatures at Heimoonkruoppi differ dramatically from those at nearby weather stations.
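A domain configuration of this kind could be sketched as follows; the schema, station names and limit values are illustrative assumptions, not Metman's actual configuration format:

```python
# Hypothetical QC-domain configuration: each domain carries the suspicious
# and erroneous limits used by the tests for the stations it contains.

DOMAINS = {
    "htb_default": {
        "representative": True,
        "limits": {
            # per parameter: suspicious limits (seasonal climate extremes)
            # and erroneous limits (sensor specification)
            "air_temperature_C": {"suspicious": (-35.0, 32.0),
                                  "erroneous": (-52.0, 60.0)},
            "pressure_hPa":      {"suspicious": (950.0, 1050.0),
                                  "erroneous": (500.0, 1100.0)},
        },
    },
}

def limits_for(domain: str, parameter: str) -> dict:
    """Look up the configured limit pairs for one parameter in one domain."""
    return DOMAINS[domain]["limits"][parameter]

print(limits_for("htb_default", "air_temperature_C"))
```

A special site such as Heimoonkruoppi would simply get its own entry beside "htb_default", keeping it out of the spatial intercomparison.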

Metman QC1 process
Data enter the QC1 process without a QC flag and leave it with QC flags:
- Range test: valid → step tests; suspicious → spatial test; erroneous → flagged erroneous
- Step tests: valid → persistence test; suspicious → spatial test; erroneous → flagged erroneous
- Persistence test: valid or erroneous; suspicious → spatial test
- Spatial test (under testing): valid, suspicious or erroneous
The technical flag code is stored in the Metman database. A four-digit end-user flag code is composed, converted to FMML and posted to CDW together with the observation data.
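The sequencing above can be sketched in a few lines. This is not Metman code; the function names are hypothetical, and each test is assumed to return one of "valid", "suspicious" or "erroneous":

```python
# Sketch of the QC1 test chain: run range, step and persistence tests in
# order; an erroneous result stops the chain, a suspicious result is
# referred to the spatial test, which decides the final flag.

def qc1(obs, range_test, step_test, persistence_test, spatial_test):
    for test in (range_test, step_test, persistence_test):
        result = test(obs)
        if result == "erroneous":
            return "erroneous"
        if result == "suspicious":
            return spatial_test(obs)
    return "valid"

# Example with trivial stand-in tests:
flag = qc1(
    obs=5.0,
    range_test=lambda x: "valid" if -52 <= x <= 60 else "erroneous",
    step_test=lambda x: "valid",
    persistence_test=lambda x: "suspicious",
    spatial_test=lambda x: "valid",
)
print(flag)  # persistence was suspicious, so the spatial test ran -> "valid"
```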

Metman QC: Range test
The range test determines whether an observation lies within a predetermined range. Erroneous ranges are based on sensor specifications; suspicious ranges can be configured based on seasonal climate extremes. The Metman real-time quality control process performs the range test first; it needs no historical observations. If the range test:
- succeeds, the step tests are performed next
- fails, the remaining tests are not performed and the observation is flagged as erroneous
- returns a suspicious value, the spatial test is performed
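A minimal range test along these lines, with illustrative air-temperature limits (the sensor-specification and climate-extreme values are assumptions, not Metman's configured limits):

```python
# Range test: erroneous limits from the sensor specification, suspicious
# limits from seasonal climate extremes (both pairs here are illustrative).

def range_test(value, suspicious=(-35.0, 32.0), erroneous=(-52.0, 60.0)):
    lo_e, hi_e = erroneous
    lo_s, hi_s = suspicious
    if not (lo_e <= value <= hi_e):
        return "erroneous"   # outside what the sensor can report
    if not (lo_s <= value <= hi_s):
        return "suspicious"  # physically possible, climatologically unusual
    return "valid"

print(range_test(20.0))  # -> valid
print(range_test(45.0))  # within sensor spec, beyond climate extreme -> suspicious
print(range_test(80.0))  # -> erroneous
```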

Metman QC: Step tests
The step tests use sequential observations (1-hour and 3-hour steps) to determine which data represent unrealistic 'jumps' during the observation time interval. Erroneous and suspicious step thresholds can be configured based on seasonal climate extremes. The Metman real-time QC process performs the step tests after the range test; they require historical observations. If the tests:
- succeed, the persistence test is performed next
- fail, the remaining tests are not performed and the observation is flagged as erroneous
- return a suspicious value, the spatial test is performed
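One step comparison can be sketched as follows; the thresholds are illustrative placeholders, and in practice the check would be run for both the 1-hour and the 3-hour step with their own limits:

```python
# Step test: flag unrealistically large jumps between sequential
# observations (thresholds are illustrative, not Metman's).

def step_test(current, previous, suspicious_step=5.0, erroneous_step=10.0):
    step = abs(current - previous)
    if step > erroneous_step:
        return "erroneous"
    if step > suspicious_step:
        return "suspicious"
    return "valid"

print(step_test(12.0, 10.0))  # jump of 2 -> valid
print(step_test(18.0, 10.0))  # jump of 8 -> suspicious
print(step_test(25.0, 10.0))  # jump of 15 -> erroneous
```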

Metman QC: Persistence test
The persistence test analyses data on an hourly basis to determine whether the observation underwent little or no variation. The Metman real-time quality control process performs the persistence test after the step tests; it requires historical observations. If the test:
- succeeds, the observation is flagged as valid
- fails, the observation is flagged as erroneous
- returns a suspicious value, the spatial test is performed
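The idea can be sketched as a spread check over a window of recent values, where too little variation suggests a stuck sensor. The window semantics and thresholds are assumptions for illustration only:

```python
# Persistence test: a completely constant series is treated as erroneous,
# a nearly constant one as suspicious (thresholds are illustrative).

def persistence_test(values, suspicious_var=0.2, erroneous_var=0.0):
    spread = max(values) - min(values)
    if spread <= erroneous_var:
        return "erroneous"   # no variation at all: likely a stuck sensor
    if spread <= suspicious_var:
        return "suspicious"  # almost no variation: refer to the spatial test
    return "valid"

print(persistence_test([10.0, 10.3, 10.9]))  # -> valid
print(persistence_test([10.0, 10.1, 10.1]))  # -> suspicious
print(persistence_test([10.0, 10.0, 10.0]))  # -> erroneous
```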

Metman QC: Spatial test
The spatial test performs an intercomparison between neighbouring stations in the same quality control domain. The Metman real-time quality control process runs the spatial test only if one of the earlier tests returns a suspicious value. The spatial test searches for a nearby reference station and compares the parameter under test with that of the reference station. The reference station must:
- belong to the same QC domain
- be sufficiently close
- have about the same altitude and installation heights
- have passed the range, step and persistence tests for the reference parameter
The spatial test is currently under testing, not yet operational.
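The selection-and-comparison logic could look roughly like this. The candidate structure, distance criterion and difference thresholds are all illustrative assumptions (the real test also matches altitude and installation height, which is omitted here for brevity):

```python
# Spatial test sketch: compare a suspicious value against the nearest
# reference station that qualifies (same domain, close enough, and whose
# own value passed the range, step and persistence tests).

def spatial_test(value, candidates, max_distance_km=10.0,
                 suspicious_diff=2.0, erroneous_diff=5.0):
    """candidates: dicts with 'value', 'distance_km' and 'passed_qc' keys."""
    refs = [c for c in candidates
            if c["passed_qc"] and c["distance_km"] <= max_distance_km]
    if not refs:
        return "suspicious"          # no reference available: stay suspicious
    ref = min(refs, key=lambda c: c["distance_km"])
    diff = abs(value - ref["value"])
    if diff > erroneous_diff:
        return "erroneous"
    if diff > suspicious_diff:
        return "suspicious"
    return "valid"

neighbours = [{"value": 11.0, "distance_km": 3.0, "passed_qc": True},
              {"value": 19.0, "distance_km": 8.0, "passed_qc": False}]
print(spatial_test(10.5, neighbours))  # close to the reference -> valid
```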

HTB QC: next steps
- Extension of QC1 to all measured parameters
- Implementation of the spatial test
- Availability of end-user flags through the Researcher's Interface
- Addition of technical flagging to CDW?
Special challenge for dense mesoscale networks: large number of stations => maintenance based on immediate response too expensive => new methods and tools needed for QC, network diagnostics and maintenance!