
METADATA TO DOCUMENT SURFACE OBSERVATION Michel Leroy, Météo-France

METADATA  Metadata is necessary to use efficiently observed data.  Latitude, longitude, altitude, station Id., date and time are obvious metadata.  A detailed description of the site and the instruments used, their characteristics, the historic of any instrument and site change, etc. is highly recommended and wished by climatologists. But the way to document this information is not yet standardized, this information is often missing and when available, the information is not easy to use by automatic means, due to its complexity.  The site environment is one of the important factor affecting a field measurement and its representativeness for various applications.  Though quite well known by the meteorological services, WMO siting recommendations are not always followed in the real world (or cannot be followed).  It is the same thing for the measurement uncertainty when compared to recommended and achievable measurement uncertainty stated in WMO doc n°8 (CIMO Guide),

Some “condensed” metadata  To document the site environment and the sustained characteristics of the measurement system in an easy-to-handle way, Météo-France has defined two classifications:  A siting classification, ranging from 1 to 5, for each basic parameter.  A “maintained performance” classification, ranging from A to E, for each basic measurement.  Reducing the site characteristics and the equipment’s performance to single numbers or letters hides many interesting details, but a major advantage is that the results are easy to use. These single numbers do not preclude additional detailed documentation (such as photos).  The definition of these classifications comes from an initial analysis of the quality factors influencing a measurement.

Drawbacks  These classifications do not allow any correction of the data; they were not developed for that purpose.  For wind especially, and perhaps for precipitation, some correction methods exist and could be applied. These methods need a detailed knowledge of the site environment and sometimes additional parameters. There would be great interest in applying standardized methods to correct raw measurements using the available metadata of a site, but the set of metadata needed to apply corrections is not clearly defined or standardized (except for wind, for the reduction of the measured wind to a “standard” wind at 10 metres over a roughness length of 0.03 m; a sketch of that reduction follows below). It would be ideal to have such metadata, but this approach may be impracticable in the real world.  The advantage of the proposed classification is its practicability in the real world, which gives the information practical value.
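As a hedged illustration only: the following Python fragment sketches the usual “potential wind” reduction through a neutral logarithmic profile with a blending height. The function name, the 60 m blending height and the default values are assumptions, not taken from the presentation.

```python
# A minimal sketch, assuming a neutral logarithmic wind profile and a 60 m
# blending height (the usual "potential wind" formulation). Names and
# defaults are illustrative, not the presentation's method.
import math

def potential_wind(u_meas: float, z_meas: float, z0_local: float,
                   z0_ref: float = 0.03, z_blend: float = 60.0) -> float:
    """Reduce a wind measured at height z_meas over local roughness z0_local
    to the equivalent 10 m wind over the reference roughness z0_ref."""
    # Step 1: extrapolate the measured wind up to the blending height,
    # where the flow is assumed independent of local roughness.
    u_blend = u_meas * math.log(z_blend / z0_local) / math.log(z_meas / z0_local)
    # Step 2: bring it back down to 10 m over the reference roughness.
    return u_blend * math.log(10.0 / z0_ref) / math.log(z_blend / z0_ref)

# 5 m/s measured at 10 m over rough terrain (z0 = 0.25 m) corresponds to a
# somewhat higher standard wind over open grassland (z0 = 0.03 m):
print(round(potential_wind(5.0, 10.0, 0.25), 2))  # -> 5.68
```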

Quality factors of a measurement  The intrinsic characteristics of the sensors or measurement methods.  The maintenance and calibration needed to keep the system in nominal conditions.  The site representativeness.

Site representativeness  Exposure rules come from the CIMO recommendations.  But they are not always followed, and are not always possible to follow, depending on the geographical situation.  In 1997, Météo-France defined a site classification for some basic surface variables. –Class 1 is for a site following WMO recommendations. –Class 5 is for a site that should be absolutely avoided for large-scale or meso-scale applications. –Classes 2, 3 and 4 are intermediate.  This classification was presented at TECO-98 in Casablanca.

Classification for wind measurements  Roughness classification: Davenport; see the CIMO Guide, WMO Doc. No. 8.  Siting classification.  The existence of obstacles nearly always leads to a decrease of the mean wind speed. Extreme values are generally also decreased, but not always: obstacles increase turbulence and may lead to random, temporary increases of the instantaneous wind speed.  The following classes assume a conventional 10 m measurement height.

 Class 1   The wind tower must be erected at a distance of at least 10 times the height of nearby obstacles (which are therefore seen under an elevation angle below 5.7°).   An object is considered an obstacle if it is seen under an angular width greater than 10°.   Obstacles must be below 5.5 m within a 150 m distance of the tower (and if possible below 7 m within a 300 m distance).   The wind sensors must be located at a minimum distance of 15 times the width of thin nearby obstacles (mast, thin tree with angular width < 10°).   The surrounding terrain must not present any relief change within a 300 m radius; a relief change is defined as a 5 m height change.

 Class 2 (error 10%?)   The wind tower must be erected at a distance of at least 10 times the height of nearby obstacles (elevation angle < 5.7°).   An object is considered an obstacle if it is seen under an angular width greater than 10°.   A relief change within a 100 m radius is also considered an obstacle.   The wind sensors must be located at a minimum distance of 15 times the width of thin nearby obstacles (mast, thin tree with angular width < 10°).  Class 3 (error 20%?)   The wind tower must be erected at a distance of at least 5 times the height of nearby obstacles (elevation angle < 11.3°).   A relief change within a 50 m radius is also considered an obstacle.   The wind sensors must be located at a minimum distance of 10 times the width of thin nearby obstacles.

 Class 4 (error 30%?)   The wind tower must be erected at a distance of at least 2.5 times the height of nearby obstacles (elevation angle < 21.8°).  Class 5 (error > 40%?)   Obstacles exist at a distance of less than 2.5 times their height, or obstacles higher than 8 m exist within 25 m. The distance-to-height logic of these classes is sketched below.
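A minimal sketch, assuming obstacles are given as (height, distance) pairs: it encodes only the distance-to-height thresholds listed above (10×, 5×, 2.5×) and the 8 m / 25 m rule, leaving out the extra class 1/2 criteria (low obstacles within 150 m, relief changes, thin obstacles). All names are illustrative.

```python
# A rough sketch of the wind siting thresholds above; not an official tool.
import math

def elevation_angle_deg(height_m: float, distance_m: float) -> float:
    """Elevation angle under which an obstacle top is seen from the tower base."""
    return math.degrees(math.atan2(height_m, distance_m))

def wind_siting_class(obstacles: list[tuple[float, float]]) -> int:
    """obstacles: (height_m, distance_m) of wide obstacles (angular width > 10 deg).
    Returns the siting class 1..5 implied by the distance criteria alone."""
    worst = 1
    for height, distance in obstacles:
        if height > 8.0 and distance < 25.0:
            cls = 5                      # tall obstacle very close
        elif distance >= 10.0 * height:
            cls = 1                      # elevation angle < 5.7 deg (class 1 or 2)
        elif distance >= 5.0 * height:
            cls = 3                      # elevation angle < 11.3 deg
        elif distance >= 2.5 * height:
            cls = 4                      # elevation angle < 21.8 deg
        else:
            cls = 5
        worst = max(worst, cls)
    return worst

# A 6 m high building 20 m away (about 3.3x its height) limits the site to class 4:
print(wind_siting_class([(6.0, 20.0)]))           # -> 4
print(round(elevation_angle_deg(6.0, 20.0), 1))   # -> 16.7 deg, i.e. < 21.8 deg
```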

St-Sulpice, north-east

St-Sulpice, south-west

St-Sulpice: obstruction survey (“relevé de masques”)  Class 4 for wind.  A new Radome AWS installed at a distance of 60 m, away from the woods → class 3.

[Chart: Saint-Sulpice (DIRCE). Ratio of the 10-minute mean wind speed between the Patac and Xaria sites, for south winds and for north winds.]

Classification of stations  Between 2000 and 2006, 400 AWS were installed for the Radome network.  The objective was class 1 for each parameter (temperature, RH, wind, precipitation, solar radiation).  But class 2 or class 3 was accepted when class 1 was not possible.  Météo-France is now classifying all its surface observing stations, including the climatological cooperative network (~4300 sites), before the end of  Updates at least every 5 years.

Where are we?

Other quality factors  Intrinsic performance.  Maintenance and calibration.  Within a homogeneous network, these factors are known and generally the same. But Météo-France uses data from various networks: –Radome (554 stations) –Non-proprietary AWS (~800) –Climatological cooperative network (> 3000)  Their intrinsic performance and their maintenance and calibration procedures are not the same.

Several reasons  The objectives may be different.  But some uncertainty objectives are sometimes (often) unknown! –To get cheap measurements?  Maintenance and/or calibration are not always organized!  Within the ISO certification process, Météo-France was forced to increase its knowledge of the characteristics of the various networks.

Another classification!  After the site classification (1 to 5), an additional classification was defined to cover the two remaining quality factors: –Intrinsic performance –Maintenance and calibration  Five levels were defined: –Class A: WMO/CIMO recommendations (Annex 1B of the CIMO Guide). –Class B: Lower but more realistic or affordable specifications: “good” performance and “good” maintenance and calibration. The Radome specifications. –Class C: Lower performance and maintenance, but maintenance/calibration organized. –Class D: No maintenance/calibration organized. –Class E: Unknown performance and/or maintenance.  This classification is called the “maintained performance” classification.

Air temperature  Class A: Overall uncertainty of 0.1°C. The uncertainty of the temperature probe must therefore be lower than 0.1°C, with a “perfect” artificially ventilated screen. The achievable measurement uncertainty is 0.2°C.  Class B: Pt100 (or Pt1000) temperature probe of class A (±0.25°C). Acquisition uncertainty < 0.15°C. Radiation screen with known characteristics and over-estimation of Tx (daily maximum temperature) < 0.15°C in 95% of cases. Laboratory calibration of the temperature probe every 5 years.  Class C: Temperature probe with uncertainty < 0.4°C. Acquisition uncertainty < 0.3°C. Radiation screen with known characteristics and over-estimation of Tx < 0.3°C in 95% of cases.  Class D: Temperature probe and/or acquisition system uncertainty worse than for class C. Radiation screen unknown or with “unacceptable” characteristics (for example, over-estimation of Tx > 0.7°C in 5% of cases).
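A hedged sketch of the air-temperature rules above, written as a decision function. The record fields are assumptions made for illustration; class E is returned when key characteristics are unknown.

```python
# Illustrative encoding of the air-temperature "maintained performance"
# classes; thresholds follow the slide text, field names are assumptions.
from dataclasses import dataclass
from typing import Optional

@dataclass
class TempChain:
    probe_unc_C: Optional[float]      # probe uncertainty (degC), None if unknown
    acq_unc_C: float                  # acquisition-system uncertainty (degC)
    tx_over_95pct_C: float            # screen over-estimation of daily max, 95% of cases
    cal_interval_yr: Optional[float]  # None = no calibration organized
    ventilated_screen: bool = False

def temp_performance_class(c: TempChain) -> str:
    if c.probe_unc_C is None:
        return "E"                    # unknown performance
    if c.ventilated_screen and c.probe_unc_C < 0.1:
        return "A"                    # CIMO-level uncertainty, ventilated screen
    if c.cal_interval_yr is None:
        return "D"                    # no maintenance/calibration organized
    if (c.probe_unc_C <= 0.25 and c.acq_unc_C < 0.15
            and c.tx_over_95pct_C < 0.15 and c.cal_interval_yr <= 5):
        return "B"
    if c.probe_unc_C < 0.4 and c.acq_unc_C < 0.3 and c.tx_over_95pct_C < 0.3:
        return "C"
    return "D"

print(temp_performance_class(TempChain(0.25, 0.1, 0.1, 5)))  # -> "B"
```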

Relative humidity  Class A: Overall uncertainty of 1%! Achievable: 2%.  Class B: Sensor specified for ±6% over a temperature range of –20°C to +40°C. Acquisition uncertainty < 1%. Calibration every year in an accredited laboratory.  Class C: Sensor specified for ±10% over a temperature range of –20°C to +40°C. Acquisition uncertainty < 1%. Calibration every two years in an accredited laboratory, or every year in a non-accredited laboratory.  Class D: Sensor with specifications worse than ±10% over the common temperature conditions. Calibration not organized.

Global solar radiation  Class A: Pyranometer of ISO class 1. Uncertainty of 5% for the daily total. Ventilated sensor. Calibration every two years. Regular cleaning of the sensor (at least weekly).  Class B: Pyranometer of ISO class 1. No ventilation. Calibration every two years. No regular cleaning of the sensor.  Class C: Pyranometer of ISO class 2. No ventilation. Calibration every five years. No regular cleaning of the sensor.  Class D: Sensor not using a thermopile. Calibration not organized.

Other parameters  Pressure  Amount of precipitation  Wind  Visibility  Temperature above ground  Soil temperature

Status of the RADOME network  Air temperature: Class B.  RH: Class B.  Amount of precipitation: Class B or Class C, depending on the rain gauge used.  Wind: Class A.  Global solar radiation: Class A for manned stations, class B for isolated sites.  Ground temperatures: Class B.  Pressure: Class B.  Visibility (automatic): Class B.

Status of the cooperative network  Air temperature (liquid-in-glass thermometers): Class C  Amount of precipitation: Class B

Status of non-Météo-France additional networks  Air temperature: Class B to D  RH: Class B to D  Amount of precipitation: Class B to C  Wind: Class B to D  Global solar radiation: Class B to D  Ground temperature: Class B to C  Pressure: Class B to D

Metadata  These classification for each site are meta data, part of the climatological database.  Site classification is on going.  Maintained performance classification has been defined this year and is being applied : is it possible to “easily” classify the additional networks.  With these two classifications, a measurement on a site can be given a short description. –Example : C3 for global solar radiation is for a class 2 pyranometer without ventilation, calibrated every 2 years, installed on a site with direct obstructions, but below 7°.

An image of a network

An image of the RADOME network

Conclusion  These classifications are intended to describe the real world of measuring networks, which is sometimes far from the WMO/CIMO recommendations.  WMO (CIMO, CBS) has decided to develop a site classification, following the example of this one. Such a standard could then be recognized by ISO.  This topic has recently been discussed by the CIMO Expert Team on Surface Technology and Measurement Techniques.  Any suggestions or comments are welcome; they should be addressed to Michel Leroy.

Proposed change for precipitation  Change class 1 so that it requires a well-protected site: homogeneous obstacles around the rain gauge that reduce the wind speed at gauge level.  Class 2 unchanged: no obstacles closer than 2 times their height.

Proposed change for temperature/humidity  Use the climatology of wind for the temperature classification. –Percentage of low wind speeds (< 1.5 or 2 m/s)? [Chart: frequency of low wind speeds at Trappes and at Saint-Denis, La Réunion]

Proposed change for temperature/humidity  The perturbation from artificial surfaces is greatly reduced by wind: with a 1 m/s wind, the air moves 60 m in one minute. The frequency of mean wind speeds (at 10 m) below 1.5 m/s could therefore be used to weight the influence of artificial surfaces in the classification.  The shading conditions currently used are a big constraint. They could be partly replaced by the global angle of view of obstacles: –No obstacles: angle of view is 0. –Obstacles everywhere: angle of view is 2π (the full hemisphere). –Screen along a wall: angle of view is π (50%). –Angle-of-view thresholds could be 5%, 10%, 20%. –But this is more difficult to evaluate. A possible computation is sketched below.
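A possible way to evaluate the proposed global angle of view, assuming each obstacle is modelled as a horizon band of given azimuth width and top elevation; the band's solid angle is then width × sin(elevation), out of the 2π sr hemisphere. The model and names are assumptions, not the slide's method.

```python
# Hedged sketch: obstructed fraction of the sky hemisphere, as one candidate
# definition of the "global angle of view" proposed above.
import math

def view_fraction(obstacles: list[tuple[float, float]]) -> float:
    """obstacles: (azimuth_width_deg, top_elevation_deg) per obstacle.
    Returns the obstructed fraction of the 2*pi sr hemisphere (0 = clear)."""
    omega = sum(math.radians(width) * math.sin(math.radians(elev))
                for width, elev in obstacles)
    return min(omega / (2.0 * math.pi), 1.0)

# A screen along an infinite wall blocks half the azimuth up to the zenith:
print(view_fraction([(180.0, 90.0)]))  # -> 0.5, the "50%" case above
```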