Slide 1 TIGGE phase 1: Experience with exchanging large amounts of NWP data in near real-time. Baudouin Raoult, Data and Services Section, ECMWF.

Slide 2 The TIGGE core dataset
- THORPEX Interactive Grand Global Ensemble
- Global ensemble forecasts to around 14 days, generated routinely at different centres around the world
- Outputs collected in near real time and stored in a common format for access by the research community
- Easy access to long series of data is necessary for applications such as bias correction and the optimal combination of ensembles from different sources
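The bias-correction use case mentioned above can be sketched minimally: given a long series of past forecasts and matching observations, estimate the mean systematic error and subtract it from new forecasts. This is an illustrative stand-alone sketch, not TIGGE code; all names and numbers are hypothetical.

```python
# First-order bias correction sketch (hypothetical, not TIGGE code):
# estimate the mean forecast error over a training series and remove it.

def mean_bias(forecasts, observations):
    """Mean (forecast - observation) over a training period."""
    errors = [f - o for f, o in zip(forecasts, observations)]
    return sum(errors) / len(errors)

def bias_correct(forecast, bias):
    """Remove the estimated systematic error from a new forecast."""
    return forecast - bias

# Toy example: a forecast series that runs ~1.5 units too warm.
past_fc = [21.5, 23.1, 19.4, 25.0]
past_obs = [20.0, 21.6, 17.9, 23.5]
b = mean_bias(past_fc, past_obs)
corrected = bias_correct(24.0, b)
```

This is why long, homogeneous time series matter: the bias estimate is only as good as the training archive behind it.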

Slide 3 Building the TIGGE database
- Three archive centres: CMA, NCAR and ECMWF
- Ten data providers:
  - Already sending data routinely: ECMWF, JMA (Japan), UK Met Office (UK), CMA (China), NCEP (USA), MSC (Canada), Météo-France (France), BoM (Australia), KMA (Korea)
  - Coming soon: CPTEC (Brazil)
- Exchanges using Unidata LDM, HTTP and FTP
- Operational since 1st of October; archive growing by ~1 TB/week (~1.5 million fields/day)

Slide 4 TIGGE Archive Centres and Data Providers
- Archive centres: NCAR, CMA, ECMWF
- Current data providers: NCEP, CMC, UKMO, ECMWF, Météo-France, JMA, KMA, CMA, BoM
- Future data provider: CPTEC

Slide 5 Strong governance
- Precise definition of:
  - Which products: list of parameters, levels, steps, units, ...
  - Which format: GRIB2
  - Which transport protocol: Unidata's LDM
  - Which naming convention: WMO file name convention
- Only exception: the grid and resolution
  - Choice of the data provider; the data provider interpolates to a regular lat/lon grid
  - Best possible model output
- Many tools and examples:
  - Sample dataset available
  - Various GRIB2 tools, "tigge_check" validator, ...
  - Scripts that implement the exchange protocol
- Web site with documentation, sample data set, tools, news, ...

Slide 6 Using SMS (ECMWF's Supervisor Monitor Scheduler) to handle the TIGGE data flow

Slide 7 Quality assurance: homogeneity
- Homogeneity is paramount for TIGGE to succeed
  - The more consistent the archive, the easier it will be to develop applications
- There are three aspects to homogeneity:
  - Common terminology (parameter names, file names, ...)
  - Common data format (format, units, ...)
  - An agreed list of products (parameters, steps, levels, ...)
- What is not homogeneous:
  - Resolution
  - Base time (although most providers have a run at 12 UTC)
  - Forecast length
  - Number of ensemble members

Slide 8 QA: Checking for homogeneity. E.g. cloud cover: instantaneous or six-hourly?

Slide 9 QA: Completeness
- The objective is to have 100% complete datasets at the Archive Centres
- Completeness may not be achieved for two reasons:
  - The transfer of the data to the Archive Centre fails
  - Operational activities at a data provider are interrupted and backfilling past runs is impractical
- Incomplete datasets are often very difficult to use
- Most of the current tools for ensemble forecasts (e.g. EPSgrams) assume a fixed number of members from day to day
  - These tools will have to be adapted
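A completeness check of the kind described above can be illustrated with a toy script: build the set of expected (parameter, step, member) combinations for a run and report what is missing from the received messages. This is a minimal sketch with hypothetical names, not the archive centres' actual tooling.

```python
from itertools import product

def expected_fields(params, steps, members):
    """All (param, step, member) combinations a complete run must contain."""
    return set(product(params, steps, members))

def missing_fields(expected, received):
    """Fields expected for the run but not present in the archive."""
    return sorted(expected - received)

# Toy run: 2 parameters, 2 steps, 3 ensemble members -> 12 expected fields.
expected = expected_fields(["t", "d"], [0, 6], [0, 1, 2])
received = expected - {("d", 6, 2)}          # one message failed to transfer
gaps = missing_fields(expected, received)
completeness = 100.0 * len(received) / len(expected)
```

A real monitor would derive the expected list from each provider's agreed product list and alert when a run stays incomplete past a deadline.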

Slide 10 QA: Checking completeness

Slide 11 GRIB to NetCDF conversion
- Gather metadata and message locations
- Create the NetCDF file structure
- Populate the NetCDF parameter arrays
[Diagram: individual GRIB messages (e.g. "t, EGRR, 1" ... "d, ECMF, 2") are gathered into NetCDF arrays t(1,2,3,4) and d(1,2,3,4); (1,2,3,4) represents the ensemble member id (realization)]

Slide 12 Ensemble NetCDF file structure
- NetCDF file format:
  - Based on available CF conventions
  - File organization follows the NetCDF file structure proposed by Doblas-Reyes (ENSEMBLES project)
  - Provides grid/ensemble-specific metadata for each member:
    - Data provider
    - Forecast type (perturbed, control, deterministic)
  - Allows for multiple combinations of initialization times and forecast periods within one file:
    - Pairs of initialization time and forecast step

Slide 13 Ensemble NetCDF file structure (continued)
- NetCDF parameter structure (5 dimensions):
  - reftime
  - realization (ensemble member id)
  - level
  - latitude
  - longitude
- "Coordinate" variables are used to describe:
  - realization: provides the metadata associated with each ensemble grid
  - reftime: allows multiple initialization times and forecast periods to be contained within one file
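The five-dimensional parameter layout above can be sketched in CDL (the textual schema form that ncdump prints). This is an illustrative schema consistent with the slide, not the exact ENSEMBLES/Doblas-Reyes structure; the variable names, sizes and attributes are assumptions.

```
netcdf tigge_t_sketch {
dimensions:
    reftime = UNLIMITED ;   // initialization times
    realization = 4 ;       // ensemble member id
    level = 2 ;
    latitude = 181 ;
    longitude = 360 ;
variables:
    double reftime(reftime) ;
        reftime:units = "hours since 2007-01-01 00:00:00" ;
    int realization(realization) ;
        realization:long_name = "ensemble member id" ;
    float t(reftime, realization, level, latitude, longitude) ;
        t:units = "K" ;
}
```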

Slide 14 Tool performance
- GRIB-2 simple packing to NetCDF 32-bit: ~2x the GRIB-2 size
- GRIB-2 simple packing to NetCDF 16-bit: similar size
- GRIB-2 JPEG 2000 to NetCDF 32-bit: ~8x the GRIB-2 size
- GRIB-2 JPEG 2000 to NetCDF 16-bit: ~4x the GRIB-2 size
- Issue: packing of 4D fields (e.g. 2D + levels + time steps)
  - Packing in NetCDF is similar to simple packing in GRIB2: value = scale_factor * packed_value + add_offset
  - All dimensions share the same scale_factor and add_offset
  - With 16 bits, only 65,536 distinct values can be encoded; this is a problem if there is a lot of variation in the 4D matrices

Slide 15 GRIB2
- WMO standard
- Fine control over the numerical accuracy of grid values
- Good compression (lossless JPEG 2000)
- GRIB is a record format:
  - Many GRIB messages can be written in a single file
- GRIB edition 2 is template-based:
  - It can easily be extended
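The record nature of GRIB2 is what makes concatenated files easy to handle: each message starts with the "GRIB" indicator, and octets 9-16 of section 0 hold the total message length, so a file can be split without decoding anything. Below is a toy splitter over synthetic messages, assuming well-formed edition-2 input; real workflows would use proper GRIB tooling rather than this sketch.

```python
import struct

def split_grib2(buf):
    """Split concatenated GRIB2 messages using the section-0 total length field."""
    messages, pos = [], 0
    while pos < len(buf):
        assert buf[pos:pos + 4] == b"GRIB", "not at a message boundary"
        # Octets 9-16 of the indicator section: total message length (big-endian).
        (length,) = struct.unpack(">Q", buf[pos + 8:pos + 16])
        messages.append(buf[pos:pos + length])
        pos += length
    return messages

def fake_grib2(payload):
    """Build a minimal section-0 header + payload + '7777' end section (toy data)."""
    body = payload + b"7777"
    total = 16 + len(body)                     # 16-byte indicator section
    return (b"GRIB" + b"\x00\x00"              # reserved octets 5-6
            + b"\x00"                          # discipline
            + b"\x02"                          # edition 2
            + struct.pack(">Q", total) + body)

stream = fake_grib2(b"temperature") + fake_grib2(b"dewpoint")
msgs = split_grib2(stream)   # two records recovered from one file
```

This is also why appending a new field to a GRIB archive is trivial (write another record) while doing the same to a NetCDF file requires rewriting its structure.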

Slide 16 NetCDF
- Work on the converter gave us a good understanding of both formats
- NetCDF is a file format:
  - Merging/splitting NetCDF files is non-trivial
- Need to agree on a convention (CF):
  - Only lat/lon and reduced grids (?) so far; work in progress on adding other grids to CF
  - There is no way to support multiple grids in the same file
- Need to choose a convention for multiple fields per NetCDF file:
  - All levels? All variables? All time steps?
- Simple packing is possible, but only by convention
  - 2 to 8 times larger than GRIB2

Slide 17 Conclusion
- True interoperability requires:
  - Common data format and units
  - A clear definition of the parameters (semantics)
  - Common tools (the only guarantee of true interoperability)
- Strong governance is needed
- GRIB2 vs NetCDF:
  - Different usage patterns
  - NetCDF: file-based, little compression, a convention must be agreed
  - GRIB2: record-based, easier to manage large volumes, WMO standard