HARMO13, 1-4 June 2010, Paris, France – Institute for Environment and Sustainability: Procedure for Air Quality Models Benchmarking.


HARMO13, 1-4 June 2010, Paris, France – Slide 1. Institute for Environment and Sustainability. Procedure for Air Quality Models Benchmarking. P. Thunis, E. Georgieva, S. Galmarini. FAIRMODE WG2 – SG4 activity.

Slide 2 – Outline: objectives & background; key elements of the procedure; the benchmarking service; usage of the procedure; work plan; contributions & links to other SGs.

Slide 3 – Objectives:
- Develop a procedure (a sequence of operations) for benchmarking AQ models, in order to evaluate their performance and indicate ways to improve it.
- Support both model users and model developers in the implementation of the AQD (assessment & plans).
- Provide an aid to policy bodies in judging the quality of model results.

Slide 4 – Background (I):
- First proposal for a benchmarking tool as a diagnostic instrument for checking the quality of model results: WG2 meeting, November.
- The document "Procedure for AQ Models Benchmarking" was sent out to SG4 participants in April 2010 (and uploaded on the FAIRMODE web page).
- Only a little feedback received so far…

Slide 5 – Background (II):
- Application types: AQ assessment and planning.
- Models included: from regional to local scale, provided they deliver data with sufficient spatial and temporal resolution.
- Focus on the pollutants considered in the AQ Directive (NO2, PM and O3), depending on the spatial scale addressed.

Slide 6 – Background (III), references:
- BOOT software (Chang and Hanna, 2005)
- ASTM guidance (ASTM, 2000)
- US-EPA AMET package (Appel and Gilliam, 2008)
- EPA guidance (2007, 2009)
- AIR4EU conclusions (Borrego et al., 2008)
- CityDelta and EuroDelta projects
- ENSEMBLE platform (Galmarini et al., 2001, 2004)
- PM model performance metrics (Boylan and Russell, 2006)
- Summary diagrams (Jolliff et al., 2009)
- SEMIP project
- Mesoscale model evaluation – COST728 (Schluenzen & Sokhi, 2008)

Slide 7 – Key elements of the procedure (I):
- DELTA: evaluation tool based on the CityDelta and EuroDelta intercomparison exercises.
- ENSEMBLE: JRC web-based multi-model evaluation and intercomparison platform used by several modelling communities (e.g. Galmarini et al., 2001, 2004a and b).
- Data Extraction Facility: JRC-based (AirBase, emissions, boundary conditions), with links to data from other projects (GMES, EC4MACS…).
- Benchmarking service: JRC-based (performance indicators, criteria and goals, summary reports); still to be developed.

Slide 8 – Key elements of the procedure (II). [Flow diagram: the USER's model results feed the DELTA tool, which returns an unofficial working report; on the JRC side, the Data Extraction Facility and the BENCHMARKING service produce the official reports.]

Slide 9 – The benchmarking service (I), main elements:
- Decomposition of the evaluation into temporal and spatial segments.
- Definition of a core set of statistical indicators and summary diagrams.
- Definition of criteria, goals and observation uncertainty.
- Selection of tests (model vs. observations, model vs. model, model sensitivities).
- Automatic summary reporting of model performance.

Slide 10 – The benchmarking service (II), core set of statistical indicators:
- R: correlation
- B: bias
- SD: standard deviation
- RMSE: root mean square error
- RMSEs: systematic RMSE
- RMSEu: unsystematic RMSE
- CRMSE: centered RMSE
- IOA: index of agreement
- MFB: mean fractional bias
- MFE: mean fractional error
- RDE: relative directive error
- RPE: relative percentile error
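Several of these indicators can be computed directly from paired model/observation series. A minimal sketch in Python (illustrative only, not part of the DELTA tool; the function name is an assumption, and the regression-based RMSEs/RMSEu split is omitted):

```python
import numpy as np

def core_indicators(mod, obs):
    """Compute a subset of the core statistical indicators for
    paired model (mod) and observation (obs) series."""
    mod, obs = np.asarray(mod, float), np.asarray(obs, float)
    bias = np.mean(mod - obs)                          # B
    r = np.corrcoef(mod, obs)[0, 1]                    # R
    rmse = np.sqrt(np.mean((mod - obs) ** 2))          # RMSE
    # Centered RMSE: RMSE after removing the mean bias
    crmse = np.sqrt(np.mean(((mod - mod.mean()) - (obs - obs.mean())) ** 2))
    # Mean fractional bias / error (Boylan and Russell, 2006)
    mfb = np.mean(2 * (mod - obs) / (mod + obs))
    mfe = np.mean(2 * np.abs(mod - obs) / (mod + obs))
    # Index of agreement (Willmott)
    denom = np.sum((np.abs(mod - obs.mean()) + np.abs(obs - obs.mean())) ** 2)
    ioa = 1 - np.sum((mod - obs) ** 2) / denom
    return dict(B=bias, R=r, RMSE=rmse, CRMSE=crmse, MFB=mfb, MFE=mfe, IOA=ioa)
```

For the fractional metrics (MFB, MFE) the series are assumed to be strictly positive concentrations, as is usual for PM.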

Slide 11 – The benchmarking service (II), summary diagrams. [Diagram: Taylor-type summary diagram with the angle cos⁻¹ R and the CRMSE distance.]
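The cos⁻¹ R and CRMSE labels follow from the Taylor-diagram geometry, in which the centered RMSE, the two standard deviations and the correlation obey a "law of cosines". The snippet below checks that identity numerically on synthetic data (illustrative code, not part of the DELTA tool):

```python
import numpy as np

# Taylor-diagram law of cosines:
#   CRMSE^2 = SD_mod^2 + SD_obs^2 - 2 * SD_mod * SD_obs * R
# where cos^-1(R) is the angle between the model and observation
# points on the diagram.
rng = np.random.default_rng(0)
obs = rng.normal(10.0, 3.0, 1000)
mod = obs + rng.normal(1.0, 2.0, 1000)   # a biased, noisy "model"

sd_m, sd_o = mod.std(), obs.std()
r = np.corrcoef(mod, obs)[0, 1]
crmse = np.sqrt(np.mean(((mod - mod.mean()) - (obs - obs.mean())) ** 2))

lhs = crmse ** 2
rhs = sd_m ** 2 + sd_o ** 2 - 2 * sd_m * sd_o * r
print(abs(lhs - rhs) < 1e-9)   # True: the identity holds
```

Because of this identity, a single point on the diagram encodes SD, R and CRMSE at once, which is why such diagrams are convenient for summarising many stations or models.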

Slide 12 – The benchmarking service (III):
- Criterion: acceptable model performance for a given type of application (e.g. for PM: MFE = 75%, MFB = ±60%).
- Goal: the best performance a model should aim to reach given its current capabilities (e.g. for PM: MFE = 50%, MFB = ±30%).
- Observation uncertainty: the accuracy level of the measurements.
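The criterion/goal pairs lend themselves to a simple three-way classification of a model run. A sketch using the PM thresholds quoted above, expressed as fractions (the function name and the labels are illustrative, not FAIRMODE terminology):

```python
def rate_pm_performance(mfb, mfe):
    """Classify PM model performance against the proposed goal
    (MFE <= 50%, |MFB| <= 30%) and criterion (MFE <= 75%,
    |MFB| <= 60%), with mfb and mfe given as fractions."""
    if mfe <= 0.50 and abs(mfb) <= 0.30:
        return "meets goal"
    if mfe <= 0.75 and abs(mfb) <= 0.60:
        return "meets criterion"
    return "outside criterion"
```

A run with MFB = -50% and MFE = 60%, for example, would satisfy the criterion but miss the goal.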

Slide 13 – The benchmarking service (IV): performance summary report.

Slide 14 – Usage of the procedure. [Flow diagram, as on slide 8: the USER's model results feed the DELTA tool (unofficial working report) and the JRC Data Extraction Facility / BENCHMARKING service (official reports).]

Slide 15 – Work plan:
- Discussion of, and consensus on, the overall methodology (FAIRMODE meeting, 09/2010).
- Development of the DELTA and benchmarking-service prototypes (Dec 2010).
- Testing of the prototypes on existing datasets (2011).
- Development of the JRC web facilities (data extraction, links between ENSEMBLE and the benchmarking service…).
- Set-up of a joint exercise to test the whole system (2012).

Slide 16 – Contributions: discussion and definition of the benchmarking protocol elements (species, statistics, goals and criteria…) for model performance reporting. Links to other SGs: definition of the joint activities.