FAIRMODE SG4: Benchmarking & modelling quality objectives P. Thunis, JRC Antwerp, 04-2013.



Model Quality Objectives
Wed. 14:00-15:00  Uncertainty based MQO: overview & discussion (P. Thunis)
15:00-15:30  UK feedback on MQO (D. Brookes & B. Stacey)
15:30-15:50  DELTA tool: Version 2 updates (P. Thunis)
DELTA tool experience
15:50-16:10  Experience in a model inter-comparison exercise in Belgium (B. Maiheu)
16:10-16:30  UK feedback (J. Stedman)
16:30-16:50  CALIOPE forecast evaluation (J. Baldasano)
Thur. 09:00-09:20  TCAM model evaluation (C. Carnevale)
09:20-09:40  CERC's experience & feedback (J. Stocker)
09:40-10:00  Aveiro Univ. experience & feedback (A. Miranda)
10:00-10:20  The Fairmode DELTA tool: a MS experience (E. Trimpeneers)
10:20-10:40  Austria feedback (D. Oettl)
Benchmarking activities related to SG4
10:40-11:10  ATMOSYS model evaluation tool (on-line DELTA tool) (N. Veldeman)

Model Quality Objectives

Antwerp, P. Thunis

Why do we need an MQO?
- To tell whether a model is fit for purpose
- To support model users in their evaluation work (e.g. is R = 0.6 good?)
- To facilitate experience exchanges among modellers, based on a common performance scale

Why a new MQO?
Current MQO (Guidance document): RDE = 10%
- Gives little insight into performance (timing of events not considered; only the neighbourhood of the LV is considered)
- A model can perform well for the wrong reason
- Questionable for planning applications

Example: a real CTM vs. a constant "model"
- Model: CTM TCAM; Area: Po Valley (IT); Year: 2009
- Model: constant PM10 = 50 ug/m3; Area: everywhere; Year: any time

MQO current status: the current MQO from the Guidance document

Part I: Derivation of the MQO. Part II: Setting a value for U.
1. Performance criteria to evaluate air quality modelling applications. P. Thunis, A. Pederzoli, D. Pernigotti, 2012, Atmos. Env., 59.
2. Model quality objectives based on measurement uncertainty. Part I: Ozone. P. Thunis, D. Pernigotti, M. Gerboles, 2013, submitted to Atmos. Env.
3. Model quality objectives based on measurement uncertainty. Part II: NO2 and PM10. D. Pernigotti, P. Thunis, M. Gerboles, C. Belis, 2013, submitted to Atmos. Env.
Acknowledgments: AEA

DERIVATION OF THE MQO

(Slide figure: Event A, the measured distribution, and Event B, the modelled distribution, with equations (1) and (2).)

Two conditions are imposed on model results: accuracy (systematic error) and precision (random error).

Condition 1: accuracy. A condition on the distance between the measured and modelled distributions.

Condition 2: precision. A condition on the spread (σ) of the model distribution.

Conditions (1) & (2): accuracy and precision. The conditions on the distance between the distributions and on the model distribution spread (σ) constrain the systematic and random errors.

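Put together, the two conditions amount to comparing the model-observation RMSE against the measurement uncertainty. A minimal sketch of that comparison (the normalisation by twice the RMS of the observation uncertainties follows the MQO papers cited above; the sample numbers are invented):

```python
import numpy as np

def mqo(obs, mod, u_obs):
    """Model quality objective: RMSE between model and observations,
    normalised by twice the RMS of the measurement uncertainty.
    MQO <= 1 means the model-observation differences stay within the
    range explainable by measurement uncertainty."""
    obs, mod, u_obs = (np.asarray(a, float) for a in (obs, mod, u_obs))
    rmse = np.sqrt(np.mean((mod - obs) ** 2))
    rms_u = np.sqrt(np.mean(u_obs ** 2))
    return rmse / (2.0 * rms_u)

# Invented example: four hourly NO2 observations, model values, and
# per-observation expanded uncertainties (ug/m3)
obs = [40.0, 55.0, 35.0, 60.0]
mod = [45.0, 50.0, 40.0, 58.0]
u   = [ 9.0, 11.0,  8.5, 12.0]
print(mqo(obs, mod, u))  # < 1 here: the objective is fulfilled
```
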
Annual and hourly/daily MQO (slide shows the formulas for the yearly and the hour/day time scales). Open question: what value for U?

SETTING A VALUE FOR U

Concept of maximum uncertainty:
- Approach 1: uncertainty assumed constant and equal to the LV uncertainty over the whole range of concentrations
- Approach 2: uncertainty varies with concentration (estimates based on monitoring)

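Approach 2 can be sketched with the concentration-dependent expression used in the papers cited above, which splits the uncertainty variance into a part proportional to the concentration and a non-proportional part anchored at a reference value RV. The parameter values below are invented for illustration; the adopted values are those set in the DELTA configuration file:

```python
import math

def expanded_uncertainty(conc, u_rv, alpha, rv, k=2.0):
    """Expanded measurement uncertainty U(O_i): a variance component
    proportional to the concentration plus a non-proportional one,
    anchored at the reference value RV.
    u_rv  : relative standard uncertainty at the reference value
    alpha : non-proportional fraction of the uncertainty variance
    rv    : reference value (e.g. the limit value)
    k     : coverage factor"""
    return k * u_rv * math.sqrt((1.0 - alpha) * conc**2 + alpha * rv**2)

# Illustrative parameters only (not the adopted values):
# at conc = RV the expression reduces to k * u_rv * RV
print(expanded_uncertainty(200.0, u_rv=0.12, alpha=0.20, rv=200.0))  # 48.0
```
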
What value for U? (Slide figure: proportional (p) and non-proportional (np) uncertainty components.)

Applications:
- O3: hourly
- NO2: hourly & yearly
- PM10: daily & yearly

O3: GUM approach

Hourly O3: RV = 120 ug/m3

Hourly O3 (slide also lists known and fitted values of U, k and RV):
        Low range (~20 ug/m3)   LV range (120 ug/m3)   High range (~200 ug/m3)
MQO     120%                    26%                    20%

Hourly NO2: GUM approach (8760 hours x 1000 AIRBASE stations; 95th percentile)

Hourly NO2 (slide also lists known and fitted values of U, k and RV):
MQO: 144% (low range, ~10 ug/m3) and 48% (LV range, 200 ug/m3, and high range, >200 ug/m3)

Yearly NO2: GUM approach (1000 AIRBASE stations; 95th percentile)

Yearly NO2 (slide also lists known and fitted values of U, k, RV, Np and Nnp):
        Low range (~10 ug/m3)   LV range (40 ug/m3)   High range (100 ug/m3)
MQO     57%                     26%                   23%

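For yearly averages, the same parameterisation is damped by the factors Np and Nnp, reflecting the partial cancellation of random errors when averaging over a year. A sketch following the papers cited above; all parameter values here are invented for illustration:

```python
import math

def yearly_expanded_uncertainty(mean_conc, u_rv, alpha, n_p, n_np, rv, k=2.0):
    """Expanded uncertainty of a yearly mean: the proportional and
    non-proportional variance components are divided by Np and Nnp,
    the effective numbers of independent values in the average."""
    var = (1.0 - alpha) * mean_conc**2 / n_p + alpha * rv**2 / n_np
    return k * u_rv * math.sqrt(var)

# Invented example: yearly mean of 40 ug/m3 with illustrative parameters
print(yearly_expanded_uncertainty(40.0, u_rv=0.12, alpha=0.20,
                                  n_p=100, n_np=25, rv=200.0))
```
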
Daily PM10: GDE approach. U²_PM10 fitted from 30 stations x 10 days (gravimetric vs. gravimetric).

Daily PM10 (slide also lists known and fitted values of U, k and RV):
        Low range (~10 ug/m3)   LV range (50 ug/m3)   High range (~100 ug/m3)
MQO     70%                     56%                   54%

Uncertainties for other measurement techniques:
Method        Source                Nb of stations   Uncertainty
Gravimetric   JRC intercomparison   30 (10 days)     13.8%
TEOM          JRC intercomparison   26 (10 days)     19.0%
Beta-ray      AIRBASE               5 stations       19.0%

Yearly PM10: GDE approach. U²_PM10 fitted from beta-ray & gravimetric stations.

Yearly PM10 (slide also lists known and fitted values of U, k, RV, Np and Nnp):
        Low range (~10 ug/m3)   LV range (40 ug/m3)   High range (100 ug/m3)
MQO     46%                     14%                   10%

MQO “Tuning”

(Slide figure: observed (O) and modelled (M) values under data assimilation, data fusion, and no assimilation.)

MQO “Tuning”

[C] NO2
Station type       Representative area        Cell resolution
Urban              a few km2                  ~1-3 km
Suburban           some tens of km2           ~3-10 km
Rural              some hundreds of km2       ~ km
Rural background   km2                        ~ km
Traffic            ??                         1-3 km

Conclusions (I)
- The most important feature of this MQO is the link to measurement uncertainty, i.e. the need for a better understanding of measurements during model evaluation.
- The stringency of the current MQO should not be seen as an obstacle, since expert tuning is possible (based on SG4 current & future experience exchanges).
- The approach is available for O3, NO2 and PM10. It can be generalised to other variables (e.g. meteorology, NOx) and adapted to other modelling approaches (e.g. data assimilation).
- The MQO are in a testing phase. They should not be used as a reference for the time being.

Conclusions (II)
- Uncertainty estimates are quite robust for NO2 and for daily PM10 (representative sample), but the sample size is reduced for the yearly estimates, especially for PM10. The values assigned to Np and Nnp probably need further investigation (e.g. more years).
- Around the LV (for hourly/daily values) the new MQO values are close to the 50% AQD model uncertainty for both PM10 and NO2.
- For O3 and PM10, the biases are also close to performance criteria previously proposed in the literature (Boylan and Russell, 2005; Chemel et al., 2010).

Conclusions (III)
- The O3 MQO is probably too strict (k = 1.4 → k = 1.6, implying MQO = 26% → 30% around the limit value).
- Data & experience exchanges would help assigning "final" values to the MQO parameters. Each parameter can be tested by any user (configuration file).

GRAPHICAL VIEW AND PERFORMANCE CRITERIA

The Target diagram (slide figure: X axis split into right and left halves; Y axis).

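A sketch of how a station's coordinates on the Target diagram can be computed: the bias goes on Y and the centred RMSE on X, both normalised by 2·RMS_U, and the left/right placement follows the convention of putting the point on the right when the standard-deviation error dominates the centred error. The sample data are invented:

```python
import numpy as np

def target_coords(obs, mod, rms_u):
    """Coordinates of one station on the Target diagram.
    Y: bias / (2*RMS_U).
    X: centred RMSE (CRMSE) / (2*RMS_U), signed positive (right) when
    the centred error is dominated by the standard-deviation error and
    negative (left) when dominated by the correlation error."""
    obs, mod = np.asarray(obs, float), np.asarray(mod, float)
    bias = mod.mean() - obs.mean()
    crmse = np.sqrt(np.mean(((mod - mod.mean()) - (obs - obs.mean())) ** 2))
    r = np.corrcoef(obs, mod)[0, 1]
    sd_err = mod.std() - obs.std()                        # SD part of the error
    r_err = np.sqrt(2 * obs.std() * mod.std() * (1 - r))  # correlation part
    sign = 1.0 if abs(sd_err) > abs(r_err) else -1.0
    return sign * crmse / (2 * rms_u), bias / (2 * rms_u)

x, y = target_coords([10, 20, 30, 40], [12, 19, 33, 41], rms_u=5.0)
print(x, y)  # here the point lands inside the unit circle (MQO fulfilled)
```

Note that CRMSE² = (σ_M - σ_O)² + 2·σ_O·σ_M·(1 - R), which is why the two candidate error parts above sum back to the centred error.
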
Complementary performance criteria (MPC) for bias, R and standard deviation at hourly/daily resolution (slide figure: limiting cases with all the error in the bias, in R, or in the SD; e.g. bias = 0, R = 1). Each station type, for a given location and period of time, has a specific signature in terms of mean and standard deviation, leading to specific MQO and MPC values. This is implicitly accounted for through the formulation.

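The bias, R and SD criteria follow from the MQO by assuming, in turn, that all the error sits in a single term of the error decomposition (RMSE² = BIAS² + (σ_M - σ_O)² + 2·σ_O·σ_M·(1 - R)). A sketch of the three normalised indicators, each of which should stay below 1; the sample data are invented:

```python
import numpy as np

def mpc(obs, mod, rms_u):
    """Model performance criteria from the MQO limiting cases, assuming
    all the error sits in the bias, in the standard deviation, or in
    the correlation. Each indicator should stay <= 1."""
    obs, mod = np.asarray(obs, float), np.asarray(mod, float)
    r = np.corrcoef(obs, mod)[0, 1]
    return {
        "bias": abs(mod.mean() - obs.mean()) / (2 * rms_u),
        "sd":   abs(mod.std() - obs.std()) / (2 * rms_u),
        "r":    (2 * obs.std() * mod.std() * (1 - r)) / (2 * rms_u) ** 2,
    }

print(mpc([10, 20, 30, 40], [12, 19, 33, 41], rms_u=5.0))
```

Being limiting cases, the MPC are necessary but not sufficient: a model can satisfy each individually yet still fail the full MQO.
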
(Slide table: model performance criteria for bias, R and SD for 8h max O3, hourly NO2 and daily PM10, by station type (rural, urban, traffic) and location (Po Valley, Paris, Krakow), based on EU AIRBASE 2009 measurement data.)

Conclusions
- MPC are necessary but not sufficient conditions.
- Both the MQO and each MPC have been assigned a specific diagram in the DELTA tool.

BENCHMARKING REPORTS

(Slide figure: example benchmarking report with BIAS < 0 / BIAS > 0 halves and R / SD sectors.)

DELTA tool Version 3.3 FAIRMODE – SG4

Main changes in Version 3.3:
- Hourly/daily and yearly processing split
- Data availability threshold: 90% → 75%
- Input formatting: netcdf → xls; leap-year handling
- Check-IO: a necessary step

Everyone can test new MQO parameters. Configuration file: goals & criteria, e.g.
0; PM10; ALL; OU; PMEAN; 27.8; 0.027; 40; 1; 50

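A sketch of how such a goals & criteria line could be parsed. The field names below (and the mapping of the numeric fields onto the MQO parameters u_rv, alpha, Np, Nnp and RV) are an assumption made for illustration, not the documented DELTA file format:

```python
# Hypothetical field names: the interpretation of each position is an
# assumption for illustration, not the documented DELTA format.
FIELDS = ["flag", "pollutant", "stations", "scale", "statistic",
          "u_rv_percent", "alpha", "n_p", "n_np", "rv"]

def parse_criteria_line(line):
    """Split a semicolon-separated goals & criteria line into a dict,
    converting the trailing numeric fields to float."""
    values = [v.strip() for v in line.split(";")]
    record = dict(zip(FIELDS, values))
    for key in FIELDS[5:]:
        record[key] = float(record[key])
    return record

rec = parse_criteria_line("0; PM10; ALL; OU; PMEAN; 27.8; 0.027; 40; 1; 50")
print(rec["pollutant"], rec["rv"])  # PM10 50.0
```
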
Planning time decomposition

Other:
- Specific features for inter-comparing models
- Linux version in development

Future activities FAIRMODE – SG4

Assessment of the MQO. Planned work:
- Uncertainty budgets per pollutant (NO2, PM10, O3)
- Data assimilation; 90% threshold
- Spatial representativeness of [C] NO2 (JRC-AEA, JRC-Univ. Brescia)