Model-Data Fusion Approaches for Exposure Estimation
Charles Stanier, Assistant Professor, University of Iowa
Center for Global and Regional Environmental Research
Oct 12, 2010, CMAS Conference
Collaborators:
–Gregory Carmichael, Chemical and Biochemical Engineering
–Sinan Sousan, Chemical and Biochemical Engineering
–Naresh Kumar, Geography
–R. William Field, Epidemiology
–Jacob Oleson, Biostatistics
–Jaemeen Baek, Center for Global and Regional Environmental Research
–Scott Spak, Center for Global and Regional Environmental Research
–Sang Rin Lee, Center for Global and Regional Environmental Research
–Daniel Krewski & Michelle Turner, University of Ottawa, R. Samuel McLaughlin Centre for Population Health Risk Assessment
–Adam Beranek Collins, Chemical and Biochemical Engineering
This research has been supported by a grant from the U.S. Environmental Protection Agency's (USEPA) Science to Achieve Results (STAR) program, grant R833865.
With respect to statistical methods for infusing air quality data and models into health studies:
–Data fusion (AQS / MODIS / CMAQ)
We want the spatial resolution of the model (4-12 km), without the inaccuracies of the model
Simple approach: use computationally efficient weighted averaging techniques
–Optimal Interpolation
–Model evaluation using statistical metrics to select model parameters/settings
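The optimal interpolation step mentioned above can be sketched as a weighted blend of a model background field with monitor observations. This is a minimal 1-D illustration; the grid, the Gaussian background error covariance, and the error variances are assumptions for the sketch, not the settings used in this study:

```python
import numpy as np

# Hypothetical 1-D grid of model PM2.5 (the "prior" or background field)
x = np.arange(0.0, 100.0, 10.0)          # grid cell centers, km
prior = np.full(x.size, 8.0)             # background PM2.5, ug/m3

# Two monitor observations and their grid locations
obs = np.array([12.0, 10.0])             # observed PM2.5, ug/m3
obs_idx = np.array([2, 7])               # grid indices of the monitors

# Background error covariance B: Gaussian with a 30 km length scale
L, sigma_b2, sigma_o2 = 30.0, 4.0, 1.0
B = sigma_b2 * np.exp(-((x[:, None] - x[None, :]) ** 2) / (2 * L**2))

# Observation operator H picks the grid cells holding monitors
H = np.zeros((obs.size, x.size))
H[np.arange(obs.size), obs_idx] = 1.0
R = sigma_o2 * np.eye(obs.size)          # observation error covariance

# Optimal interpolation (Kalman) gain and posterior (analysis) field
K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)
posterior = prior + K @ (obs - H @ prior)

# The posterior is pulled toward the observations near monitors and
# relaxes back toward the prior with distance
print(np.round(posterior, 2))
```

The background covariance length scale controls how far a monitor's influence spreads; this is the knob that the temporal and spatial binning choices discussed later interact with.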
[Flow diagram: Emissions and Meteorology feed a 3D Air Quality Model; Ground Observation Data, a Chemical Mass Balance Model, and MODIS aerosol optical depth data enter via data assimilation; results feed Spatial-Temporal Analysis and Health Data at census tracts.]
I – Traditional Source-Resolved Exposure Estimation Techniques
–Source-Oriented Modeling
–Source-Receptor Analysis
II – Model-Measurement Hybrid (Data Assimilation)
When we assimilate, we try to think in terms of the underlying processes: emissions, regional transport, boundary layer height, boundary conditions, chemical processes…
–This can help select strategies for temporal and spatial binning, whether we want the nudging to be short-lived or persistent, etc.
Evaluation - statistics
CMAQ evaluation - statistics
PM 2.5 performance evaluation
[Panels: Mean Fractional Bias (FB) and Mean Fractional Error (FE) for the North East and South Central regions.]
Compared to STN sites: average to excellent performance.
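The fractional bias and error metrics used throughout this evaluation can be computed as below. This is a minimal sketch with made-up model/observation pairs; the definitions are the standard ones, with MFB bounded in ±200% and MFE in 0-200%:

```python
import numpy as np

def mean_fractional_bias(model, obs):
    """MFB (%): mean of 2*(M - O)/(M + O); bounded in [-200, 200]."""
    model, obs = np.asarray(model, float), np.asarray(obs, float)
    return 100.0 * np.mean(2.0 * (model - obs) / (model + obs))

def mean_fractional_error(model, obs):
    """MFE (%): mean of 2*|M - O|/(M + O); bounded in [0, 200]."""
    model, obs = np.asarray(model, float), np.asarray(obs, float)
    return 100.0 * np.mean(2.0 * np.abs(model - obs) / (model + obs))

# Hypothetical CMAQ PM2.5 vs. monitor values (ug/m3)
model = [10.0, 6.0, 15.0]
obs = [8.0, 8.0, 12.0]
print(mean_fractional_bias(model, obs))   # positive -> overprediction on average
print(mean_fractional_error(model, obs))
```

Because the denominator symmetrizes model and observation, these metrics avoid the blow-up of normalized bias at near-zero observed concentrations, which is why they are common for PM evaluation (see the Boylan et al. reference in the additional slides).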
MODIS Optimal Interpolation Work
[Flow diagram: CMAQ 10 km fields and MODIS AOD, with temporal averaging settings, are combined by OI to produce posterior PM 2.5, which is evaluated against IMPROVE and STN monitors.]
OI results for May:
–Case a: average and correct all hours
–Case b: average overpass hours and correct all hours
[Maps: MODIS AOD, CMAQ AOD (case a), posterior CMAQ-derived AOD (color scale roughly 0.07 to 0.22), and logarithmic scaling factors (color scale roughly 0.3 to 16).]
OI result for May 2002, case 2a, lmxlmy: 1 (average and correct all hours): posterior CMAQ PM 2.5
MFB ranges by region (IMPROVE; STN):
–Pacific: -19% to 85%; -4% to 91%
–Mountain: -86% to 67%; -41% to 95%
–Midwest: -49% to 17%; -12% to 24%
–North East: -31% to -21%; -24% to 21%
–South Central: -76% to 1%; -59% to 9%
–South Atlantic: -56% to -7%; -38% to 6%
3-D Modeling Settings
–Weather Research and Forecasting (WRF) model 3.1.1
–SMOKE 2.5
–CMAQ 4.7 with AERO5 and the CB05 mechanism
–Preliminary results shown in the WRF-CMAQ comparison are based on MM5 / CMAQ 4.6 modeling
–36 km resolution domain
–12 km resolution domains: Seattle, Los Angeles and Phoenix, Northeastern US
–4 km resolution domain over Chicago
–WRF evaluation sections: 7700 met stations
WRF settings [1]
–North American Regional Reanalysis (NARR) data are used for initial and boundary conditions
–NARR is a high-resolution reanalysis (3-hourly, 32 km) that includes assimilated precipitation
–3-day spin-up time and 15-day runs
–Objective analysis of boundary and initial data (OBSGRID)
–Grid nudging with NARR data
 –Analysis nudging for the 36 and 12 km domains
 –Interval: every three hours
 –Nudging applied above the planetary boundary layer
WRF settings [2]
–Microphysics, radiative transfer, land surface model:
 –Morrison double-moment scheme for microphysics
 –RRTMG scheme for longwave and shortwave physics
 –Pleim-Xiu land surface model with two soil layers
 –ACM2 (asymmetric convective model) PBL scheme
 –Kain-Fritsch cumulus scheme
–Observation nudging with Automated Data Processing (ADP) surface and upper-air measurements:
 –Observation data were screened using OBSGRID
 –12 km and 4 km domains
 –Interval: every hour
 –Radius of influence: 50 km for the 12 and 4 km resolutions
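Settings like these map onto WRF's &fdda namelist. The fragment below is an illustrative sketch consistent with the bullets above; the variable names are standard WRF namelist options, but the actual values and file inputs used in this study are not given on the slides, so treat every value here as an assumption:

```fortran
&fdda
 ! Analysis (grid) nudging toward NARR on domains 1-2 (36 and 12 km)
 grid_fdda            = 1,   1,   0,
 gfdda_interval_m     = 180, 180, 180,   ! every three hours
 if_no_pbl_nudging_uv = 1,   1,   1,     ! nudge only above the PBL
 if_no_pbl_nudging_t  = 1,   1,   1,
 if_no_pbl_nudging_q  = 1,   1,   1,
 ! Observation nudging with ADP obs on the 12 and 4 km domains
 obs_nudge_opt        = 0,   1,   1,
 obs_rinxy            = 50.,             ! radius of influence, km
/
```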
Other studies: Gilliam et al. (2010); MM5 settings (Baker, 2004)
–Explicit moisture: Reisner I mixed phase
–Cumulus: Kain-Fritsch 2
–PBL: Pleim-Chang (ACM)
–Radiation: RRTM
–Multi-layer soil model: Pleim-Xiu
–No shallow convection
–Analysis nudging above the PBL (4-D data assimilation)
–No moist physics table
WRF evaluation – statistical benchmarks (Emery et al., 2001)
Wind speed: RMSE ≤ 2 m/s; Bias ≤ ±0.5 m/s; IOA ≥ 0.6
Wind direction: Gross Error ≤ 30 deg; Bias ≤ ±10 deg
Temperature: Gross Error ≤ 2 K; Bias ≤ ±0.5 K; IOA ≥ 0.8
Humidity: Gross Error ≤ 2 g/kg; Bias ≤ ±1 g/kg; IOA ≥ 0.6
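Checking the benchmarks above against paired model/observation series can be sketched as follows. The RMSE, bias, and index of agreement (IOA, Willmott's d) definitions are standard; the sample wind speed data are hypothetical:

```python
import numpy as np

def rmse(model, obs):
    """Root mean square error."""
    model, obs = np.asarray(model, float), np.asarray(obs, float)
    return float(np.sqrt(np.mean((model - obs) ** 2)))

def bias(model, obs):
    """Mean bias (model minus observation)."""
    model, obs = np.asarray(model, float), np.asarray(obs, float)
    return float(np.mean(model - obs))

def ioa(model, obs):
    """Willmott's index of agreement (d), in [0, 1]; 1 = perfect."""
    model, obs = np.asarray(model, float), np.asarray(obs, float)
    denom = np.sum((np.abs(model - obs.mean()) + np.abs(obs - obs.mean())) ** 2)
    return float(1.0 - np.sum((model - obs) ** 2) / denom)

# Hypothetical hourly 10 m wind speed (m/s) at one met station
obs = np.array([3.1, 4.0, 5.2, 4.4, 3.8, 2.9])
model = np.array([3.5, 4.4, 4.8, 4.9, 3.5, 3.2])

# Emery et al. (2001) wind speed benchmarks
print(rmse(model, obs) <= 2.0, abs(bias(model, obs)) <= 0.5,
      ioa(model, obs) >= 0.6)   # → True True True
```

Bias and RMSE are sensitive to different error types (systematic offset vs. scatter), which is why the benchmarks specify both alongside IOA.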
WRF – extended statistical benchmarks
Legend in figures: 1st / 2nd / 3rd / 4th bin
Wind speed RMSE (ws_rmse): ≤ 2.0 / ≤ 2.5 / ≤ 3.0 / > 3.0
Wind speed IOA (ws_ioa): ≥ 0.6 / ≥ 0.5 / ≥ 0.4 / < 0.4
Wind direction gross error (wd_error): ≤ 30 / ≤ 40 / ≤ 50 / > 50
Temperature gross error (tp_error): ≤ 2.0 / ≤ 4.0 / ≤ 6.0 / > 6.0
Temperature IOA (tp_ioa): ≥ 0.8 / ≥ 0.6 / ≥ 0.4 / < 0.4
Relative humidity gross error (rh_error): ≤ 2.0 / ≤ 2.5 / ≤ 3.0 / > 3.0
Relative humidity IOA (rh_ioa): ≥ 0.6 / ≥ 0.5 / ≥ 0.4 / < 0.4
The distribution of evaluation statistics is summarized as histograms to better understand overall trends
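The binning of each station's statistic into the four performance categories above can be sketched as follows; the threshold values come from the table, while the function and variable names are my own:

```python
def bin_statistic(value, thresholds, lower_is_better=True):
    """Assign a statistic to performance bin 1 (best) through 4 (worst).

    thresholds: the three cutoffs separating bins, e.g. (2.0, 2.5, 3.0)
    for wind speed RMSE, or (0.6, 0.5, 0.4) for IOA, where higher is
    better and lower_is_better should be False.
    """
    for bin_number, cutoff in enumerate(thresholds, start=1):
        if (value <= cutoff) if lower_is_better else (value >= cutoff):
            return bin_number
    return 4  # worse than the 3rd-bin cutoff

# Wind speed RMSE of 2.3 m/s falls in the 2nd bin (<= 2.5)
print(bin_statistic(2.3, (2.0, 2.5, 3.0)))                          # → 2
# Temperature IOA of 0.45 falls in the 3rd bin (>= 0.4)
print(bin_statistic(0.45, (0.8, 0.6, 0.4), lower_is_better=False))  # → 3
```

Counting the bin assignments across all evaluation sections then gives the histograms referred to above.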
WRF evaluation – Northeastern US with a 12 km resolution (Feb. 2002)
All sections are in the 2nd bin except sections 20 and 28, which are coastal regions
Comparing WRF (currently MM5) statistics and CMAQ mean fractional error
[Scatterplot: quadrants WG-CG, WB-CG, WG-CB, WB-CB; section labels S8, S9, S13, S15, S20, S22, S28.]
Needed for:
–Quantification of skill for the health study
–Guiding assimilation strategy
–Identifying model weaknesses: compensating for transport, removal, ventilation? Emissions? BCs?
MM5 vs. CMAQ performances [1] (STN sites, Jan. 2002)
[Scatterplots; section labels S3-S28 only.]
Conclusions & Recommendations
–Use statistical approaches (with benchmarks) to evaluate model-measurement skill
–Divide into geographical regions
–Consider different timescales (sensitive to different types of errors)
–Consider stratification of data into clean, moderate and polluted periods
–Often requires specific analysis by season and for urban and rural areas
–OI of MODIS can work, but there are issues to be worked out
–The current nationwide 2002 WRF run has the most trouble in the Mountain West and in northern New England
–Met error (RH / T / WS / WD) vs. CTM error
Additional slides
PM 2.5 performance evaluation
For both IMPROVE and STN networks in 2002 (independent of location)
[Plot legend: excellent-good, average, problematic performance regions.]
Compared to STN sites, the model shows lower PM 2.5 correlation at IMPROVE sites. The major PM 2.5 species contributing to model biases are OC, sulfate, and nitrate. PM 2.5 bias is due to OC and nitrate.
1 Boylan et al., 2006. PM and light extinction model performance metrics, goals, and criteria for three-dimensional air quality models. Atmospheric Environment 40, 4946-4959.
WRF evaluation – Northeastern US with a 12 km resolution (Jun. 2002)
Performance of the wind direction simulation in sections 20 and 28 is better than in February.
WRF evaluation - Chicago with a 4km resolution
MM5 vs. CMAQ performances [2] (STN Sites. Jan. 2002)
MM5 vs. CMAQ performances [3] (IMPROVE. Jan. 2002)
MM5 vs. CMAQ performances [4] (IMPROVE. Jan. 2002)
Techniques are aimed at helping to solve the component toxicity problem
Smith, Jerrett et al. Lancet (2009)
Exposure estimates are means across metropolitan statistical areas
Krewski et al. (2009), extended follow-up of the ACS cohort
Lall et al. (2004), Atmos. Environ.