VISTAS Meteorological Modeling November 6, 2003 National RPO Meeting St. Louis, MO Mike Abraczinskas North Carolina Division of Air Quality
Contract with Baron Advanced Meteorological Systems (BAMS) –Formerly known as MCNC –Don Olerud, BAMS Technical Lead –Contract initiated January 2003
Meteorological Modeling Goals Phase I: Test model to define the appropriate set up for our region Investigate -> Model -> Evaluate -> Make decisions
Meteorological Modeling Goals Phase I Summary of recent and relevant MM5 sensitivity studies –Draft delivered: January 2003 –Learn from what others have done –Inter-RPO collaboration –Will serve as a starting point for VISTAS Recommend a set of sensitivity tests –Draft delivered: January 2003 –Different physics options and inputs proposed for testing
Meteorological Modeling Goals Phase I Evaluation methodologies –Draft delivered: January 2003; updated April 2003 –Assessing model performance: Is the conceptual understanding correct? (placement, timing of features) Are diurnal features adequately captured? Are clouds reasonably well modeled? Are precipitation fields reasonable? Do wind fields generally match observations? Do temperature and moisture fields match observations? Million-dollar question… Do the meteorological fields produce acceptable air quality model results?
Meteorological Modeling Goals Phase I Evaluation products: Spatial Products, Spatial Aloft Products, Timeseries Products, Sounding Products, Spatial Statistics Products, Timeseries Statistics Products, Combination Products, Timeseries Statistics Aloft Products, Statistical Tables, Profiler Products, Cross-Sensitivity Products
Meteorological Modeling Goals Phase I: Test model to define the appropriate set-up for our region Investigate -> Model -> Evaluate -> Make decisions Which periods are we modeling? What is the geographical extent of testing?
Sensitivity episodes –Episode 1: January 1–20, 2002 –Episode 2: July 13–27, 2001 –Episode 3: July 13–21, 1999 Choice of episode periods was based on: –Availability of robust AQ databases –Full AQ cycle (clean-dirty-clean) –Availability of meteorological data –Air quality and meteorological regime
[Map: nested 36 km and 12 km modeling domains]
Sensitivity Tests –PX_ACM: Pleim-Xiu land-surface model, ACM PBL scheme –NOAH_MRF: NOAH land-surface model, MRF PBL scheme –Multi_Blkdr: Multi-layer soil model, Blackadar PBL scheme –NOAH_ETA-MY: NOAH land-surface model, ETA Mellor-Yamada PBL scheme BASE CASE
January 2002 – Episode 1 PX_ACM case significantly cold-biased PX_ACM runs are continuous (i.e., soil moisture values from one modeling segment serve as initial conditions for the following segment) Significantly better results obtained by making each P-X run independent (PX_ACM2)
[Table: 1.5 m temperature statistics (bias, absolute error, index of agreement) by run – 12 km domain, all hours, Episode 1]
Daytime CFRAC (alt)
Daytime CFRAC (alt) Diff
Nighttime CFRAC (alt)
Nighttime CFRAC (alt) Diff
24-h Pcp
24-h Pcp Diff
Daytime Pcp
Daytime Pcp Diff
PBL Heights Subjective observations –NOAH_MRF by far the highest and smoothest; probably too high –PX_ACM2 ~= Multi_Blkdr –PX_ACM2 subject to some suppressed PBL heights (in areas) during the day –Some of this may be real (over melting snow, or in the presence of clouds/precipitation) –Lack of observations makes this nearly impossible to evaluate –PX_ACM2 very low at night –NOAH_ETA-MY lowest during the day
Time Series Statistics 3-Panel Plots –Bias, Error, Index of Agreement for t, q, cld, spd, dir, RH –Bias, Accuracy, Equitable Threat Score for pcp (0.01, 0.05, 0.10, 0.25, 0.5, 1.0 in) –Labels sometimes difficult to see, so colors remain consistent: px_acm(2): blue; noah_mrf: red; multi_blkdr: black; noah_eta-my: purple –Pcp plots only available for “Full” regions
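The statistics named above (bias, absolute error, Willmott's index of agreement, and the equitable threat score for precipitation thresholds) have standard textbook definitions. A minimal sketch of how they might be computed with NumPy follows; the input arrays are purely illustrative, not VISTAS data:

```python
import numpy as np

def bias(model, obs):
    """Mean bias: average of (model - observation)."""
    return float(np.mean(model - obs))

def abs_error(model, obs):
    """Mean absolute error."""
    return float(np.mean(np.abs(model - obs)))

def index_of_agreement(model, obs):
    """Willmott's index of agreement (0 to 1; 1 = perfect)."""
    obs_mean = np.mean(obs)
    num = np.sum((model - obs) ** 2)
    den = np.sum((np.abs(model - obs_mean) + np.abs(obs - obs_mean)) ** 2)
    return float(1.0 - num / den)

def equitable_threat_score(model_pcp, obs_pcp, threshold):
    """ETS for precipitation exceeding a threshold (e.g. 0.01 in),
    computed from the hit/miss/false-alarm contingency table and
    corrected for hits expected by chance."""
    fcst = model_pcp >= threshold
    seen = obs_pcp >= threshold
    hits = np.sum(fcst & seen)
    misses = np.sum(~fcst & seen)
    false_alarms = np.sum(fcst & ~seen)
    hits_random = (hits + misses) * (hits + false_alarms) / len(model_pcp)
    denom = hits + misses + false_alarms - hits_random
    return float((hits - hits_random) / denom) if denom else 0.0

# Illustrative use: two modeled vs. observed temperature values.
model_t = np.array([2.0, 3.0])
obs_t = np.array([1.0, 1.0])
print(bias(model_t, obs_t), abs_error(model_t, obs_t))
```

A perfect forecast gives an ETS of 1; a forecast no better than chance gives 0, which is why ETS is preferred over raw accuracy for rare events such as heavy-precipitation thresholds.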
Temp Stats (Episode 1)
Mixing Ratio
Wind Speed
Wind Direction
Cloud Fraction
Cloud Fraction (Alt)
Relative Humidity
T (~500 m aloft)
T (~1600 m aloft)
T (~3400 m aloft)
Q (aloft)
D (aloft)
Precipitation (0.01 in)
Spatial Statistics Station-specific statistical ranking: px_acm, noah_mrf, multi_blkdr, noah_eta-my Best sensitivity displayed Hourly (composite UTC day) and total stats available PAVE date label just a placeholder Bias, error, RMSE (total only) Warning: possibly little difference between “best” and “worst” sensitivity
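The station-specific ranking described above, including the warning that "best" and "worst" may differ little, can be sketched as follows. The station IDs, error values, and the tie tolerance are hypothetical, chosen only to illustrate the idea:

```python
# Hypothetical per-station mean absolute temperature errors (deg C)
# for each sensitivity run; all values are illustrative only.
errors = {
    "px_acm":      {"KATL": 1.2, "KBNA": 1.4, "KGSO": 1.1},
    "noah_mrf":    {"KATL": 1.3, "KBNA": 1.2, "KGSO": 1.6},
    "multi_blkdr": {"KATL": 1.5, "KBNA": 1.5, "KGSO": 1.2},
    "noah_eta-my": {"KATL": 1.2, "KBNA": 1.6, "KGSO": 1.4},
}

def best_run_per_station(errors, tie_tol=0.05):
    """For each station, return (best run, near_tie flag), where the
    flag marks stations whose best-to-worst spread is below tie_tol,
    i.e. where the ranking may not be meaningful."""
    stations = next(iter(errors.values())).keys()
    result = {}
    for st in stations:
        ranked = sorted(errors, key=lambda run: errors[run][st])
        spread = errors[ranked[-1]][st] - errors[ranked[0]][st]
        result[st] = (ranked[0], spread < tie_tol)
    return result

print(best_run_per_station(errors))
```

Plotting only the winning run per station, as the slides do, hides the spread; carrying the near-tie flag through to the map is one way to surface the warning above.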
[Spatial statistics maps for T, QV, SPD, DIR, UV, CLD2, RH]
Episode 1 summary PX_ACM2 seems best overall –Winds best in NOAH_ETA-MY (but PX_ACM2 not bad) –Mixing ratio best in NOAH_MRF –RH/Temp best in PX_ACM2 –Significant differences in PBL heights (NOAH_MRF > PX_ACM2 > NOAH_ETA-MY)
Qualitative Analysis Uses only the Time Series Statistics Based on overall trend and model performance Not based on any quantitative values, although bias and error trends are considered
Qualitative Analysis –Episode 1: January 2002, VISTAS 12 km –Episode 2: July 2001, VISTAS 12 km –Episode 3: July 1999, VISTAS 12 km
Peachtree City, GA, 00Z soundings
Nashville, TN, 00Z soundings
Greensboro, NC, 12Z soundings
Tampa, FL, 00Z soundings
Conclusions No definite winner… but PX_ACM probably “best” overall –No very poor statistical quantity –PBL behavior a concern –PX_ACM or PX_ACM2? More air quality results likely needed before “best and final” sensitivity –NOAH_ETA-MY likely to show significantly different air quality results due to different PBL behavior –Wind performance a concern for NOAH_MRF –Temperature/precip performance a concern for NOAH_ETA-MY
Meteorological, Emissions, and Air Quality Modeling Deliverables:
–Aug 2003: Emissions Inventory Base 2002
–Dec 2003: Revised Em Inv Base 2002
–Jan 2004: Modeling Protocol (draft 10/31/03; EPA-approved)
–Jan 2004: Met, Em, AQ model testing, 3 episodes
–Mar 2004: Draft Em Inv 2018
–Mar 2004: CART: select sensitivity episodes
–Apr 2004: DDM in CMAQ
–July 2004: Revised State Em Inv Base 2002
–Sept 2004: Annual Base Year Model Runs
–Sept 2004: Revised Em Inv 2018
–Oct 2004: Sensitivity Runs, episodes
–Oct–Dec 2004: Control Strategy Inventories
–Dec 2004: Annual Run 2018
–Jan 2005: Sensitivity Runs 2018, episodes
–Jan–Jun 2005: Control Strategy Runs 2018
–July–Dec 2005: Observations, Conclusions, Recommendations
–After Jun 2005: Model Runs (e.g., Power Plant Turnover)
–Before Jun 2005: Other Inventory (e.g., Power Plant Turnover)
State Regulatory Activities:
–Jan–Mar 2004: Define BART sources (optional)
–June 2004: Identify BART controls
Contact information