1
VISTAS Meteorological Modeling
November 6, 2003
National RPO Meeting, St. Louis, MO
Mike Abraczinskas, North Carolina Division of Air Quality
2
Contract with Baron Advanced Meteorological Systems (BAMS)
– Formerly known as MCNC
– Don Olerud, BAMS Technical Lead
– Contract initiated January 2003
3
Meteorological Modeling Goals
Phase I: Test the model to define the appropriate setup for our region
Investigate -> Model -> Evaluate -> Make decisions
4
Meteorological Modeling Goals, Phase I
Summary of recent and relevant MM5 sensitivity studies
– Draft delivered: January 2003
– Learn from what others have done (inter-RPO collaboration)
– Will serve as a starting point for VISTAS
Recommend a set of sensitivity tests
– Draft delivered: January 2003
– Different physics options and inputs proposed for testing
5
Meteorological Modeling Goals, Phase I
Evaluation methodologies
– Draft delivered: January 2003; updated April 2003
– Assessing model performance:
  Is the conceptual understanding correct (placement, timing of features)?
  Are diurnal features adequately captured?
  Are clouds reasonably well modeled?
  Are precipitation fields reasonable?
  Do wind fields generally match observations?
  Do temperature and moisture fields match observations?
– Million dollar question: Do the meteorological fields produce acceptable air quality model results?
6
Meteorological Modeling Goals, Phase I
Evaluation product types:
– Spatial Products
– Spatial Aloft Products
– Timeseries Products
– Sounding Products
– Spatial Statistics Products
– Timeseries Statistics Products
– Combination Products
– Timeseries Statistics Aloft Products
– Statistical Tables Form
– Profiler Products
– Cross Sensitivity Products
7
Meteorological Modeling Goals
Phase I: Test the model to define the appropriate setup for our region
Investigate -> Model -> Evaluate -> Make decisions
– Which periods are we modeling?
– What is the geographical extent of the testing?
8
Sensitivity episodes
– Episode 1: January 1 – 20, 2002
– Episode 2: July 13 – 27, 2001
– Episode 3: July 13 – 21, 1999
Choice of episode periods was based on:
– Availability of robust AQ databases
– Full AQ cycle (clean-dirty-clean)
– Availability of meteorological data
– Air quality and meteorological regime
9
[Map of the 36 km and 12 km modeling domains]
10
Sensitivity Tests
– PX_ACM: Pleim-Xiu land-surface model, ACM PBL scheme
– NOAH_MRF: NOAH land-surface model, MRF PBL scheme
– Multi_Blkdr: Multi-layer soil model, Blackadar PBL scheme
– NOAH_ETA_M-Y: NOAH land-surface model, Eta Mellor-Yamada PBL scheme
BASE CASE
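Not on the original slide: the four runs differ only in their land-surface model / PBL scheme pairing, which can be summarized in a small lookup table. This is an illustrative sketch; the MM5 namelist option numbers (ISOIL, IBLTYP) in the comments are assumptions based on common MM5 conventions, not taken from the actual VISTAS run decks, and should be verified.

```python
# Illustrative summary of the four VISTAS sensitivity configurations.
# ISOIL/IBLTYP values in the comments are assumed, not from the run decks.
SENSITIVITY_RUNS = {
    "PX_ACM":      {"lsm": "Pleim-Xiu",        "pbl": "ACM"},                # ISOIL=3, IBLTYP=7 (assumed)
    "NOAH_MRF":    {"lsm": "NOAH",             "pbl": "MRF"},                # ISOIL=2, IBLTYP=5 (assumed)
    "Multi_Blkdr": {"lsm": "Multi-layer soil", "pbl": "Blackadar"},          # ISOIL=1, IBLTYP=2 (assumed)
    "NOAH_ETA_MY": {"lsm": "NOAH",             "pbl": "Eta Mellor-Yamada"},  # ISOIL=2, IBLTYP=4 (assumed)
}

for name, cfg in SENSITIVITY_RUNS.items():
    # Print one line per sensitivity run: its land-surface model and PBL scheme.
    print(f"{name:12s} LSM={cfg['lsm']:18s} PBL={cfg['pbl']}")
```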
11
January 2002 – Episode 1
– PX_ACM case significantly cold-biased
– PX_ACM runs are continuous (i.e., soil/moisture values from one modeling segment serve as initial conditions for the following segment)
– Significantly better results obtained by making each P-X run independent (PX_ACM2)
12–15
[Temperature (T) spatial comparison plots]
16
1.5 m temperature statistics, 12 km domain, all hours, Episode 1
Run    Bias     Abs Err   IA
PX     -2.68    3.15      0.854
PX2    -1.38    2.25      0.877
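Not on the original slide: the table reports bias, absolute error, and index of agreement (IA) but does not give formulas. A minimal sketch, assuming the conventional definitions (mean bias, mean absolute error, and Willmott's index of agreement):

```python
import numpy as np

def surface_stats(model, obs):
    """Mean bias, mean absolute error, and Willmott's index of agreement
    for paired model/observation values (e.g. 1.5 m temperature)."""
    model = np.asarray(model, dtype=float)
    obs = np.asarray(obs, dtype=float)
    bias = np.mean(model - obs)
    abserr = np.mean(np.abs(model - obs))
    obar = obs.mean()
    denom = np.sum((np.abs(model - obar) + np.abs(obs - obar)) ** 2)
    ia = 1.0 - np.sum((model - obs) ** 2) / denom if denom > 0 else np.nan
    return bias, abserr, ia

# Example with made-up numbers (not VISTAS data):
b, ae, ia = surface_stats([270.1, 272.5, 268.0], [271.0, 273.2, 270.4])
print(f"bias={b:.2f}  abserr={ae:.2f}  IA={ia:.3f}")
```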
20
[Temperature (T) plot]
21
Daytime CFRAC (alt)
22
Daytime CFRAC (alt) Diff
23
Nighttime CFRAC (alt)
24
Nighttime CFRAC (alt) Diff
25
24-h Pcp
26
24-h Pcp Diff
27
Daytime Pcp
28
Daytime Pcp Diff
29
[Temperature (T) plot]
34
PBL Heights (subjective observations)
– NOAH_MRF by far the highest and smoothest; probably too high
– PX_ACM2 ~= Multi_Blkdr
– PX_ACM2 subject to some suppressed PBL heights (in areas) during the day
  Some of this may be real (over melting snow, or in the presence of clouds/precipitation)?
  Lack of observations makes this nearly impossible to evaluate
– PX_ACM2 very low at night
– NOAH_ETA-MY lowest during the day
35
Time Series Statistics: 3-Panel Plots
– Bias, Error, Index of Agreement for t, q, cld, spd, dir, RH
– Bias, Accuracy, Equitable Threat Score for pcp (0.01, 0.05, 0.10, 0.25, 0.5, 1.0 in)
– Labels sometimes difficult to see, so colors remain consistent:
  px_acm(2): blue; noah_mrf: red; multi_blkdr: black; noah_eta-my: purple
– Pcp plots only available for "Full" regions
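Not on the original slide: the precipitation scores above are categorical (yes/no at each threshold) and come from a 2x2 contingency table. A minimal sketch, assuming the standard definitions of frequency bias, accuracy, and equitable threat score (the slide does not spell them out):

```python
import numpy as np

def precip_scores(model, obs, threshold):
    """Frequency bias, accuracy, and equitable threat score for one
    precipitation threshold (e.g. 0.01 in), using standard definitions."""
    fc = np.asarray(model, dtype=float) >= threshold   # model exceeds threshold
    ob = np.asarray(obs, dtype=float) >= threshold     # observation exceeds threshold
    hits = np.sum(fc & ob)
    misses = np.sum(~fc & ob)
    false_alarms = np.sum(fc & ~ob)
    correct_negatives = np.sum(~fc & ~ob)
    total = hits + misses + false_alarms + correct_negatives
    bias = (hits + false_alarms) / max(hits + misses, 1)
    accuracy = (hits + correct_negatives) / total
    hits_random = (hits + misses) * (hits + false_alarms) / total
    denom = hits + misses + false_alarms - hits_random
    ets = (hits - hits_random) / denom if denom != 0 else np.nan
    return bias, accuracy, ets

# Example with made-up values (inches of precipitation, not VISTAS data):
fb, acc, ets = precip_scores([0.02, 0.0, 0.3], [0.01, 0.05, 0.2], threshold=0.01)
print(f"freq bias={fb:.2f}  accuracy={acc:.2f}  ETS={ets:.2f}")
```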
36
Temp Stats (Episode 1)
37
Mixing Ratio
38
Wind Speed
39
Wind Direction
40
Cloud Fraction
41
Cloud Fraction (Alt)
42
Relative Humidity
43
T (~500 m aloft)
44
T (~1600 m aloft)
45
T (~3400 m aloft)
46
Q (aloft)
49
D (aloft)
52
Precipitation (0.01 in)
53
Spatial Statistics
– Station-specific statistical ranking: px_acm, noah_mrf, multi_blkdr, noah_eta-my
– Best sensitivity displayed
– Hourly (composite UTC day) and total stats available
– PAVE date label just a placeholder
– Bias, error, rmse (total only)
– Warning: possibly little difference between "best" and "worst" sensitivity
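Not from the BAMS products: a sketch of how a station-specific ranking could be computed, evaluating bias, error, and RMSE for each sensitivity run at each station and keeping the run with the smallest error. Function and variable names are illustrative assumptions.

```python
import numpy as np

def rank_stations(station_obs, station_model):
    """For each station, rank the sensitivity runs by mean absolute error
    and report which run performs best (illustrative, not the BAMS code).

    station_obs:   {station_id: 1-D array of observed values}
    station_model: {run_name: {station_id: 1-D array of model values}}
    """
    best = {}
    for sid, obs in station_obs.items():
        scores = {}
        for run, stations in station_model.items():
            diff = np.asarray(stations[sid], dtype=float) - np.asarray(obs, dtype=float)
            scores[run] = {
                "bias": diff.mean(),
                "error": np.abs(diff).mean(),
                "rmse": np.sqrt((diff ** 2).mean()),
            }
        # Keep the run with the lowest mean absolute error at this station.
        best[sid] = min(scores, key=lambda r: scores[r]["error"])
    return best
```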
54
T (temperature)
55
QV (water vapor mixing ratio)
56
SPD (wind speed)
57
DIR (wind direction)
58
UV (u/v wind components)
59
CLD2 (cloud fraction, alternate)
60
RH (relative humidity)
61
Episode 1 summary
– PX_ACM2 seems best overall
– Winds best in NOAH_ETA-MY (but PX_ACM2 not bad)
– Mixing ratio best in NOAH_MRF
– RH/temperature best in PX_ACM2
– Significant differences in PBL heights (NOAH_MRF > PX_ACM2 > NOAH_ETA-MY)
62
Qualitative Analysis
– Uses only the Time Series Statistics
– Based on overall trend and model performance
– Not based on any quantitative values, although bias and error trends are considered
63
Qualitative Analysis (VISTAS 12 km domain)
– Episode 1: January 2002
– Episode 2: July 2001
– Episode 3: July 1999
64
Peachtree City, GA, 00Z soundings [temperature (T) profiles]
65
Nashville, TN, 00Z soundings [temperature (T) profiles]
66
Greensboro, NC, 12Z soundings [temperature (T) profiles]
67
Tampa, FL, 00Z soundings [temperature (T) profiles]
68
Conclusions
No definite winner… but…
– PX-ACM probably "best" overall
  No very poor statistical quantity
  PBL behavior a concern
  PX-ACM or PX-ACM2?
– More air quality results likely needed before "best and final" sensitivity
  NOAH_ETA-MY likely to show significantly different air quality results due to different PBL behavior
  Wind performance a concern for NOAH_MRF
  Temperature/precip performance a concern for NOAH_ETA-MY
69
Meteorological, Emissions, and Air Quality Modeling Deliverables (draft 10/31/03)
– Aug 2003: Emissions Inventory Base 2002
– Dec 2003: Revised Em Inv Base 2002
– Jan 2004: EPA-approved Modeling Protocol
– Jan 2004: Met, Em, AQ model testing, 3 episodes
– Mar 2004: CART: select sensitivity episodes
– Mar 2004: Draft Em Inv 2018
– Apr 2004: DDM in CMAQ
– July 2004: Revised State Em Inv Base 2002
– Sept 2004: Annual Base Year Model Runs
– Sept 2004: Revised Em Inv 2018
– Oct 2004: Sensitivity Runs 2018, 3 episodes
– Oct–Dec 2004: Control Strategy Inventories
– Dec 2004: Annual Run 2018
– Jan 2005: Sensitivity Runs 2018 episodes
– Jan–Jun 2005: Control Strategy Runs 2018
– July–Dec 2005: Observations, Conclusions, Recommendations
– Before Jun 2005: Other Inventory (e.g., Power Plant Turnover)
– After Jun 2005: Model Runs (e.g., Power Plant Turnover)
State Regulatory Activities
– Jan–Mar 2004: Define BART sources
– June 2004 (optional): Identify BART controls
71
Contact information
http://vistas-sesarm.org/
http://www.baronams.com/projects/VISTAS/
Michael.Abraczinskas@ncmail.net
919-715-3743