WRAP 2002 Visibility Modeling: Annual CMAQ Performance Evaluation Using Preliminary 2002 Version C Emissions
Gail Tonnesen, Bo Wang, Chao-Jung Chien, Zion Wang, Mohammad Omary (University of California, Riverside)
Zac Adelman, Andy Holland (University of North Carolina)
Ralph Morris et al. (ENVIRON International Corporation, Novato, CA)
Summary of RMC 2002 Modeling
– Annual MM5 simulations run at the RMC in December 2003 (additional MM5 testing in progress)
– Emissions processed with SMOKE; Preliminary 2002 Scenario C used here
– CMAQ version 4.3 (released October 2003)
– Data summaries, QA, and results are posted on the RMC web page: www.cert.ucr.edu/aqm/308
MM5 Modeling Domain (36 & 12 km)
National RPO grid:
– Lambert conformal conic projection
– Center: −97°, 40°
– True latitudes: 33°, 45°
MM5 domains:
– 36 km: (165, 129, 34)
– 12 km: (220, 199, 34)
24-category USGS landuse data:
– 36 km: 10 min. (~19 km)
– 12 km: 5 min. (~9 km)
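The grid definition above (Lambert conformal, center −97°/40°, true latitudes 33°/45°) can be sketched with the standard spherical Lambert conformal conic formulas. This is an illustrative sketch, not the RMC's actual grid code; the 6370 km spherical-earth radius is an assumed MM5-style convention, not stated on the slide.

```python
import math

# Assumed constants: spherical earth of radius 6370 km (a common MM5
# convention); projection parameters are taken from the slide above.
R = 6370.0                  # earth radius, km (assumption)
LON0, LAT0 = -97.0, 40.0    # projection center
LAT1, LAT2 = 33.0, 45.0     # true latitudes

def _t(lat_deg):
    # tan(pi/4 + lat/2), the conformal latitude factor for a sphere
    return math.tan(math.pi / 4 + math.radians(lat_deg) / 2)

# Cone constant n and scale factor F from the two true latitudes.
N = (math.log(math.cos(math.radians(LAT1)) / math.cos(math.radians(LAT2)))
     / math.log(_t(LAT2) / _t(LAT1)))
F = math.cos(math.radians(LAT1)) * _t(LAT1) ** N / N
RHO0 = R * F / _t(LAT0) ** N

def lcc_xy(lon_deg, lat_deg):
    """Project lon/lat (degrees) to LCC x, y in km from the grid center."""
    rho = R * F / _t(lat_deg) ** N
    theta = N * math.radians(lon_deg - LON0)
    return rho * math.sin(theta), RHO0 - rho * math.cos(theta)

# The projection center maps to the grid origin.
print(lcc_xy(-97.0, 40.0))  # -> (0.0, 0.0)
```

Grid cell indices then follow by offsetting x/y by half the domain extent and dividing by the 36-km (or 12-km) cell size.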
MM5 Physics
Physics Option     | Configuration            | Configure.user
Microphysics       | Reisner 2 (with graupel) | IMPHYS = 7
Cumulus scheme     | Kain-Fritsch             | ICUPA = 6
PBL                | Pleim-Chang (ACM)        | IBLTYP = 7
Radiation          | RRTM                     | FRAD = 4
Land-surface model | Pleim-Xiu                | ISOIL = 3
Shallow convection | No                       | ISHALLO = 0
Snow cover effect  | Simple snow model        | ISNOW = 2
Thermal roughness  | Garratt                  | IZ0TOPT = 1
Varying SST        | Yes                      | ISSTVAR = 1
Time step          | 90 seconds               | (PX uses an internal time step of 40 seconds)
Subdomains for 36/12-km Model Evaluation
1 = Pacific NW, 2 = SW, 3 = North, 4 = Desert SW, 5 = CenrapN, 6 = CenrapS, 7 = Great Lakes, 8 = Ohio Valley, 9 = SE, 10 = NE, 11 = MidAtlantic
Evaluation Review
Evaluation methodology:
– Synoptic evaluation
– Statistical evaluation using METSTAT and surface data (WS, WD, T, RH)
– Evaluation against upper-air observations
Statistics:
– Absolute bias and error, RMSE, IOA (Index of Agreement)
Evaluation datasets:
– NCAR dataset ds472: airport surface met observations
– Twice-daily upper-air profile observations (~120 in US): temperature, moisture
METSTAT Evaluation Package
Statistics:
– Absolute bias and error, RMSE, IOA
– Daily and, where appropriate, hourly evaluation
Statistical performance benchmarks:
– Based on an analysis of >30 MM5 and RAMS runs
– Not meant as a pass/fail test, but to put modeling results into perspective
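The four statistics listed above can be sketched as follows. This is not the actual METSTAT code, just a minimal illustration of the metrics over paired model/observation values; note that wind direction requires wrap-around (circular) differencing, which this sketch omits.

```python
import math

def metstat_metrics(model, obs):
    """Sketch of the surface-met statistics listed above (not METSTAT
    itself): mean bias, mean absolute (gross) error, RMSE, and Willmott's
    index of agreement (IOA), over paired model/observation values."""
    n = len(obs)
    obar = sum(obs) / n
    diffs = [m - o for m, o in zip(model, obs)]
    bias = sum(diffs) / n
    error = sum(abs(d) for d in diffs) / n
    rmse = math.sqrt(sum(d * d for d in diffs) / n)
    denom = sum((abs(m - obar) + abs(o - obar)) ** 2
                for m, o in zip(model, obs))
    ioa = 1.0 - sum(d * d for d in diffs) / denom if denom else 1.0
    return {"bias": bias, "error": error, "rmse": rmse, "ioa": ioa}

# Hypothetical hourly 2-m temperatures (degrees C), for illustration only:
stats = metstat_metrics(model=[10.2, 12.1, 14.0, 13.5],
                        obs=[10.0, 12.5, 13.0, 14.0])
print(stats)
```

IOA ranges from 0 to 1, with 1 indicating perfect agreement, which is why it is convenient for comparing runs against the benchmark set.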
Evaluation of 36-km WRAP MM5 Results
The model performed reasonably well for the eastern subdomains, but not for the west (WRAP region):
– General cool, moist bias in the western US
– Difficulty resolving western US orography? May get better performance with higher resolution
– Pleim-Xiu scheme optimized more for the eastern US? More optimization may be needed for desert and rocky ground
MM5 performs better in winter than in summer (weaker forcing in summer); the July 2002 Desert SW subdomain exhibits a low temperature and high humidity bias.
Source: "2002 MM5 Model Evaluation: 12 vs. 36 km Results," Chris Emery, Yiqin Jia, Sue Kemball-Cook, and Ralph Morris (ENVIRON International Corporation) & Zion Wang (UCR CE-CERT), Western Regional Air Partnership (WRAP) National RPO Meeting, May 25, 2004
WRAP 36-km/12-km July Wind Performance Comparison
[Scatter chart: wind speed RMSE (0–3.5 m/s) vs. wind direction error (0–120 degrees) for the 36-km and 12-km DesertSW, North, SW, and PacNW subdomains, against the benchmark from prior MM5/RAMS runs]
Additional MM5 Testing
– The RMC is continuing to test alternative MM5 configurations, to be completed by the end of 2004.
– Final MM5 results will be used with the final 2002 emissions inventory, beginning in early 2005.
Emissions Inventory Summary
– Preliminary 2002 Scenario C is based on the 1996 NEI, grown to 2002, with many updates by WRAP contractors and other RPOs
– Processed for CMAQ using SMOKE
– Extensive QA plots on the web page (both SMOKE QA and post-SMOKE QA)
Emissions Sources by Category & RPO [chart]
WRAP 2002 Annual NOx Emissions
[Pie chart by source category: Area, Biogenic, On-Road, Non-Road, Road Dust, Point, Rx Fire, Ag Fire, Wildfire, Offshore]
2002 WRAP NOx Emissions by Source & State
[Stacked bar chart, 0–1,400,000 tons/yr, for the 13 WRAP states (Arizona, California, Colorado, Idaho, Montana, Nevada, New Mexico, North Dakota, Oregon, South Dakota, Utah, Washington, Wyoming); categories: Ag Fire, Rx Fire, Wildfire, Area, Point, Nonroad, Onroad]
WRAP 2002 Annual SO2 Emissions
[Pie chart by source category: Area, Biogenic, On-Road, Non-Road, Road Dust, Point, Rx Fire, Ag Fire, Wildfire, Offshore]
2002 WRAP SO2 Emissions by Source & State
[Stacked bar chart, 0–300,000 tons/yr, for the 13 WRAP states; categories: Onroad, Ag Fire, Rx Fire, Wildfire, Area, Nonroad, Point]
2002 WRAP NH3 Emissions by Source Category
[Stacked bar chart, 0–250,000 tons/yr, for the 13 WRAP states; categories: Nonroad, Ag Fire, Rx Fire, Point, Onroad, Wildfire, Area]
Emissions Summary
Preliminary 2002 version C EI used here. The next iteration, version D, will include:
– New EI data from other RPOs
– New NH3 EI
– Fugitive dust model
The final 2002 EI will include:
– 2002 NEI
– Reprocessing in SMOKE using the final MM5
– Canada point source emissions
CMAQ Simulations
– CMAQ v4.3, 36-km grid, 112 x 148 x 19, annual run
– CB4 chemistry
– Evaluated using IMPROVE, CASTNet, NADP, STN, and AIRS/AQS
– Boundary conditions from the 2001 GEOS-CHEM global model (Jacob et al.)
PM Performance Criteria
EPA guidance is not yet available, which makes it difficult to assert that the model is adequate. We therefore use a variety of ad hoc performance goals and benchmarks to put the CMAQ results in context.
Goal of Model Evaluation
We completed a variety of analyses:
– Computed over 20 performance metrics
– Scatter plots & time-series plots
– Soccer plots
– Bugle plots
The goal is to decide whether we have enough confidence to use the model for designing emissions control strategies: is this a valid application of the model?
Soccer Goal Plots
Plot error as a function of bias.
Ad hoc performance goal:
– 15% bias, 35% error, based on O3 modeling goals
– Possibly too demanding for PM and for clean western conditions
– Larger error and bias can exist among different PM measurement methods and monitoring networks
Performance benchmark:
– 30% bias, 70% error (2x the performance goals)
– PM models can achieve this level in many cases
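The soccer-plot test above amounts to checking whether a (bias, error) point falls inside nested boxes. A minimal sketch, with only the box edges taken from the slide and everything else illustrative:

```python
# "Soccer goal" boxes from the slide: goal = 15% bias / 35% error,
# benchmark = 30% bias / 70% error (2x the goal).
GOAL = (15.0, 35.0)        # (|bias| %, error %)
BENCHMARK = (30.0, 70.0)

def soccer_class(bias_pct, error_pct):
    """Classify one species/month point as 'goal', 'benchmark', or
    'outside' depending on the innermost box that contains it."""
    for name, (b, e) in (("goal", GOAL), ("benchmark", BENCHMARK)):
        if abs(bias_pct) <= b and error_pct <= e:
            return name
    return "outside"

print(soccer_class(-10.0, 30.0))   # -> goal
print(soccer_class(25.0, 60.0))    # -> benchmark
print(soccer_class(40.0, 80.0))    # -> outside
```

On the plot itself, points inside the inner box meet the ad hoc goal; points between the boxes meet only the benchmark.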
CMAQ vs. IMPROVE Summary
– SO4: negative bias in summer, positive bias in winter; good performance in spring and fall
– NO3: large negative bias in summer, large positive bias in winter; small bias but large error in March and October
– OC: large negative bias in summer, small positive bias in winter
– EC: good performance each month
– Coarse mass: generally large negative bias
– Soil: small bias most months, except a large positive bias in winter
– PM2.5 and PM10: CMAQ overpredicts in winter, underpredicts in summer; small bias in spring and fall
CMAQ vs. CASTNet Summary
– CMAQ performance is better against CASTNet (the longer averaging period helps) but shows the same trend as IMPROVE: overprediction in winter and underprediction in summer
– SO4 & NO3: large negative bias in summer, large positive bias in winter
– In summer both SO2 and SO4 are underpredicted; in winter both are overpredicted (thus the problem is not in partitioning)
– Total nitrate (NO3 + HNO3) performance is much better than aerosol nitrate performance, probably reflecting errors in sampling
CMAQ vs. STN Summary
– NO3: large negative bias each month
– SO4: negative bias in winter
– EC: positive bias in summer
– Generally good performance for other species, within the performance benchmarks
CMAQ vs. NADP Summary
– CMAQ overpredicts wet deposition of SO4, NO3, and NH4
– Generally small positive bias but large error terms
– Largest positive bias is in summer (opposite of the other networks)
Annual Average Metrics: CMAQ vs. IMPROVE [charts; panels for Spring, Summer, Fall, Winter]
Annual Average Metrics: CMAQ vs. CASTNet [charts; panels for Spring, Summer, Fall, Winter]
Annual CMAQ vs. STN [charts; panels for Spring, Summer, Fall, Winter]
Annual CMAQ vs. NADP [charts; panels for Spring, Summer, Fall, Winter]
WRAP 2002 CMAQ Pre02c Run: Monthly Analysis
Comparison of VISTAS CMAQ in VISTAS States to WRAP CMAQ in WRAP States
[Charts comparing VISTAS and WRAP CMAQ performance]
Summary of WRAP & VISTAS
– VISTAS sulfate performance is much better: southeast SO4 levels are much higher than in the WRAP region
– WRAP EC performance is better: the ordering of EC is reversed between IMPROVE and STN
– Coarse mass is lower in WRAP
– Similar performance for other species
Performance Goals and Criteria (Proposed by Jim Boylan)
– Based on MFB and MFE calculations
– Vary as a function of species concentrations
– Goals: FE ≤ +50% and FB within ±30%
– Criteria: FE ≤ +75% and FB within ±60%
– Less abundant species should have less stringent performance goals and criteria
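The MFB/MFE metrics that these goals and criteria are stated in can be sketched as below; the input values are hypothetical, and the handling of zero-sum pairs is an assumption of this sketch.

```python
def mfb_mfe(model, obs):
    """Mean fractional bias and mean fractional error, as percentages:
        MFB = (2/N) * sum((M - O) / (M + O)) * 100%
        MFE = (2/N) * sum(|M - O| / (M + O)) * 100%
    Pairs with M + O == 0 are skipped here (an assumption of this sketch).
    """
    pairs = [(m, o) for m, o in zip(model, obs) if (m + o) != 0]
    n = len(pairs)
    mfb = 200.0 / n * sum((m - o) / (m + o) for m, o in pairs)
    mfe = 200.0 / n * sum(abs(m - o) / (m + o) for m, o in pairs)
    return mfb, mfe

# A few hypothetical monitor-paired sulfate values (ug/m3):
mfb, mfe = mfb_mfe(model=[1.0, 2.0, 4.0], obs=[2.0, 2.0, 2.0])
print(mfb, mfe)
```

Because each term is normalized by (M + O)/2, MFB is bounded to ±200% and MFE to 0–200%, which keeps low-concentration months from dominating the statistic.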
[Charts: proposed PM performance goals and proposed PM performance criteria]
[Charts: monthly fractional bias and fractional error for SO4, NO3, NH4, OC, EC, and PM2.5]
CMAQ Versions & EI Versions
– This performance evaluation used CMAQ v4.3
– Previous CMAQ runs used CMAQ v4.3 with Preliminary 2002 B emissions (no fires)
– January & July test cases used CMAQ v4.4beta with Preliminary 2002 C emissions
CMAQ v4.3 & v4.4 versus IMPROVE, July [charts]
CMAQ Ozone Performance
– CMAQ v4.3 mean fractional bias (no filter): +25% in January, −20% in July
– Slightly worse January O3 performance in CMAQ v4.4beta
CMAQ Emissions B & C versus IMPROVE, Summer [charts]
2002 CMAQ Model Performance for Best and Worst 20% Days
– Observed and estimated extinction (B_ext) calculated at each WRAP IMPROVE site, using site-specific f(RH) adjustment factors
– Rank days by observed total extinction (Mm−1): B_Tot = B_SO4 + B_NO3 + B_OC + B_EC + B_Soil + B_CM + B_Ray
– Examine performance at each site for each component of extinction, averaged across the Worst 20% and Best 20% days
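The B_Tot sum above can be illustrated with the original IMPROVE reconstructed-extinction algorithm. The dry extinction efficiencies below follow that standard equation, but treat them and the example concentrations as illustrative; the slides do not state which coefficients were used.

```python
def reconstructed_bext(conc, f_rh):
    """Sketch of B_Tot = B_SO4 + B_NO3 + B_OC + B_EC + B_Soil + B_CM
    + B_Ray (Mm-1) using the original IMPROVE algorithm. `conc` holds
    ammonium sulfate, ammonium nitrate, organic mass, EC, soil, and
    coarse-mass concentrations in ug/m3; `f_rh` is the site-specific
    hygroscopic growth factor for the day."""
    b_so4 = 3.0 * f_rh * conc["ammSO4"]   # sulfate (hygroscopic)
    b_no3 = 3.0 * f_rh * conc["ammNO3"]   # nitrate (hygroscopic)
    b_oc = 4.0 * conc["OMC"]              # organic mass
    b_ec = 10.0 * conc["EC"]              # elemental carbon
    b_soil = 1.0 * conc["soil"]           # fine soil
    b_cm = 0.6 * conc["CM"]               # coarse mass
    b_ray = 10.0                          # Rayleigh scattering
    return b_so4 + b_no3 + b_oc + b_ec + b_soil + b_cm + b_ray

# Hypothetical clean western site on a day with f(RH) = 2.2:
bext = reconstructed_bext(
    {"ammSO4": 1.0, "ammNO3": 0.5, "OMC": 1.0, "EC": 0.2,
     "soil": 0.5, "CM": 2.0}, f_rh=2.2)
print(bext)
```

Ranking each site's days by this observed B_Tot and slicing off the top and bottom 20% gives the Worst/Best day sets evaluated on the following slides.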
Model Performance for Average of Worst 20% Days at WRAP IMPROVE Sites, Preliminary 2002 CMAQ Simulation [charts: SO4, NO3, OC, EC (Kalmiopsis noted); Coarse Matter (CM) and Soil (Kalmiopsis, Phoenix, and Saguaro noted)]
Model Performance for Average of Best 20% Days at WRAP IMPROVE Sites, Preliminary 2002 CMAQ Simulation [charts: SO4, NO3, OC, EC; Coarse Matter (CM) and Soil]
Extinction (Mm−1) model performance for average of Worst 20% observed days [charts]: Grand Canyon NP, AZ; Chiricahua NM, AZ; Bandelier NM, NM; Rocky Mountain NP, CO; Yellowstone NP, WY; Glacier NP, MT; Mount Rainier NP, WA; Kalmiopsis, OR; Point Reyes, CA; San Gorgonio, CA
Extinction (Mm−1) model performance for average of Best 20% observed days [charts]: Grand Canyon NP, AZ; Yellowstone NP, WY
Conclusions
– Positive bias in winter, negative bias in summer
– CMAQ meets the "benchmark" goals for most species and networks
– Disagreements exist among the different monitoring networks
– Negative bias for summer O3 might contribute to the negative PM bias
Next Steps
Analysis of CMAQ performance on the best & worst days is still in progress:
– We expect CMAQ will tend to overpredict the lows and underpredict the highs
– We recommend using CMAQ results unpaired in time for each month or season
Open questions:
– Is this set of Emissions/MM5/CMAQ adequate for developing emissions control strategies?
– Will we get performance improvements with the new MM5 and new EI? A new NH3 EI might also improve performance.