Western Air Quality Study (WAQS) Intermountain Data Warehouse (IWDW) Model Performance Evaluation CAMx and CMAQ 2011b University of North Carolina (UNC-IE)


1 Western Air Quality Study (WAQS) Intermountain Data Warehouse (IWDW) Model Performance Evaluation: CAMx and CMAQ 2011b. University of North Carolina (UNC-IE); Ramboll-Environ (Environ). January 12, 2016, IWDW-WAQS Technical Committee Call.

2 Base 2011b MPE Outline: simulation Base11b specs; 4-km domain-wide performance stats; state-wide performance stats; site-specific performance stats; IWDW MPE Image Browser.

3 WAQS 2011b Simulations and MPE: Operational Statistics. Models: CAMx v6.10 (CB6r2), 32 cores, hybrid OpenMP/MPI; CMAQ v5.0.2 (CB05), 32 cores, MPI (mvapich2); SMOKE v3.5.1. Run time (36/12/4-km annual simulation): CAMx 40 days wall clock; CMAQ 69.5 days wall clock. Output data volumes (36/12/4-km annual simulation): CAMx 8.8 TB; CMAQ 20 TB.
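For context, the run-time and output figures above translate into a rough throughput comparison. A minimal sketch, assuming the annual simulation covers 365 days (an assumption; the exact run length is not stated on the slide):

```python
# Rough throughput and output-volume comparison from the wall-clock totals
# quoted above. SIM_DAYS = 365 is an assumption about the annual run length.
SIM_DAYS = 365

runs = {
    "CAMx v6.10 (32 cores, OpenMP/MPI)": {"wall_days": 40.0, "output_tb": 8.8},
    "CMAQ v5.0.2 (32 cores, MPI)": {"wall_days": 69.5, "output_tb": 20.0},
}

for name, r in runs.items():
    throughput = SIM_DAYS / r["wall_days"]             # simulated days per wall-clock day
    gb_per_simday = r["output_tb"] * 1024 / SIM_DAYS   # average output per simulated day
    print(f"{name}: {throughput:.1f} sim-days/day, {gb_per_simday:.0f} GB per simulated day")
```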

4 WAQS 2011b Simulations and MPE: CAMx vs CMAQ.

5 WAQS Simulation Base11b MDA8 Ozone Performance: All AQS sites, 4-km Domain (DRAFT, DO NOT CITE). Comparing 2011a/b for CMAQ/CAMx. All simulations are within performance goals in all months. CAMx has lower bias in January-March; CMAQ has lower bias in the summer and in December. Performance is similar for each model between 2011a and 2011b.
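The ozone statistics behind these comparisons are built from MDA8 values (the daily maximum 8-hour average) paired with observations. The sketch below shows one way to compute MDA8 and a normalized mean bias from an hourly series; it simplifies the EPA completeness and start-hour rules, and the function names are illustrative rather than the WAQS MPE code:

```python
import pandas as pd

def mda8(hourly_o3: pd.Series) -> pd.Series:
    """MDA8 ozone from an hourly series with a DatetimeIndex. Each 8-hour
    mean is labeled by its start hour; >=6 valid hours per window are
    required (a simplification of the EPA data-completeness rules)."""
    avg8 = hourly_o3.rolling(window=8, min_periods=6).mean().shift(-7)
    return avg8.resample("D").max()

def normalized_mean_bias(model: pd.Series, obs: pd.Series) -> float:
    """NMB (%) over paired, non-missing model/observation values."""
    paired = pd.concat({"mod": model, "obs": obs}, axis=1).dropna()
    return 100.0 * (paired["mod"] - paired["obs"]).sum() / paired["obs"].sum()
```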

6 WAQS Simulation Base11b MDA8 Ozone Performance: All CASTNet sites, 4-km Domain (DRAFT, DO NOT CITE). CAMx has lower bias in the first half of the year; the two models are similar through the rest of the year. On average, CMAQ estimates lower O3 concentrations than CAMx across the domain.

7 WAQS Simulation Base11b Hourly NO2 Performance: All AQS sites, 4-km Domain (DRAFT, DO NOT CITE). High NO2 biases are reduced in Base11b at AQS sites across the 4-km domain. NO2 is still overestimated in most months. Domain-wide NO2 biases are lower in CMAQ than in CAMx.

8 WAQS Simulation Base11b Hourly CO Performance: All AQS sites, 4-km Domain (DRAFT, DO NOT CITE). CO performance is similar in Base11a vs Base11b. High winter CO positive biases in CAMx persist in Base11b.

9 WAQS Simulation Base11b Daily Max Total PM2.5 Performance: All sites, 4-km Domain (DRAFT, DO NOT CITE). Total PM2.5 is lower in simulation Base11b, reducing the overestimates seen in Base11a. Zeroing the dust boundary conditions reduces total PM2.5 in all months but penalizes rural model performance outside the winter months. Panels: CSN and IMPROVE sites.
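PM2.5 bias summaries of this kind are often expressed as fractional bias and fractional gross error; whether these exact metrics underlie the WAQS plots is an assumption. A minimal sketch:

```python
import numpy as np

def fractional_bias_error(model, obs):
    """Fractional bias (FB) and fractional gross error (FE), in percent,
    over paired model/observation values. Illustrative only; not
    necessarily the metric set used in the WAQS MPE plots."""
    model, obs = np.asarray(model, float), np.asarray(obs, float)
    ratio = (model - obs) / ((model + obs) / 2.0)
    return 100.0 * ratio.mean(), 100.0 * np.abs(ratio).mean()
```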

10 WAQS Simulation Base11b Nitric Acid and Ammonia Performance: All sites, 4-km Domain (DRAFT, DO NOT CITE). CAMx simulates nitric acid well through August; CMAQ systematically overestimates nitric acid. Ammonia is still severely underestimated by both models. Panels: CASTNet HNO3 and AMoN NH3.

11 WAQS Simulation Base11b Wet Deposition Performance: All sites, 4-km Domain (DRAFT, DO NOT CITE). Wet deposition species are all underestimated. These plots do not include adjustments for the bias in the modeled precipitation. Panels: SO4, NO3, and NH4 wet deposition.
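One common way to account for precipitation bias (not necessarily the adjustment the WAQS team has in mind) is to scale the modeled wet deposition by the observed-to-modeled precipitation ratio before computing bias, so that deposition error is not dominated by precipitation error. A hedged sketch:

```python
def precip_adjusted_bias(mod_dep, obs_dep, mod_precip, obs_precip):
    """Illustrative precipitation-ratio adjustment for wet deposition bias:
    scale modeled deposition by obs/model precipitation, then compute a
    percent bias. One common approach, not the documented WAQS procedure."""
    adj_mod_dep = mod_dep * (obs_precip / mod_precip)
    return 100.0 * (adj_mod_dep - obs_dep) / obs_dep
```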

12 WAQS Simulation Base11b MDA8 Ozone Performance: Colorado AQS Sites (DRAFT, DO NOT CITE). Although CAMx estimates higher O3 on average, CMAQ estimates higher peaks.

13 WAQS Simulation Base11b MDA8 Ozone Performance: Utah AQS Sites (DRAFT, DO NOT CITE). CAMx does better in the winter than CMAQ. Winter O3 is slightly lower in Base11b than in Base11a.

14 WAQS Simulation Base11b MDA8 Ozone Performance: Wyoming AQS Sites (DRAFT, DO NOT CITE). The Q-Q plot distorts the apparent CMAQ performance at the upper tail of the observations by pairing summer ozone concentrations with winter ozone observations.
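The distortion arises from how a Q-Q plot is built: both series are sorted independently and paired by rank rather than by time, so the highest model values (summer) are plotted against the highest observations (winter at these sites). A minimal sketch:

```python
import numpy as np

def qq_pairs(model, obs):
    """Quantile-quantile pairs: sort each series independently and pair by
    rank. Time matching is discarded, which is why a summer model value can
    end up paired with a winter observation."""
    return np.sort(np.asarray(obs)), np.sort(np.asarray(model))
```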

15 WAQS Simulation Base11b MDA8 Ozone Performance: New Mexico AQS Sites (DRAFT, DO NOT CITE). Both models estimate too much O3 in the last seven months of the year; these biases stand out in the performance at the NM AQS sites.

16 WAQS Simulation Base11b MDA8 Ozone Performance: Individual Sites.

17 WAQS Simulation Base11b MDA8 Ozone Performance: Gothic, CO (DRAFT, DO NOT CITE). CAMx estimates too much ozone at Gothic. Both models fail to capture any of the NAAQS violations in May-June, possibly due to missed long-range (LR) transport or stratosphere-troposphere exchange (STE) events.

18 WAQS Simulation Base11b MDA8 Ozone Performance: Mesa Verde, CO (DRAFT, DO NOT CITE). CAMx and CMAQ simulate ozone well at Mesa Verde. Both models miss 4 of 5 elevated ozone events. The models simulate the annual and daily trends well.

19 WAQS Simulation Base11b MDA8 Ozone Performance: Rocky Flats North, CO (DRAFT, DO NOT CITE). Ozone season performance. Skill plots highlight each model's ability to simulate NAAQS violations. CAMx has more "hits" and fewer "misses" than CMAQ, but also more "false alarms".
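The skill plots are contingency-table summaries of NAAQS exceedances. A minimal sketch of the hit/miss/false-alarm counts, assuming a 75 ppb MDA8 threshold (the standard in force for 2011; the threshold actually used in the WAQS plots is an assumption):

```python
import numpy as np

def exceedance_skill(model_mda8, obs_mda8, threshold=75.0):
    """2x2 contingency counts of MDA8 threshold exceedances. The 75 ppb
    default is an assumption about the NAAQS level behind the skill plots."""
    mod_ex = np.asarray(model_mda8) > threshold
    obs_ex = np.asarray(obs_mda8) > threshold
    return {
        "hits": int(np.sum(mod_ex & obs_ex)),
        "misses": int(np.sum(~mod_ex & obs_ex)),
        "false_alarms": int(np.sum(mod_ex & ~obs_ex)),
        "correct_nulls": int(np.sum(~mod_ex & ~obs_ex)),
    }
```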

20 WAQS Simulation Base11b MDA8 Ozone Performance: Rocky Flats North, CO (DRAFT, DO NOT CITE). Ozone season performance; diurnal and day-of-week (DOW) plots. Both models generally capture the diurnal profile of O3, apart from excessive titration during the afternoon rush hour. CAMx captures the day-of-week profile better than CMAQ.
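The diurnal and DOW composites are hour-of-day and day-of-week averages of the paired hourly data. A minimal sketch, assuming a DataFrame with a DatetimeIndex and 'obs'/'model' columns (an assumed layout, not the IWDW data format):

```python
import pandas as pd

def diurnal_and_dow_profiles(df: pd.DataFrame):
    """Hour-of-day (0-23) and day-of-week (0=Mon..6=Sun) mean profiles of
    the 'obs' and 'model' columns; the column names are assumptions."""
    diurnal = df.groupby(df.index.hour).mean()
    dow = df.groupby(df.index.dayofweek).mean()
    return diurnal, dow
```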

21 WAQS Simulation Base11b MDA8 Ozone Performance: Canyonlands, UT (DRAFT, DO NOT CITE). CAMx and CMAQ simulate ozone well at Canyonlands. Both models missed 2 of 3 high ozone events. The models simulate the annual and daily trends well.

22 WAQS Simulation Base11b MDA8 Ozone Performance: Hawthorn, UT (DRAFT, DO NOT CITE). Ozone season performance. Both models underestimate the observed ozone concentrations. While the average performance is reasonable, the observed temporal trends are not captured.

23 WAQS Simulation Base11b MDA8 Ozone Performance: Hawthorn, UT (DRAFT, DO NOT CITE). Nighttime ozone is severely underestimated by CAMx. The day-of-week profiles are also poorly represented. Because this site is near secondary and local roads, these trends point to problems with the on-road mobile emissions.

24 WAQS Simulation Base11b MDA8 Ozone Performance: Navajo Lake, NM (DRAFT, DO NOT CITE). Ozone season performance. Both models tend to overestimate the observed ozone concentrations. Several high observed ozone events are missed.

25 WAQS Simulation Base11b MDA8 Ozone Performance: Navajo Lake, NM (DRAFT, DO NOT CITE). The diurnal trends are similar in both models and generally good. The CAMx day-of-week profile is too flat. The large discrepancy between the two models' day-of-week profiles suggests the performance difference is not driven by anthropogenic emissions.

26 WAQS Simulation Base11b MDA8 Ozone Performance: Thunder Basin, WY (DRAFT, DO NOT CITE). Although both models are low in January-February, they tend to overestimate ozone at this site. Seasonal trends are captured well.

27 WAQS Simulation Base11b MDA8 Ozone Performance: Pinedale, WY (DRAFT, DO NOT CITE). This is a high winter ozone site. Both models miss the high ozone events at this site. During the rest of the year, CAMx tends to overestimate and CMAQ to underestimate ozone.

28 WAQS Simulation Base11b MDA8 Ozone Performance: Pinedale, WY (DRAFT, DO NOT CITE). Winter season performance. CAMx captures the diurnal trend fairly well, although with a low bias. The day-of-week trends are missed by both models. CMAQ is systematically much too low.

29 WAQS Simulation Base11b Winter PM2.5 Performance: CSN sites, 4-km Domain (DRAFT, DO NOT CITE). Winter OC at urban CSN sites is reduced in Base11b, an improvement over Base11a, although the models still overestimate OC. Dust reductions improve overall CSN performance in Base11b vs Base11a. CAMx and CMAQ Base11b performance is similar. Panels: CAMx B11a vs B11b; B11b CAMx vs CMAQ.

30 WAQS Simulation Base11b Spring PM2.5 Performance: IMPROVE sites, 4-km Domain (DRAFT, DO NOT CITE). Significant dust (PM Other) reductions improve overall IMPROVE performance in Base11b vs Base11a, although the models now underestimate dust, NO3, and OC. CAMx and CMAQ Base11b performance is similar. Panels: CAMx B11a vs B11b; B11b CAMx vs CMAQ.

31 WAQS Simulation Base11b Total PM2.5 Performance: Rocky Mountain National Park, CO (DRAFT, DO NOT CITE). The dust boundary-condition error was over-adjusted. OC is too low in both models.

32 WAQS Simulation Base11b Total PM2.5 Performance: Canyonlands, UT (DRAFT, DO NOT CITE). The dust boundary-condition error was over-adjusted. NO3 and NH4 are too low in the spring and summer.

33 WAQS Simulation Base11b Total PM2.5 Performance: Bridger, WY (DRAFT, DO NOT CITE). The dust boundary-condition error was over-adjusted. Otherwise, the model performance is good at this site.

34 WAQS Simulation Base11b Total PM2.5 Performance: Bandelier, NM (DRAFT, DO NOT CITE). A large fire event that impacted the monitor in early July was included in the models. Model performance is good at this site.

35 WAQS Simulation Base11b Regional Haze: 20% Worst Days (DRAFT, DO NOT CITE). CAMx vs CMAQ on the 20% worst days. CAMx has too much sea salt. Both models are low for OC, NO3, and SO4.

36 WAQS Simulation Base11b Regional Haze: 20% Worst Days, comparison to Base11a (DRAFT, DO NOT CITE). CAMx Base11b extinction decreased relative to Base11a, driven by SO4 and soil; performance degrades in Base11b. CMAQ Base11b extinction increased relative to Base11a, driven by organic PM; performance improves in Base11b.
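The extinction comparisons map species mass to light extinction with an IMPROVE-style reconstruction. The sketch below uses the original IMPROVE coefficients plus a revised-equation sea salt term; the WAQS plots may use the revised algorithm instead, so treat the coefficients as illustrative:

```python
def reconstructed_bext(so4, no3, oc, ec, soil, cm, sea_salt=0.0, f_rh=1.0):
    """IMPROVE-style reconstructed light extinction (Mm^-1) from species
    mass concentrations (ug/m^3). Original-equation coefficients; the sea
    salt term comes from the revised equation. f_rh is the hygroscopic
    growth factor. Illustrative, not necessarily the WAQS formulation."""
    return (3.0 * f_rh * 1.375 * so4   # ammonium sulfate
            + 3.0 * f_rh * 1.29 * no3  # ammonium nitrate
            + 4.0 * 1.4 * oc           # organic mass (1.4 x OC)
            + 10.0 * ec                # elemental carbon
            + 1.0 * soil               # fine soil
            + 0.6 * cm                 # coarse mass
            + 1.7 * f_rh * sea_salt    # sea salt (revised-equation term)
            + 10.0)                    # Rayleigh scattering
```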

37 WAQS Simulation Base11b MPE Next Steps: additional MPE plots on the IWDW Model Performance Image Browser; MPE draft report by January 2016; release the platform in February 2016; recommendations discussion.

