Slide 1: Corroborative and Weight-of-Evidence Development and Analyses (11-08CCOS)
Envair: Charles Blanchard, Shelley Tanenbaum
Alpine Geophysics: James Wilkinson
May 29, 2012
Slide 2: Overview of Presentation
– Project objectives
– Weight-of-evidence framework
– Trends in emissions, O3 precursors, and O3
– Generalized additive model (GAM)
– Using the GAM to project future O3 response
– (Uncertainty analyses and design value variability)
– (Model demonstration after presentation)
Slide 3: Objectives
– Identify new methods to reduce uncertainty in O3 attainment demonstrations
– Develop and demonstrate new methods for weight-of-evidence evaluation:
  – enhance confidence in future-year projections
  – provide additional evidence for the effectiveness of VOC and NOx emission reductions
  – further assess local and regional influences
– Provide data, software, and documentation suitable for ongoing Study Agency use
Slide 4: Weight-of-Evidence Framework
– Corroborate VOC and NOx emission reductions by comparison with trends in ambient NMOC and NOx concentrations
– Quantify O3 reductions
– Link O3 reductions to observed ambient VOC (NMOC) and NOx concentrations
– Project O3 response(s) to future emission reductions and precursor concentrations
– Reconcile weight-of-evidence analyses with modeling predictions
Slide 5: Our Study Domain – Central California, Showing O3 Monitoring Sites in 15 Subregions
Slide 6: Ambient CO, NO, NO2, and NMOC Trends Are Downward and Significant
[Figure: Central San Joaquin Valley (CSJ) trends – mean 7–10 a.m. CO (Fresno 1st), mean 7–10 a.m. NO2, and mean daily max NO, all sites]
Slide 7: Ambient CO Trends Are Consistent With Emission Trends in Most Subregions
Slide 8: Ambient NO2 Trends Exceed NOx Emission Trends in Some Subregions
Slide 9: Ambient NMOC Trends Exceed VOC Emission Trends, but the Data Record Is Limited
Slide 10: CSJ Peak 8-Hour O3 Metrics Comparison
Slide 11: Downward Trends in Peak 8-Hour O3
Slide 12: Conclusions from Trends Analyses
– Ambient precursor trends confirm emission reductions
– Peak 8-hour O3 is trending downward at ~0.2–0.7 ppbv per year, with the exceptions of CBA (upward) and SCC (downward at 1.5–2 ppbv per year)
– The top 10% of days and the top 60 days per subregion per year provide good study subsets: their trends are relevant to the 4th-highest value and to the subregion mean daily excess above 75 ppbv
– Season-average subregion daily maximum peak 8-hour O3 is also a useful metric
Slide 13: Generalized Additive Model (GAM)
– GAM developed by U.S. EPA to determine meteorologically adjusted O3 trends
– We adapted the GAM to link peak 8-hour O3 to ambient NO, NO2, and other precursors while accounting for the influence of weather
– Tested many meteorological and air-quality variables as predictors; the focus here is on the final model
– We developed estimates of uncertainty and an approach for projecting future O3 response
Slide 14: EPA Published the GAM in Atmospheric Environment, 2007
– The area of application was the eastern US
– EPA used the GAM to determine meteorologically adjusted O3 trends
– The GAM generates the sensitivity of O3 to each predictor variable
Slide 15: The Basic GAM
The model says that predicted log O3 on day "i" of year "k" is an additive function of:
– the overall mean (μ) of data from all days of all years,
– mean effects: Y = year, W = day of week, J = Julian day,
– contributions from nonlinear functions, f, of meteorological and air-quality predictor variables.

log(O3)ik = μ + Yk + Wd + f1(Ji) + f2(xik) + …

– The log transform of O3 is useful but optional
– Flexible choice of functions fi:
  – the GAM is set up to use natural splines
  – natural splines are (special) cubic polynomials
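The additive structure above can be sketched numerically. This is a toy Python illustration, not the study's fitted model: every effect value and the stand-in smooth term are hypothetical, chosen only to show how the terms combine on the log scale and back-transform.

```python
import numpy as np

# All values below are hypothetical illustrations, not fitted coefficients.
mu = np.log(60.0)                             # overall mean of log(O3), ppbv
year_effect = {2009: -0.05, 2010: -0.08}      # Y_k: annual mean effects
weekday_effect = {"Wed": 0.01, "Sun": -0.03}  # W_d: day-of-week effects

def f_temp(t_max_c):
    # Stand-in for one smooth term f(x): O3 rises with daily max temperature.
    # The real model uses natural splines rather than a straight line.
    return 0.02 * (t_max_c - 25.0)

def predict_o3(year, weekday, t_max_c):
    # Additive prediction on the log scale, then back-transform.
    log_o3 = mu + year_effect[year] + weekday_effect[weekday] + f_temp(t_max_c)
    return float(np.exp(log_o3))
```

A spline term would replace `f_temp` in practice, but the bookkeeping – sum the mean, the categorical effects, and the smooth terms, then exponentiate – is the same.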
Slide 16: Programming Aspects of the GAM
– Original software written by EPA as an R program (R is nonproprietary, freely available, and runs under Linux, Windows, and macOS)
– We modified the program to generate output files:
  – graphs of annual average trends (various formats)
  – statistical summaries (text files)
  – daily data linking O3 to predictors (CSV files)
– Output files can be manipulated to select subsets of data and develop projections
Slide 17: GAM Application to Central California
Predict subregion maximum daily peak 8-hour O3:
– find the daily peak 8-hour O3 for each site
– take the maximum site value for each day
Meteorological variables:
– daily max T; 10 a.m.–4 p.m. RH; 7–10 a.m. and 1–4 p.m. WS and WD; HYSPLIT 24-hour back-trajectory distance and direction; solar radiation; 850 mb T; delta (850 mb T minus surface min T); pressure gradients
– tested: precipitation, 925 mb T, lagged met data
Air-quality variables:
– subregion mean daily max NO; 7–10 a.m. NO2
– tested: CO, NMOC, visibility, PM TC
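The two-step construction of the response variable can be sketched as follows. This is a minimal Python illustration assuming 24 hourly values per site-day; it ignores the completeness rules and midnight-crossing windows that a regulatory calculation would handle.

```python
import numpy as np

def daily_peak_8h(hourly):
    """Peak 8-hour O3 for one site-day: the maximum of the running
    8-hour means over 24 hourly values (simplified sketch)."""
    hourly = np.asarray(hourly, dtype=float)
    means = [hourly[i:i + 8].mean() for i in range(len(hourly) - 7)]
    return float(max(means))

def subregion_daily_max(site_hourlies):
    """Response variable for one day: the maximum, across a subregion's
    sites, of each site's daily peak 8-hour O3."""
    return max(daily_peak_8h(h) for h in site_hourlies)
```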
Slide 18: Data Used for the Application
– 1995–2010 O3 season (March–October): 3,920 days (3,424–3,661 days of data available)
– One surface meteorological site per subregion (Redding, Sacramento, etc.); HYSPLIT also run for each surface met site
– Nearest upper-air site (Medford, Oakland, San Diego)
– Means of CIMIS data in each subregion
– Means of NO and NO2 data in each subregion (CO and NMOC data tested, not in the final model)
– IMPROVE data in each subregion (tested)
Slide 19: CIMIS Sites
Slide 20: NMOC Data Limitations
– Inconsistencies between measurement methods, changes in methods, and incomplete canister sampling; the longest consistent record is for continuous NMOC coded as Method 164 (TEI 55 instrument)
– 14 NMOC sites in 8 subregions: 5 Bay Area sites with 5–6 years of data, plus 9 other sites with 11–13 years
– Variability of the NMOC data is greater than that of the CO, NO, and NO2 measurements
Slide 21: GAM Results
– Fit
– Sensitivity coefficients
– Factors contributing to high O3
– Projections
– Uncertainty
Slide 22: [Figure panels: Coastal; Bay Area; SJV and Sequoia; Sacramento Valley & Sierra]
Slide 23: Which Variables Are Important?
(Higher values of the F-to-remove statistic indicate greater importance)
Slide 24: Sensitivity to Daily Max Temperature
[Figure panels: Sacramento Valley and Sierra; Bay Area; SJV and Sequoia; Coastal]
Slide 25: Sensitivity to Mid-day RH
[Figure panels: Sacramento Valley and Sierra; Bay Area; SJV and Sequoia; Coastal]
Slide 26: Sensitivity to 850 mb Temperature
[Figure panels: Sacramento Valley and Sierra; Bay Area; SJV and Sequoia; Coastal]
Slide 27: Sensitivity to Daily Max NO
[Figure panels: Sacramento Valley and Sierra; Bay Area; SJV and Sequoia; Coastal]
Slide 28: Sensitivity to 7–10 a.m. NO2
[Figure panels: Sacramento Valley and Sierra; Bay Area; SJV and Sequoia; Coastal]
Slide 29: Sensitivity to Day of Week
Slide 30: Declining NO2 Has Reduced Peak O3
Slide 31: Declining NO Has Increased Peak O3
Slide 32: Net Effect of Declining NOx Has Been to Decrease Mean Peak 8-Hour O3
Slide 33: Net NOx Effect Is Robust to Changes in Model Formulation
Slide 34: Higher Peak O3 Is Related to Stagnation (Shorter Transport Distances)
[Figure panels: a. WBA; b. CBA; c. EBA]
Slide 35: Multiple Factors Enhance Peak O3 on High O3 Days (Top 60)
Slide 36: Precursor Reductions Lowered O3 in CSJ
[Figure: CSJ, top 60 days per year]
Slide 37: Precursor Reductions Lowered O3 in NSJ
[Figure: NSJ, top 60 days per year]
Slide 38: Projecting Future Progress
– Method I: combine annual O3 sensitivities to NOx with projections of NOx emissions
– Method II: combine daily O3 sensitivities to NOx with projected ambient NOx concentrations generated from synthetic data
– Implicit assumption in both methods: the VOC/NOx ratio remains constant or follows trends similar to historical trends
Slide 39: Projection Method I
Project historical trend lines to estimate the effects of future basin NOx emissions.
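The trend-line projection in Method I amounts to a least-squares fit extrapolated forward. A minimal Python sketch, using a synthetic annual series of the NOx-attributable O3 effect (the values and the 0.35 ppbv/yr slope are hypothetical, not the study's numbers):

```python
import numpy as np

# Hypothetical annual NOx-attributable O3 effect (ppbv), 1995-2010,
# declining linearly for illustration.
years = np.arange(1995, 2011)
effect = 12.0 - 0.35 * (years - 1995)

# Fit the historical trend line, then extrapolate to 2011-2020.
slope, intercept = np.polyfit(years, effect, 1)
future_years = np.arange(2011, 2021)
projected = intercept + slope * future_years
```

In practice the projected trend would be scaled or checked against the basin NOx emission projections mentioned on slide 38.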
Slide 40: Projection Method II
– Use 2008–2010 as the base period, with daily monitoring data and daily R sensitivities
– For each month and day of week, remove the date with the highest NO2 (for ties, remove the date with the highest NO)
– For each month and day of week, retain 5 dates by random selection
– Recode the data as 2011
– Repeat these steps to generate 2012–2020
– Aggregate the daily sensitivities to NO, NO2, and NOx
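The resampling step for one synthetic year can be sketched as below. The record field names (`date`, `month`, `dow`, `no2`, `no`) are hypothetical; the logic follows the slide's procedure of dropping the highest-NO2 date per (month, day-of-week) group and retaining 5 dates at random.

```python
import random
from collections import defaultdict

def synthetic_year(records, n_keep=5, seed=0):
    """One Method II step (sketch). Within each (month, day-of-week)
    group: drop the date with the highest NO2 (ties broken by highest
    NO), then retain up to n_keep dates at random."""
    rng = random.Random(seed)
    groups = defaultdict(list)
    for r in records:
        groups[(r["month"], r["dow"])].append(r)
    kept = []
    for grp in groups.values():
        # Sort ascending by (NO2, NO) and slice off the last element,
        # i.e. the highest-NO2 date with NO as the tiebreaker.
        grp = sorted(grp, key=lambda r: (r["no2"], r["no"]))[:-1]
        rng.shuffle(grp)
        kept.extend(grp[:n_keep])
    return kept
```

Running this repeatedly, relabeling the output as 2011, 2012, …, 2020, yields the synthetic concentration series whose daily sensitivities are then aggregated.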
Slide 41: NO2 and NO Concentrations "Continue" Declining at Historical Rates
Slide 42: Decreasing NOx Concentrations Will Continue to Decrease Peak O3
Slide 43: Decreasing Peak O3 on High O3 Days
[Figure: NSJ high O3 days (top 60)]
Slide 44: Compare and Contrast Modeling and Weight-of-Evidence Analyses
– Need to consider prediction uncertainties for both modeling and weight-of-evidence analyses
– GAM uncertainties quantified in two ways:
  – parameter standard errors from R
  – bootstrap uncertainties
– A design-value variability assessment was used to characterize one type of modeling uncertainty
Slide 45: GAM Prediction Uncertainties
– Parameter standard errors are computed for each day by R, but are they realistic?
– Tested using bootstrap uncertainties:
  – leave out one year at a time (16 combinations)
  – leave out one group of meteorological variables at a time (10 combinations)
  – add air-quality variables one at a time (4 combinations)
  – generate variances from each perturbation family
  – sum the variances
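The final two bullets reduce to simple arithmetic: compute a variance across each family of refits, then sum the components and take the square root. A small Python sketch with hypothetical helper names:

```python
import statistics

def refit_variance(estimates):
    """Variance of a coefficient across one family of refits
    (e.g., the 16 leave-one-year-out fits)."""
    return statistics.pvariance(estimates)

def combined_se(*variance_components):
    """Total bootstrap standard error: sum the variance components
    from each perturbation family, then take the square root."""
    return sum(variance_components) ** 0.5
```

Whether population or sample variance is appropriate here is a judgment call; the slide only specifies that the family variances are summed.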
Slide 46: Bootstrap and R Standard Errors Are Comparable and ~10% of the Coefficients
[Figure panels: NBA; SAC; NSJ; CSJ]
Slide 47: Annual Effects of NO, NO2, and NOx on Peak O3, with Uncertainties, CSJ
Slide 48: Design Value Variability Assessment – Baseline Design Values Vary 1–14 ppbv