Modeling & Monitoring Update


Modeling & Monitoring Update
Todd Rinck, EPA Region 4, APTMD
Fall 2016 Region 4 Air Directors' Meeting
St. Petersburg, Florida
November 9, 2016

Alternative Model Approvals
- Alternative model approval requirements are contained in EPA's Guideline on Air Quality Models (40 CFR Part 51, Appendix W, Section 3.2.2).
- EPA Regional Offices (ROs) approve alternative models, including the application of AERMOD beta options. Approvals must be made in consultation with, and with the concurrence of, the Model Clearinghouse, which allows for national consistency in approvals and transparency with stakeholders.
- NO2 Tier 3 proposals require EPA RO approval but not Model Clearinghouse concurrence.
- A December 10, 2015, EPA memorandum clarified how "beta options" in recommended/preferred models are handled: if a beta option within an EPA preferred model is used in a regulatory application, the model's preferred status changes to that of an alternative model, and the application is subject to the Appendix W, Section 3.2.2 requirements.

Modeling for SO2 DRR Sources
- Round 3 designations are required by December 2017.
- All Region 4 states provided the required Modeling Protocols and monitor siting documentation by the July 1, 2016, due date. Thank you! (Some states in other Regions are still submitting the initial protocols that were due in July.)
- EPA R4 modeling staff have reviewed and provided comments on Modeling Protocols for 50 DRR facilities and are working with the states to address the issues identified in those comments. Some R4 states are providing revised protocols, while others are addressing comments more informally through emails and conference calls.
- Modeling reports/results are due by January 13, 2017.
DRR facilities by state:
- Alabama – 8 facilities
- Florida – 11 facilities
- Georgia – 5 facilities
- Kentucky – 10 facilities
- Mississippi – 3 facilities (1 shutting down)
- North Carolina – 5 facilities
- South Carolina – 5 facilities
- Tennessee – 3 facilities
Contact: Rick Gillam, EPA Region 4, 404-562-9049, gillam.rick@epa.gov

Monitoring for the SO2 DRR Sources
- Round 4 designations are required to be complete by December 2020.
- As part of the 2016 air monitoring network plans, EPA R4 has approved or expects to approve monitors in 7 areas where states are choosing ambient monitoring to characterize impacts by 2020 (Round 4 designations): 1 area in Alabama, 1 area in Georgia, 1 area in Kentucky (2 separate DRR sources located together, sharing 1 monitor), and 4 areas in North Carolina (possibly 5). R4 appreciates that states began working with us early to get these sites approved.
- EPA R4 reviewed the modeling used to site each of the SO2 DRR monitors. Each monitor was sited following the process described in the SO2 NAAQS Designations Source-Oriented Monitoring Technical Assistance Document.
- EPA R4 monitoring and modeling staff worked closely with staff from each state to identify appropriate monitoring locations, and R4 monitoring staff visited each proposed monitoring site with state staff.

NATTS Updates
- The final NATTS Technical Assistance Document (TAD) was sent to agencies on October 25th. EPA received 1,200 comments on revision 2 of the TAD; a workgroup including state, local, and tribal (SLT) stakeholders addressed the comments, and the plan is to disseminate the comment resolution details. Compliance is expected by the end of October 2017.
Updates include:
- VOCs – subambient vs. pressurized sampling; pressurized sampling is capped at 3 psig to limit humidity and condensation.
- Siting criteria – clarifies the difference between collocated and duplicate samples, specifies the distance between inlets relative to each other, and recommends an annual re-evaluation of siting criteria.
- Method Detection Limits (MDLs) – a simplified ("watered down") version of the Method Update Rule procedure; blanks need to be considered because interference plays a part, and examples for determining spiking levels are provided. Data seen below the MDL still need to be reported and qualified (this will be set up automatically in AQS); a non-detect is reported only when no value is seen.
- AQS guidance for reporting – a data reporting appendix to the TAD, with clarifications.
- Analyte identification guidance – for carbonyls, PAHs, and VOCs; clarifies the criteria and provides illustrations of acceptable identification.
- Quality systems guidance and requirements – strengthened coverage of QAPP elements, SOPs, corrective actions, document control, training documentation, and records retention.
- Equipment calibration – relaxed to require an initial calibration, recalibration when changes affect response, and recalibration when calibration checks fail; also defines calibration needs for support instruments.
- Validation tables – for all 4 methods (PAHs, VOCs, carbonyls, and metals), covering all aspects of sampling and analysis; will be provided as an Excel file so items may be sorted.

Network Plan Technical Requirements
- 40 CFR § 58.10(a)(1): "The plan shall include a statement of whether the operation of each monitor meets the requirements of appendices A, B, C, D, and E of this part, where applicable."
- A recent EPA Inspector General report covering Region 6 found that "the annual plans did not provide evidence that each monitoring site met regulatory siting criteria," and that EPA "could improve its review process to better ensure that annual plans are more complete and accurate, to provide reasonable assurance that monitors are located in representative areas and are operated in accordance with EPA requirements."

Network Plan Public Inspection and Comment
- The recent Monitoring Rule revision modified the network plan public inspection and comment requirements (40 CFR § 58.10(a)(1)):
- The annual monitoring network plan must be made available for public inspection and comment for at least 30 days prior to submission to EPA, and
- the submitted plan shall include and address, as appropriate, any comments received.

Evidence of Meeting QA Requirements in Network Plans
- Region 4 asks agencies to include a list of their QA documents and the approval dates in their annual network plans.
- QA requirements are found in Appendices A (NAAQS) and B (PSD) of 40 CFR Part 58.

Evidence of Meeting Siting Criteria in Network Plans
- In the past, agencies have stated in their plans that all of their sites meet siting criteria, but recent Technical Systems Audits (TSAs) have found siting issues.
- Region 4 is now requiring that agencies provide a minimal amount of information to verify that sites meet these criteria.
(Slide photo caption: "This tape measure is not NIST certified... probably.")

Evidence of Meeting Siting Criteria in Network Plans
- Photos facing N, S, E, and W. At a minimum, photos should be updated for each 5-year network assessment; ideally, they should be updated each year.
- Date of the last site evaluation. Ideally, agencies should evaluate sites once a year.
- Include in the plan: probe height, distance to the nearest obstructions, and corrective actions planned or taken to correct deficiencies. (A minimal example of how this information could be recorded and screened is sketched below.)
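As a rough illustration of how the siting information above could be recorded and screened before it goes into a network plan, here is a minimal sketch. The probe-height bounds and the obstruction rule of thumb used here are assumptions for illustration only; the governing criteria are in 40 CFR Part 58, Appendix E and vary by pollutant and spatial scale.

```python
from dataclasses import dataclass
from datetime import date

# Illustrative thresholds only -- the governing values are in
# 40 CFR Part 58, Appendix E and vary by pollutant and scale.
MIN_PROBE_HEIGHT_M = 2.0    # assumed lower bound for this sketch
MAX_PROBE_HEIGHT_M = 15.0   # assumed upper bound for this sketch

@dataclass
class SiteEvaluation:
    site_id: str
    last_evaluated: date
    photos_updated: date
    probe_height_m: float
    nearest_obstruction_m: float      # horizontal distance to nearest obstruction
    obstruction_above_probe_m: float  # height the obstruction protrudes above the probe

def siting_issues(ev: SiteEvaluation) -> list[str]:
    """Return a list of potential siting deficiencies to flag in the network plan."""
    issues = []
    if not (MIN_PROBE_HEIGHT_M <= ev.probe_height_m <= MAX_PROBE_HEIGHT_M):
        issues.append(f"probe height {ev.probe_height_m} m outside assumed range")
    # Rule of thumb used here: the obstruction should be at least twice as far away
    # as it protrudes above the probe (verify against Appendix E for the pollutant).
    if ev.nearest_obstruction_m < 2 * ev.obstruction_above_probe_m:
        issues.append("obstruction closer than twice its height above the probe")
    if (date.today() - ev.last_evaluated).days > 365:
        issues.append("site not evaluated within the past year")
    return issues
```

A record like this, kept per site, makes it straightforward to list in the annual plan which sites meet the criteria and which have corrective actions planned.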

Questions?

CitySpace Air Sensor Research Project in Memphis, Tennessee
Participants and collaborators: EPA Regions 4, 6, and 7; EPA Office of Research and Development (ORD); Memphis and Shelby County Health Department; Mississippi Department of Environmental Quality; Arkansas Department of Environmental Quality; Memphis Area Transit Authority; University of Memphis.
Objectives:
- Field-test new, lower-cost PM sensors in the Memphis area.
- Understand how this emerging technology can add valuable information about air pollution patterns in neighborhoods.
Fact sheet: https://www.epa.gov/air-research/cityspace-air-sensor-network-project-conducted-test-new-monitoring-capabilities

CitySpace Air Sensor Research Project in Memphis, Tennessee
- Sensor pods were installed at 16 sites in October 2016 and will continue monitoring until February 2017; two additional sites will be installed in November.
- Each sensor pod continuously measures PM in various size increments, plus wind speed, wind direction, temperature, and humidity.
- Monitoring locations were selected with input from the local community and by using mapping tools developed by EPA's Sustainable and Healthy Communities research program.
- Several monitors are collocated with regulatory PM2.5 monitors.
(Map caption: blue dots are sites that have been installed; grey dots are sites that will be installed by the end of November.)

Community Air Sensor Network (CAIRSENSE) Project Overview
Participants: EPA Regions 4, 1, 5, 7, and 8; EPA Office of Research and Development (ORD); EPA Office of Air Quality Planning and Standards (OAQPS); Georgia Environmental Protection Division (EPD); Colorado Department of Public Health and the Environment; Jacobs Technology (ORD contract support).
Objectives:
1. Evaluate in situ the long-term comparability of several lower-cost sensors of interest against regulatory monitors.
2. Determine the capabilities and limitations of a long-term, multi-node wireless sensor network applied to community air monitoring, in terms of operational stability (communications, power) and long-term data quality under ambient conditions.
Research findings are available at: https://www.epa.gov/air-sensor-toolbox/air-sensor-toolbox-resources-and-funding#RTF
Some takeaways from the CAIRSENSE results:
- There was a wide range of data quality from the different sensors tested: some sensors produced very useful data and some did not. In some cases there was also variability between different units of the same sensor model.
- It is important to build good QA practices (e.g., collocation with other sensors and with a reference monitor) into any future sensor applications.
- Sensors may potentially be useful for a wide range of applications, such as providing community- or individual-level pollution data; however, many sensors have not yet been evaluated to determine their reliability and accuracy.
The CAIRSENSE project is a collaboration among multiple EPA Regions, ORD, and OAQPS to understand the field performance and utility of next-generation ambient air quality measurement instrumentation, and is part of EPA's Regional Methods Program.

CAIRSENSE: Correlation Matrix (Pearson) of 12-hr Average PM between Sensors and Collocated FEM
- Moderate to high correlation between most identical sensor units; variable correlation with the reference FEM (r = -0.06 to 0.68).
[Figure: Pearson correlation matrix of 12-hr average PM among the SAFT and WSN sensor units (Egg Dust, Shinyei, Dylos, Airbeam, MetOne) and the collocated FEM PM2.5 monitor; correlations between individual sensors and the FEM are highlighted within a purple rectangle.]
- After the sensor field deployment, readings from multiple sensors reporting the same pollutant of interest were compared against readings recorded by the regulatory NCore instruments. For duplicate or triplicate sensors evaluated in SAFT, readings were also compared between or among sensors to understand the reproducibility of the signal from different units of the same sensor type.
- This slide presents selected results for ambient PM, O3, and NO2 concentration measurements, representing approximately 9 months of continuous field data collection (August 2014 to May 2015). Correlation matrices are presented for 12-hr average PM and for hourly O3 and NO2 readings measuring the same pollutant of interest.
- The correlation plots show comparisons both between sensors measuring the same pollutant and against the South DeKalb federal equivalent method (FEM) monitor. The plots represent the correlation between all pairs of data by shape (ellipses), color, and numeric value: the ellipses are visual representations of the scatter plot, a perfect positive correlation appears as a red 1:1 line with a numerical value of 100 (Pearson r = 1.0), no correlation appears as a yellow circle with a value close to zero, and a negative value represents a negative relationship between the paired variables.
- Generally, sensors from the same manufacturer demonstrate much higher between-sensor correlations (e.g., r > 0.9) than when paired with sensors from other manufacturers.
- The performance of the PM sensors is widely variable, with the correlation coefficient between individual sensors and the collocated FEM instrument ranging from -0.06 to 0.68 for 12-hr averaged concentrations.
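As an illustration of how this kind of comparison can be reproduced with collocated data, the sketch below resamples raw sensor and FEM time series to 12-hr averages and computes a Pearson correlation matrix with pandas. This is not the CAIRSENSE analysis code; the file name, the column names (e.g., fem_pm25), and the 75% completeness screen are assumptions made for illustration.

```python
import pandas as pd

# Assumed input: a CSV of timestamped readings with one column per instrument,
# e.g. "fem_pm25", "sensor_a_1", "sensor_a_2", "sensor_b_1" (hypothetical names).
df = pd.read_csv("collocated_pm_readings.csv", parse_dates=["timestamp"])
df = df.set_index("timestamp")

# Average to 12-hour blocks, requiring at least 75% data completeness per block
# (an illustrative completeness screen, not a CAIRSENSE requirement).
counts = df.resample("12h").count()
means = df.resample("12h").mean()
expected = counts.max().max()  # rough proxy for a fully populated 12-hr block
means = means.where(counts >= 0.75 * expected)

# Pearson correlation matrix between every pair of instruments,
# analogous to the matrix shown on the slide.
corr = means.corr(method="pearson")
print(corr.round(2))

# Correlation of each sensor against the reference FEM column only.
print(corr["fem_pm25"].drop("fem_pm25").sort_values(ascending=False))
```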

CAIRSENSE: Correlation Matrix of Hourly Average O3 between Sensors and Collocated FEM
- Strong correlation between identical sensor units; variable correlation with the reference FEM (r = 0.15 to 0.95).
[Figure: Pearson correlation matrix of hourly average O3 among the FEM O3 monitor and the Aeroqual, AQMesh, and CairClip sensor units; the CairClip readings are shown with the FEM NO2 contribution subtracted.]
- The correlation coefficients between individual O3 sensors and the FEM vary from 0.15 to 0.95 for hourly data.
- For sensors such as the Cairpol CairClip that report combined O3/NO2 readings, the individual pollutant concentrations were separated from each other by subtracting the reference FEM data.
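To make the combined-signal separation and the hourly comparison concrete, here is a minimal sketch, not the project's actual analysis code. The CSV layout and column names (cairclip_o3_no2, fem_no2, fem_o3, and the Aeroqual/AQMesh columns) are hypothetical placeholders, and both series are assumed to be in the same units.

```python
import pandas as pd

# Hypothetical columns; the slide describes subtracting reference FEM NO2
# from a combined O3+NO2 sensor signal to estimate the sensor's O3 reading.
df = pd.read_csv("collocated_o3_readings.csv", parse_dates=["timestamp"])
hourly = df.set_index("timestamp").resample("1h").mean()

# Estimate sensor O3 by removing the FEM NO2 contribution from the combined signal
# (assumes both series are reported in the same units, e.g. ppb).
hourly["cairclip_o3_est"] = hourly["cairclip_o3_no2"] - hourly["fem_no2"]

# Hourly Pearson correlation of each O3 estimate against the FEM O3 monitor.
o3_cols = ["aeroqual_1", "aeroqual_2", "aqmesh_1", "cairclip_o3_est"]
corr_vs_fem = hourly[o3_cols].corrwith(hourly["fem_o3"], method="pearson")
print(corr_vs_fem.round(2))
```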

(Presenter note: The following slides cover topics that I expect Chet to address; if he does not cover all of them, we can include these slides. Otherwise, we can leave them out.)

Proposed Revisions to Appendix W Guidelines
- EPA proposed revisions to Appendix W on July 14, 2015; a public hearing was held and the comment period closed October 27, 2015.
- Final revisions are expected in November 2016 and are currently in the OMB review process.
- EPA's OAQPS and the Model Clearinghouse are responsible for the development and proposal of all preferred models or techniques per Appendix W, Section 3.1.
Major proposed changes include:
- Codifying the long-standing process of the Regional Offices consulting and coordinating with the Model Clearinghouse on all approvals of alternative models and techniques.
- For long-range air quality assessments, removing CALPUFF as a preferred model and recommending its use as a screening technique, along with other Lagrangian models, for addressing PSD increment beyond 50 km from a new or modifying source.
- Incorporating current modeling techniques to address the secondary chemical formation of fine particle and ozone pollution from direct, single-source emissions of the pollutants that form them, such as sulfur dioxide, oxides of nitrogen, and volatile organic compounds.
- Providing more flexibility and improving the meteorological inputs used for regulatory modeling.
Comments received:
- Most commenters were supportive of the revisions pertaining to the Model Clearinghouse, especially considering that the revisions only solidify an existing process in clear and transparent regulatory text.
- The most common comment expressed concern that the expanded Model Clearinghouse role would potentially delay alternative model approvals and subsequently extend the permitting process.
Contact George Bridgers at OAQPS for questions about the Appendix W proposal.

PM2.5 and Ozone Significant Impact Level (SIL) Guidance
- Recommends PM2.5 and ozone SILs and provides a stronger technical basis for SILs.
- Draft ozone and PM2.5 SIL guidance was posted online on August 1, 2016, and updated on August 8, 2016.
- The guidance and supporting documents (including the technical basis document) were made available for informal review and comment through September 30, 2016.
- EPA plans to issue final guidance by the end of 2016.
- Additional information is available at: https://www.epa.gov/nsr/webinar-draft-sils-guidance-august-24-2016
- Please contact Jennifer Shaltanis at OAQPS with any questions, at 919-541-2580 or shaltanis.jennifer@epa.gov.

Model Emission Rates for Precursors (MERPs) Guidance
- The SILs guidance will be complemented by the development of MERPs guidance (NOx and VOC for ozone; NOx and SO2 for PM2.5).
- MERPs will be used as a Tier 1 "screening tool" to determine whether modeling is required, and they are intertwined with the SILs.
- EPA has switched to near-term guidance for a quicker response, instead of a rulemaking as originally planned.
- Note: there will not be a single national number for MERPs; the guidance will provide a recommended procedure for developing regional, state-specific MERPs.
- Draft MERPs guidance (for review and comment) is expected soon, with final MERPs guidance expected shortly after the SIL guidance.