Network Assessment by Station Rankings: Description of Methodology. Network Assessment Technical Support Group, June 2001.


Background on AQ Network Assessment
(Note to Rich Scheffe: this would need your touch.)
Monitoring of ambient concentrations provides the necessary sensory input to the various aspects of AQ management. EPA OAQPS has initiated a program to make the existing AQ monitoring networks more responsive to the needs of AQ management. Efforts are under way to implement new monitoring systems (PAMS, PM2.5 mass, PM2.5 composition). At the same time, the performance of the existing ozone and PM monitoring sites is being re-assessed for possible re-location or elimination.

Monitoring Network Evaluation: Multiple Criteria
(Note: this should be trimmed and streamlined.)
AQ monitoring networks need to support multiple aspects of air quality management, including risk assessment, compliance monitoring, and tracking the effectiveness of control measures. These multiple purposes may require very different network designs. For example, health risk characterization requires sampling of the most harmful species over populated areas during episodes, whereas tracking of emission-concentration changes requires broad regional sampling to establish pollutant budgets. In general, an AQ monitoring network is characterized by the spatial distribution of its sampling stations, its temporal sampling pattern, and the species measured. The current methodology focuses on evaluating the geographic features of the network; the temporal and species aspects of network evaluation are left for future work. The methodology is based on multiple relative rankings of the individual existing stations.

Network Layout: Uniform or Clustered?
The existing monitoring networks for O3, PM2.5, and weather parameters reflect very different strategies. The ozone monitoring network is highly clustered around populated areas (top); evidently, the O3 regulatory network is laid out to focus on areas 'where the people are'. The recently established FRM PM2.5 network is less clustered (center). The automated surface weather observing system (ASOS), on the other hand, is distributed uniformly in space for broad spatial coverage (bottom). Clearly, the layout of each network is tailored to a different purpose.

Network Evaluation Combining Subjective and Objective Steps
The proposed network evaluation methodology combines subjective and objective methods. The hybrid procedure is outlined below.
1. First, select multiple evaluation criteria, e.g. risk assessment, compliance monitoring, trend tracking. This is a subjective step driven by the network objectives.
2. Decide on specific measures that can represent each criterion, e.g. the number of persons in the sampling zone of each station, concentration, etc. The selection of suitable measures is also somewhat subjective.
3. Calculate the numeric value of each measure for each station in the network. This can be performed objectively, using well-defined, transparent algorithmic procedures.
4. Rank the stations according to each measure. This yields a separate rank value for each measure; for example, a station may be ranked 5 by daily-max O3 and 255 by persons in the sampling zone. This step can also be performed objectively.
5. Weight the rankings, i.e. set the relative importance of the various measures. This involves comparing 'apples and oranges' and is clearly subjective.
6. Add the weighted rankings to derive the overall importance of each station, and rank the stations by this aggregate measure. Use the aggregate ranking to guide decisions on network modifications.
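A minimal sketch of steps 3-6 in Python, assuming the per-station measure values have already been computed. The station names, measure values, and weights are illustrative assumptions, not values from this assessment:

```python
# Sketch of steps 3-6: rank stations by each measure, then combine the
# per-measure ranks with subjective weights into one aggregate ranking.
# All station names, values, and weights here are hypothetical.

def rank_stations(values, descending=True):
    """Return a rank per station (1 = most important) for one measure."""
    ordered = sorted(values, key=values.get, reverse=descending)
    return {station: i + 1 for i, station in enumerate(ordered)}

# Hypothetical per-station measure values (one dict per measure).
measures = {
    "day_max_o3":   {"A": 95.0, "B": 81.0, "C": 74.0},     # ppb
    "persons_zone": {"A": 1.2e6, "B": 3.5e5, "C": 4.0e4},  # persons
}

# Step 5: subjective weights expressing the relative importance of measures.
weights = {"day_max_o3": 0.5, "persons_zone": 0.5}

# Step 4: one ranking per measure.
ranks = {m: rank_stations(v) for m, v in measures.items()}

# Step 6: weighted sum of ranks; the lowest score is the most valuable station.
aggregate = {s: sum(weights[m] * ranks[m][s] for m in measures)
             for s in measures["day_max_o3"]}

for station, score in sorted(aggregate.items(), key=lambda kv: kv[1]):
    print(station, score)
```

Because rank 1 denotes the most important station for a given measure, a lower weighted-rank sum corresponds to a more valuable station overall.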

Network Evaluation Using Five Independent Measures
The approach is illustrated with the ozone network using five independent measures. The five measures represent the information needs of (1) risk assessment, (2) compliance monitoring, and (3) tracking. The methodology allows easy incorporation of additional measures. These are all measures of the network's benefits; other benefit measures (temporal, species) should also be incorporated. For a cost-benefit analysis, the cost of operating the network would also need to be incorporated.

AQ Management Activity                   | Geographic Info. Need    | Station Measure
Risk assessment                          | Pollutant concentration  | 4th highest O3
Risk assessment                          | Persons in sampling zone | Persons/Station
Compliance evaluation                    | Conc. vicinity to NAAQS  | Deviation from NAAQS
Reg./local source attribution & tracking | Spatial coverage         | Area of sampling zone
All of the above                         | Estimation uncertainty   | Measured vs. estimated difference

The Independent Measures of Network Performance
In this assessment, five independent measures are used to evaluate the performance of the AQ monitoring network. As the Network Assessment progresses, additional measures will be incorporated. Further details about the five measures can be found in the separate presentations pertaining to each measure.
- Pollutant concentration is a measure of the health risk. According to the NAAQS, the relevant statistic is the 4th highest daily max concentration over 3 years. The station with the highest 4th-highest daily max value is ranked #1.
- Persons/Station measures the number of people in the 'sampling zone' of each station. Using this measure, the station with the largest population in its zone is ranked #1. Note: estimating the health risk requires both the population and the concentration in the sampling zone.
- Deviation from NAAQS measures the station's value for compliance evaluation. The stations are ranked according to the absolute difference between the station value and the NAAQS (85 ppb). The highest ranking goes to the station whose concentration is closest to the standard (smallest deviation); stations well above or well below the standard are ranked low.
- Spatial coverage measures the geographic surface area each station covers. The highest ranking goes to the station with the largest area in its sampling zone. This measure assigns high relative value to remote regional sites and low value to clustered urban sites with small sampling zones.
- Estimation uncertainty measures the ability to estimate the concentration at a station location using data from all other stations. The station with the largest deviation between the actual and estimated values (i.e. the largest estimation uncertainty) is ranked #1. In other words, stations whose values can be estimated accurately from other data are ranked (valued) low.
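To make two of these measures concrete, the sketch below computes the deviation-from-NAAQS and estimation-uncertainty rankings for a handful of invented stations. The inverse-distance, leave-one-out estimator is only an assumed stand-in for whatever spatial estimator the assessment actually uses:

```python
import math

NAAQS_O3 = 85.0  # ppb, as used in these slides

# Hypothetical stations: (longitude, latitude, 4th-highest daily-max O3 in ppb)
stations = {
    "A": (-90.2, 38.6, 91.0),
    "B": (-87.6, 41.9, 84.0),
    "C": (-84.4, 33.7, 78.0),
    "D": (-80.2, 25.8, 70.0),
}

# Deviation from NAAQS: the station closest to the standard is ranked #1,
# because it is the most informative for compliance decisions.
deviation = {s: abs(c - NAAQS_O3) for s, (_, _, c) in stations.items()}
dev_rank = {s: i + 1 for i, s in enumerate(sorted(deviation, key=deviation.get))}

def idw_estimate(x0, y0, others, power=2.0):
    """Inverse-distance-weighted estimate at (x0, y0) from the other stations."""
    num = den = 0.0
    for x, y, c in others:
        w = 1.0 / (math.hypot(x - x0, y - y0) ** power)
        num += w * c
        den += w
    return num / den

# Estimation uncertainty: leave each station out, estimate its value from the
# rest, and rank by |measured - estimated| (largest difference ranked #1).
uncertainty = {}
for s, (x, y, c) in stations.items():
    others = [v for k, v in stations.items() if k != s]
    uncertainty[s] = abs(c - idw_estimate(x, y, others))

unc_rank = {s: i + 1 for i, s in
            enumerate(sorted(uncertainty, key=uncertainty.get, reverse=True))}

print("Deviation-from-NAAQS ranks:", dev_rank)
print("Estimation-uncertainty ranks:", unc_rank)
```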

Ranking of Stations
In the following pages, the ozone network is evaluated using the five independent measures. The results are shown on maps with a consistent representation scale for all five measures. For each measure, the stations are ranked by importance, and the upper quartile (above the 75th percentile) and lower quartile (below the 25th percentile) of the stations are highlighted. The focus is on stations with a low 'value' ranking, i.e. candidate stations for elimination. Subsequently, the rankings are aggregated, with subjective weighting, to yield the overall evaluation.
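A small sketch of the quartile highlighting, assuming the station ranks (aggregate or per-measure) are already computed; the ranks below are invented for illustration:

```python
# Flag stations in the upper quartile of value (lowest rank numbers) and the
# lower quartile (highest rank numbers, i.e. candidates for elimination).
# The ranks below are hypothetical.
import statistics

ranks = {"A": 12, "B": 187, "C": 45, "D": 230, "E": 98, "F": 160, "G": 20, "H": 205}

q1, _, q3 = statistics.quantiles(ranks.values(), n=4)  # 25th, 50th, 75th percentiles

for station, r in sorted(ranks.items(), key=lambda kv: kv[1]):
    if r <= q1:
        flag = "upper quartile (high value)"
    elif r >= q3:
        flag = "lower quartile (candidate for elimination)"
    else:
        flag = ""
    print(f"{station}  rank {r:3d}  {flag}")
```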

Ranking by Daily Max Concentration
The daily max concentration is a factor in health risk. The stations with the highest O3 levels (red) are located over the NE Megalopolis, the Ohio River Valley, and the urban centers of the Southeast: Dallas, Houston, and Atlanta. The stations with the lowest O3 levels (blue) are located throughout the remainder of the Eastern US (EUS). Contiguous regions of low O3 are found in Florida, the Upper Midwest, and the inland part of the Northeast. From an O3 exposure perspective, the blue stations are ranked lowest.

Aggregate Ranking of Stations
Aggregating the rankings is not simply a matter of weighting them, since it involves subjective judgments. However, once the relative weights of the different rankings are available (from the negotiation process), the current methodology allows their incorporation into the assessment. The following pages illustrate several aggregate station rankings.

Aggregate Ranking – Equal Weight
All five measures are weighted equally at 20% each. High 'aggregate value' stations (red) are located over both urban and rural segments of the central EUS. Low 'value' sites (blue) are interspersed with high-value sites. Clusters of low-value sites are found over Florida, the Upper Midwest, and the inland portion of New England.

Aggregate Ranking – Focus on 'Risk Assessment'
This weighting adds extra weight for population. The high 'aggregate value' stations (red) are distributed throughout the urban and non-urban areas of the EUS. The low 'value' stations (blue) are clustered over Chicago, inland New England, Florida, New Orleans, and St. Louis.
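The two aggregations above differ only in the weight vector applied to the per-measure ranks. The exact weights are not stated in the slides, so both the weight vectors and the small rank table below are assumptions for illustration:

```python
# Compare an equal-weight aggregation with a 'risk assessment' weighting that
# gives extra weight to population. Weights and ranks here are hypothetical.

per_measure_ranks = {                    # 1 = most important for that measure
    "day_max_o3":             {"A": 1, "B": 3, "C": 2},
    "persons_zone":           {"A": 2, "B": 1, "C": 3},
    "naaqs_deviation":        {"A": 3, "B": 1, "C": 2},
    "area_zone":              {"A": 3, "B": 2, "C": 1},
    "estimation_uncertainty": {"A": 2, "B": 3, "C": 1},
}

equal_weights = dict.fromkeys(per_measure_ranks, 0.20)   # 20% each
risk_weights = {"day_max_o3": 0.25, "persons_zone": 0.35,
                "naaqs_deviation": 0.15, "area_zone": 0.10,
                "estimation_uncertainty": 0.15}

def aggregate(ranks, weights):
    """Weighted sum of per-measure ranks; lower totals mean higher station value."""
    stations = next(iter(ranks.values()))
    return {s: sum(weights[m] * ranks[m][s] for m in ranks) for s in stations}

print("equal weights:", aggregate(per_measure_ranks, equal_weights))
print("risk focus:   ", aggregate(per_measure_ranks, risk_weights))
```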

Summary of the Network Evaluation Methodology:
1. Establish the network evaluation criteria using subjective judgment.
2. Calculate objective measures for each criterion, for each site, using existing data.
3. Using uniform standards across the network, rank the stations according to each criterion based on these objective measures.
4. Subjectively weight the rankings to establish the aggregate ranking.