Assessment of the Speciated PM Network (Initial Draft, November 2004) Washington University, St. Louis CIRA/NPS VIEWS Team

Contents
Background and Approach
–Aerosol Speciation and the NCore Network
–Aerosol Species in IMPROVE and SPEC Networks
–Technical Approach to the Network Assessment
Temporal Data Coverage
–Long-Term Trend Data: Fine Mass, SO4, K
–Recent Trends: Fine Mass, Sulfate, Nitrate, OC (IMPROVE), OC_NIOSH (SPEC), EC (IMPROVE), EC_NIOSH (SPEC)
Spatial Data Coverage
–Total Data Stock Maps: Fine Mass, SO4, NO3, OCf, OCf_NIOSH, EUf
–Fine Mass, Sulfate, Nitrate, Europium
Information Value of Stations: Estimation Error for SO4
–Estimation Error: Single Day
–Estimation Error: Long-Term Average
Temporal and Spatial SO4 Characterization
Assessment Summary

Aerosol Speciation and the NCore Network
NCore is designed to characterize the pollutant pattern for many applications.
Speciation provides the data for aerosol source apportionment.
In NCore, 'core species' are measured at Level-2 and Level-1 sites.
Currently (2003), there are more than 350 speciation sites.
The challenge is to assess the evolution and status of the speciation network.

Technical Approach to the Network Assessment
This draft network assessment is a collaboration between the CAPITA and CIRA groups.
CIRA has created an integrated speciation database as part of the RPO VIEWS project.
CAPITA has applied the analysis tools of DataFed to the VIEWS database.
The results of the assessment analysis are presented (in this PPT) to the NCore team.
Guided by the evaluation and feedback from NCore, the assessment is revised.
[Diagram: Speciated data flow and processing. IMPROVE and EPA SPEC data feed the CIRA/VIEWS database and then the CAPITA/DataFed database; CIRA, DataFed, and analysis tools and processes produce the Network Assessment PPT, with evaluation and feedback from the EPA NCore process.]

Aerosol Species Monitored: Snapshot for Aug 19, 2003
IMPROVE monitors over 30 species.
EPA monitors over 40 species.
About 25 species are reported by both networks.
[Chart: data count per species for IMPROVE, SPECIATION, and IMP + SPEC]
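The species-count comparison above can be reproduced mechanically from an integrated species table. A minimal sketch, assuming a hypothetical long-format table of one day's valid records; the column names are illustrative, not the actual VIEWS schema:

```python
# Sketch: count the species reported by each network on one day and
# their overlap. The tiny table below is a stand-in for real data.
import pandas as pd

records = pd.DataFrame({
    "network": ["IMPROVE", "IMPROVE", "SPEC", "SPEC", "SPEC"],
    "species": ["SO4f", "OCf", "SO4f", "NO3f", "EUf"],
})

improve = set(records.loc[records["network"] == "IMPROVE", "species"])
spec = set(records.loc[records["network"] == "SPEC", "species"])

print(len(improve))         # slide: over 30 IMPROVE species
print(len(spec))            # slide: over 40 EPA/SPEC species
print(len(improve & spec))  # slide: about 25 species in common
```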

Long-Term Monitoring: Fine Mass, SO4, K
Long-term speciated monitoring began in 1988 with the IMPROVE network.
Starting in 2000, the IMPROVE and EPA networks have expanded.
By 2003, the IMPROVE + EPA species were sampled at 350 sites.
In 2003, the FRM/IMPROVE PM2.5 network was reporting data from over 1200 sites.
[Charts: station counts over time for Fine Mass, Sulfate, Potassium]

Fine Mass, Sulfate, Nitrate Monitoring
The daily valid station count for sulfate has increased from 50 to 350.
About 250 sites sample every 3rd day, 350 sites every 6th day.
[Charts: daily station counts for Fine Mass, Sulfate, Nitrate, with every-3rd-day and every-6th-day sampling indicated]
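The every-3rd-day versus every-6th-day cadence can be read off the gaps between consecutive valid samples at each site. A minimal sketch, again with a hypothetical long-format table (date, site, value):

```python
# Sketch: daily valid-station counts and per-site sampling cadence,
# using a toy long-format table of SO4 samples.
import pandas as pd

so4 = pd.DataFrame({
    "date": pd.to_datetime(["2003-08-13", "2003-08-16", "2003-08-19",
                            "2003-08-13", "2003-08-19"]),
    "site": ["A", "A", "A", "B", "B"],
    "value": [4.1, 5.0, 6.2, 2.3, 2.9],
})

# Number of sites reporting a valid value on each day
daily_counts = so4.dropna(subset=["value"]).groupby("date")["site"].nunique()

# Median gap in days between consecutive samples: ~3 or ~6 per site
cadence = (so4.sort_values("date")
              .groupby("site")["date"].diff()
              .dt.days.groupby(so4["site"]).median())
print(daily_counts, cadence, sep="\n")
```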

OC (IMPROVE), OC_NIOSH (SPEC)
Organic carbon measurements at the IMPROVE (OCf) and EPA (OCf_NIOSH) sites are not fully compatible.
Since 2000, IMPROVE OCf monitoring has increased from 50 to 150 sites.
Since 2001, the EPA network has grown from 20 to over 200 sites.
(Need to separate STN?? Which are the STN site codes?)
[Charts: IMPROVE OCf and EPA OCf_NIOSH station counts]

EC (IMPROVE), EC_NIOSH (SPEC)
Same pattern as for organic carbon. (… redundant??)
[Charts: IMPROVE ECf and EPA ECf_NIOSH station counts]

Data Stock: Fine Mass, SO4
The data stock is the accumulated data resource.
The IMPROVE sites that began in 1988 have over 1500 samples (red circles).
The more recent EPA sites have 400 or fewer samples.
[Maps: IMPROVE + EPA Fine Mass; IMPROVE + EPA Sulfate]

Data Stock: OCf, OCf_NIOSH, NO3, EUf
The IMPROVE data stock for OCf and NO3f is over 1500 samples; for EPA OCf_NIOSH it is under 400.
Europium is measured only at the EPA sites, with ~400 samples or fewer.
[Maps: IMPROVE OCf; EPA OCf_NIOSH; IMPROVE/EPA NO3f; EPA Europium]
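The data-stock maps above reduce to a per-site, per-species count of accumulated valid samples. A minimal sketch with the same hypothetical long-format layout:

```python
# Sketch: accumulated sample counts ("data stock") per site and species.
import pandas as pd

df = pd.DataFrame({
    "site":    ["A", "A", "A", "B", "B"],
    "species": ["OCf", "OCf", "NO3f", "OCf_NIOSH", "EUf"],
    "value":   [3.2, 2.8, 1.1, 4.0, 0.01],
})

stock = (df.dropna(subset=["value"])
           .groupby(["site", "species"])["value"]
           .count()
           .rename("n_samples"))
print(stock)  # long-running IMPROVE sites: >1500; recent EPA sites: <400
```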

Evolution of Spatial Data Coverage: Fine Mass
Before 1998, IMPROVE provided much of the PM2.5 data (non-FRM EPA PM2.5 is not included here).
In the 1990s, the mid-section of the US was not covered.
By 2003, the PM2.5 sites (1200+) covered most of the US.

Evolution of Spatial Data Coverage: Fine Sulfate
Before 1998, IMPROVE provided much of the PM2.5 sulfate data.
In the 1990s, the mid-section of the US was not covered.
By 2003, the IMPROVE and EPA sulfate sites (350+) covered most of the US.

Evolution of Spatial Data Coverage: Nitrate
Same pattern as for sulfate.

Evolution of Spatial Data Coverage: Europium
Europium and many other trace elements are measured only in the EPA network.
Starting in 2001, the EPA network expanded to over 200 sites.

Site Information Value
The information value of a site can be measured by how much it reduces the uncertainty of concentration estimates.
If the concentration at a site can be estimated accurately from neighboring data (the estimation error is low), then the information content of that site is low.
If the estimate is poor (the estimation error is high), then the information content of that site is high, since having that site reduces the concentration uncertainty.
Thus, the estimation error is a measure of the information content of a specific site.

Estimation Error Calculation
Cross-validation estimates the information value of individual stations by:
–removing each monitor, one at a time
–estimating the concentration at the removed monitor's location with the 'best available' spatial interpolation scheme
–calculating the error as a difference (Estimated - Measured) or a ratio (Estimated/Measured)
The 'best available' estimated concentration is calculated using:
–de-clustering by the method of S. Falke
–interpolation with 1/r^4 weighting (sketched in code below)
–the nearest 10 stations within a 600 km search radius
In the following slides, the error is estimated for two extreme cases:
–a single day with significant spatial gradients (Aug 19, 2003)
–the grand-average concentration field, with its smooth pattern
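A minimal sketch of the leave-one-out loop described above, with plain 1/r^4 inverse-distance weighting over the nearest 10 stations within 600 km. The de-clustering step (the S. Falke method) is omitted, so this illustrates the procedure rather than reproducing the exact scheme used:

```python
# Sketch: leave-one-out estimation error. Each station is removed in
# turn and its concentration is estimated from its neighbors.
import numpy as np

def loo_errors(xy_km, conc, k=10, radius_km=600.0, power=4):
    """xy_km: (n, 2) station coordinates in km; conc: (n,) measurements.
    Returns (estimated - measured, estimated / measured)."""
    n = len(conc)
    est = np.full(n, np.nan)
    for i in range(n):
        d = np.hypot(*(xy_km - xy_km[i]).T)  # distance to every station
        d[i] = np.inf                        # drop the station under test
        near = np.argsort(d)[:k]             # nearest k neighbors...
        near = near[d[near] <= radius_km]    # ...within the search radius
        if near.size:
            w = d[near] ** -power            # 1/r^4 weights
            est[i] = w @ conc[near] / w.sum()
    return est - conc, est / conc

# Toy usage with random station locations and concentrations
rng = np.random.default_rng(0)
xy = rng.uniform(0, 1000, size=(50, 2))
c = rng.uniform(1, 10, size=50)
diff, ratio = loo_errors(xy, c)
```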

SO4 Estimation Error: Single Day (Aug 19, 2003)
The measured concentration contour plot is shown for all data.
The estimated contour uses only values estimated from neighboring sites.
The difference map indicates estimation errors of up to +/- 5 ug/m3 (need to check).
The ratio map shows the error to be over 50%.
[Maps: Measured, Estimated, Difference, Ratio]

Estimation Error: Long-Term Average
The long-term average concentration pattern is smoother than that of a single day.
The concentration error (difference) is below +/- 1 ug/m3 for most stations in the East.
The error/measured ratio is below 20% except at a few (<5) sites.
In the West, the errors are higher due to varying site elevations and topographic barriers.
[Maps: Measured, Estimated, Difference, Ratio]

Average SO4: Estimated vs. Measured
The estimated-measured correlation at Eastern sites is r^2 ~ 0.8.
For the Western US sites the correlation is only r^2 ~ 0.5.
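The East/West split is a simple squared correlation over two regional subsets. A sketch, using 100 degrees W as an assumed dividing line (the slide does not state one):

```python
# Sketch: r^2 between estimated and measured averages, East vs. West.
import numpy as np

def r2(measured, estimated):
    r = np.corrcoef(measured, estimated)[0, 1]
    return r * r

# Toy inputs; in practice these come from the cross-validation above.
lon = np.array([-80.0, -85.0, -75.0, -110.0, -115.0, -120.0])
meas = np.array([5.1, 4.2, 6.0, 1.1, 0.9, 1.4])
est = np.array([4.8, 4.5, 5.7, 1.6, 0.7, 1.0])

east = lon > -100.0  # assumed East/West dividing line
print(r2(meas[east], est[east]), r2(meas[~east], est[~east]))
```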

Temporal and Spatial SO4 Characterization
For SO4, the temporal coverage is every 3 or 6 days.
The typical 'transport distance' between 3-day samples (at 5 m/s wind speed) is about 1500 km.
On the other hand, the characteristic distance between sites is about 160 km (total area 9x10^6 km^2, 350 sites).
Thus, the spatial sampling provides at least 10-fold finer 'characterization' than the every-3rd-day temporal sampling, as the worked check below illustrates.
[Maps: 1-day and 3-day transport distances]
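A worked check of the two length scales quoted above:

```python
# Worked check of the characteristic distances quoted above.
wind_mps = 5.0
transport_3day_km = wind_mps * 3 * 86400 / 1000.0
print(transport_3day_km)   # ~1300 km (the slide rounds this to ~1500 km)

area_km2, n_sites = 9e6, 350
spacing_km = (area_km2 / n_sites) ** 0.5
print(spacing_km)          # ~160 km between sites

print(transport_3day_km / spacing_km)  # roughly an order of magnitude,
                                       # the '10-fold' of the text
```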

Speciation Network Assessment Summary (Initial Draft, November 2004)
Since 2000, speciated aerosol monitoring has grown from 50 to 350 sites.
IMPROVE and EPA sites have accumulated 1500 and 400 data points, respectively.
By 2003, the spatial coverage of speciated sampling was high throughout the US.
For long-term SO4 averages, the estimation error over the East was below 1 ug/m3.
For a specific day with strong SO4 gradients, the error was up to 5 ug/m3.
The 350 sites provide at least 10-fold more 'characterization' than the every-3rd-day sampling.

AIRNOW PM2.5 vs. ASOS RH-Corrected Bext
[Maps: AIRNOW PM2.5 and ASOS RH-corrected Bext for July 21, 22, and 23, 2004]

Quebec Smoke, July 7, 2002
[Maps: satellite optical depth and surface ASOS RH-corrected Bext]

A Note to the NCore Implementation Managers: From Ad Hoc to Continuous Network Assessment
By design, NCore will change in response to evolving conditions.
Nudging NCore toward desired goals requires assessment and feedback.
FRM mass and speciation monitoring is now ready to be 'monitored'.
The indicators can be calculated from the VIEWS integrated database.
Many assessment tools (maps, charts, cross-validation) are developed or feasible.
… so, it may be time to consider automated network assessment as a routine part of monitoring.

Continuous Speciated Network Assessment: A Feasibility Pilot Project
Currently, network assessments are done intermittently with ad hoc tools.
Network status and trend monitoring is now possible with web-based tools.
A 'pilot feasibility project' could aid the design of operational network assessment.
Such automatic feedback would contribute to the agility of the monitoring network.
[Diagram: the monitoring networks (IMPROVE, EPA SPEC) feed the CIRA/VIEWS and CAPITA/DataFed databases; CIRA, DataFed, and analysis tools and processes drive an automatic assessment web tool and the network assessment PPT, leading to network adjustment along with many other factors.]

Data Life Cycle: Acquisition Phase to Usage Phase
A 'force' is needed to move data from one-shot to reusable form:
–external force: contracts
–internal force: humanitarian motives, benefits

The Researcher/Analyst's Challenge
"The researcher cannot get access to the data; if he can, he cannot read them; if he can read them, he does not know how good they are; and if he finds them good he cannot merge them with other data."
Information Technology and the Conduct of Research: The Users View, National Academy Press, 1989

Data Flow Resistances
The data flow process is hampered by a number of resistances:
–The user does not know what data are available.
–The available data are poorly described (metadata).
–There is a lack of QA/QC information.
–Incompatible data cannot be combined and fused.
These resistances can be overcome through a distributed system that catalogs and standardizes the data, allowing easy access for data manipulation and analysis.