Assessment of the Speciated PM Network (Initial Draft, November 2004). Washington University, St. Louis; CIRA/NPS VIEWS Team.


1 Assessment of the Speciated PM Network (Initial Draft, November 2004). Washington University, St. Louis; CIRA/NPS VIEWS Team

2 Contents
Background and Approach
–Aerosol Speciation and the NCore Network
–Aerosol Species in IMPROVE and SPEC Networks
–Technical Approach to the Network Assessment
Temporal Data Coverage
–Long-Term Trend Data: Fine Mass, SO4, K (1988-03)
–Recent Trends (1999-03): Fine Mass, Sulfate, Nitrate, OC (IMPROVE), OC_NIOSH (SPEC), EC (IMPROVE), EC_NIOSH (SPEC)
Spatial Data Coverage
–Total Data Stock Maps: Fine Mass, SO4, NO3, OCf, OCf_NIOSH, EUf
–Fine Mass, Sulfate, Nitrate, Europium
Information Value of Stations: Error for SO4
–Estimation Error: Single Day
–Estimation Error: Long-term Average
Temporal and Spatial SO4 Characterization
Assessment Summary

3 Aerosol Speciation and the NCore Network
NCore is intended to characterize pollutant patterns for many applications
Speciation provides the data for aerosol source apportionment
In NCore, 'core species' are measured at Level-1 and Level-2 sites
Currently (2003), there are more than 350 speciation sites
The challenge is to assess the evolution and status of the speciation network

4 Technical Approach to the Network Assessment
This draft Network Assessment is a collaboration of the CAPITA and CIRA groups
CIRA has created an integrated speciation database as part of the RPO VIEWS project
CAPITA has applied the analysis tools of DataFed to the VIEWS database
The results of the assessment analysis are presented (this PPT) to the NCore team
Guided by the evaluation and feedback from NCore, the assessment is revised
[Diagram: speciated data flow and processing. IMPROVE and EPA SPEC data feed the CIRA/VIEWS database (CIRA tools and processes), which feeds the CAPITA/DataFed database (DataFed tools and processes) and analysis tools, producing the Network Assessment PPT for the EPA NCore process, with evaluation and feedback returned]

5 Aerosol Species Monitored (Snapshot for Aug 19, 2003)
IMPROVE monitors over 30 species
EPA monitors over 40 species
About 25 species are reported by both
[Chart: data counts by species for IMPROVE, SPECIATION, and IMP + SPEC]

6 Long-Term Monitoring: Fine Mass, SO4, K
Long-term speciated monitoring began in 1988 with the IMPROVE network
Starting in 2000, the IMPROVE and EPA networks have expanded
By 2003, the IMPROVE + EPA species were sampled at 350 sites
In 2003, the FRM/IMPROVE PM2.5 network was reporting data from over 1200 sites
[Charts: station counts for Fine Mass, Sulfate, Potassium]

7 Fine Mass, Sulfate, Nitrate Monitoring (1999-03)
The daily valid station count for sulfate has increased from 50 to 350
About 250 sites sample every 3rd day; 350 sites sample every 6th day
[Charts: daily station counts for Fine Mass, Sulfate, Nitrate, marking the every-3rd-day and every-6th-day sampling levels]
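The every-3rd-day and every-6th-day cadences can be sketched numerically. This is a minimal illustration; the split of 250 vs. 100 sites between the two schedules and the assumption that the schedules are aligned on day 0 are hypothetical simplifications, not from the source.

```python
def daily_valid_count(day_index, sites_3day=250, sites_6day=100):
    """Valid station count on a given day, assuming an every-3rd-day
    schedule and an every-6th-day schedule that coincide on day 0.

    Site counts are illustrative, loosely based on the slide: on
    coincident days all ~350 sites report; on intermediate 3rd days
    only the ~250 every-3rd-day sites report; other days report none.
    """
    count = 0
    if day_index % 3 == 0:
        count += sites_3day
    if day_index % 6 == 0:
        count += sites_6day
    return count
```

Under these assumptions, `daily_valid_count(0)` gives 350, `daily_valid_count(3)` gives 250, and off-schedule days give 0, reproducing the alternating station-count pattern the slide describes.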

8 OC (IMPROVE), OC_NIOSH (SPEC), 1999-2003
Organic carbon measurements at the IMPROVE (OCf) and EPA (OCf_NIOSH) sites are not fully compatible
Since 2000, IMPROVE OCf monitoring has increased from 50 to 150 sites
Since 2001, the EPA network has grown from 20 to over 200 sites
(Need to separate STN?? Which are the STN site codes?)
[Charts: IMPROVE OCf and EPA OCf_NIOSH station counts]

9 EC (IMPROVE), EC_NIOSH (SPEC), 1999-2003
Same pattern as organic carbon … redundant??
[Charts: IMPROVE ECf and EPA ECf_NIOSH station counts]

10 Data Stock: Fine Mass, SO4
The data stock is the accumulated data resource
The IMPROVE sites that began in 1988 have over 1500 samples (red circles)
The more recent EPA sites have 400 or fewer samples
[Maps: IMPROVE + EPA Fine Mass and Sulfate data stock]
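The data-stock bookkeeping above amounts to tallying accumulated samples per site and bucketing sites by that tally. A minimal sketch, with the 1500/400 cut-offs taken from the slide and the function and label names purely illustrative:

```python
from collections import Counter

def data_stock(records):
    """Accumulated sample count per site from (site_id, sample_date) records."""
    return Counter(site_id for site_id, _ in records)

def classify_sites(stock, long_term=1500, recent=400):
    """Bucket sites by data stock: long-term sites (>1500 samples, the
    red circles on the maps) vs. recent sites (<=400 samples)."""
    labels = {}
    for site_id, n in stock.items():
        if n > long_term:
            labels[site_id] = "long-term"
        elif n <= recent:
            labels[site_id] = "recent"
        else:
            labels[site_id] = "intermediate"
    return labels
```

A 1988-vintage IMPROVE site sampling every 3rd day accumulates well over 1500 records by 2003, so it lands in the "long-term" bucket, while a site started in 2001 stays in "recent".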

11 Data Stock: OCf, OCf_NIOSH, NO3, EUf
The IMPROVE data stock for OCf and NO3f is over 1500 samples; for EPA OCf_NIOSH it is under 400
Europium is only measured at the EPA sites, with ~400 samples or fewer
[Maps: IMPROVE OCf, EPA OCf_NIOSH, IMPROVE/EPA NO3f, and EPA Europium data stock]

12 Evolution of Spatial Data Coverage: Fine Mass, 1998-2003
Before 1998, IMPROVE provided much of the PM2.5 data (non-FRM EPA PM2.5 not included here)
In the 1990s, the mid-section of the US was not covered
By 2003, the PM2.5 sites (1200+) covered most of the US
[Maps: annual coverage, 1998-2003]

13 Evolution of Spatial Data Coverage: Fine Sulfate, 1998-2003
Before 1998, IMPROVE provided much of the PM2.5 sulfate data
In the 1990s, the mid-section of the US was not covered
By 2003, the IMPROVE and EPA sulfate sites (350+) covered most of the US
[Maps: annual coverage, 1998-2003]

14 Evolution of Spatial Data Coverage: Nitrate, 1998-2003
Same pattern as sulfate
[Maps: annual coverage, 1998-2003]

15 Evolution of Spatial Data Coverage: Europium, 1998-2003
Europium and many other trace elements are only measured in the EPA network
Starting in 2001, the EPA network expanded to over 200 sites

16 Site Information Value
The information value of a site can be measured by how much it reduces the uncertainty of concentration estimates
If the concentration at a site can be estimated accurately from auxiliary data (estimation error is low), then the information content of that site is low
If the estimate is poor (estimation error is high), then the information content of that site is high, since having that site reduces the concentration uncertainty
Thus, estimation error is a measure of the information content of a specific site
Estimation Error Calculation
Cross-validation estimates the information value of individual stations by
–removing each monitor, one at a time
–estimating the concentration at the removed monitor location by the 'best available' spatial extrapolation scheme
–calculating the error as the difference (Estimated - Measured) or ratio (Estimated/Measured)
The 'best available' estimated concentration is calculated using
–de-clustering by the method of S. Falke
–interpolation using 1/r^4 weighting
–the nearest 10 stations within a 600 km search radius
In the following examples, the error is estimated for two extreme cases:
–for a single day with significant spatial gradients (Aug 19, 2003)
–for the grand average concentration field with its smooth pattern
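The cross-validation procedure above can be sketched as leave-one-out estimation with inverse-distance (1/r^4) weighting over the nearest 10 stations within 600 km. This is a simplified illustration assuming planar coordinates in km; the de-clustering step and the geographic projection used in the original analysis are omitted, and all function names are my own:

```python
import math

def idw_loo_estimate(sites, target, power=4, k=10, radius_km=600.0):
    """Estimate the concentration at sites[target] from its neighbors,
    using 1/r**power inverse-distance weighting over the nearest k
    stations within the search radius (the target site is excluded).

    sites: list of (x_km, y_km, concentration) tuples.
    Returns None if no station lies within the search radius.
    """
    x0, y0, _ = sites[target]
    neighbors = []
    for i, (x, y, conc) in enumerate(sites):
        if i == target:
            continue
        d = max(math.hypot(x - x0, y - y0), 1e-9)  # guard co-located sites
        if d <= radius_km:
            neighbors.append((d, conc))
    neighbors.sort()              # nearest stations first
    neighbors = neighbors[:k]
    if not neighbors:
        return None
    weights = [1.0 / d ** power for d, _ in neighbors]
    return sum(w * c for w, (_, c) in zip(weights, neighbors)) / sum(weights)

def cross_validate(sites):
    """Remove each monitor in turn and report the error at the removed
    location as (Estimated - Measured, Estimated / Measured)."""
    errors = []
    for i, (_, _, measured) in enumerate(sites):
        est = idw_loo_estimate(sites, i)
        if est is not None:
            errors.append((est - measured, est / measured))
    return errors
```

On a spatially uniform field the leave-one-out estimates reproduce the measurements exactly (difference 0, ratio 1), so nonzero errors isolate the information each individual site contributes, which is the point of the slide.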

17 SO4 Estimation Error: Single Day (Aug 19, 2003)
The measured concentration contour plot is shown for all data
The estimated contour uses only values estimated from neighboring sites
The difference map indicates +/- 5 ug/m3 estimation errors (need to check)
The ratio map shows the error to be over 50%
[Maps: Measured, Estimated, Difference, Ratio]

18 Estimation Error: Long-term Average (2000-2003)
The long-term average concentration pattern is smoother than for a single day
The concentration error difference is below +/- 1 ug/m3 for most stations in the East
The error/measured ratio is below 20% except at a few (<5) sites
In the West, the errors are higher due to varying site elevation and topographic barriers
[Maps: Measured, Estimated, Difference, Ratio]

19 Average SO4: Estimated vs. Measured
The Estimated-Measured correlation at Eastern sites is r^2 ~ 0.8
For the Western US sites the correlation is only r^2 ~ 0.5
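The r^2 values quoted above are squared Pearson correlations between the estimated and measured long-term averages. A minimal, dependency-free sketch of that statistic:

```python
def r_squared(estimated, measured):
    """Squared Pearson correlation between two equal-length sequences."""
    n = len(measured)
    mean_e = sum(estimated) / n
    mean_m = sum(measured) / n
    cov = sum((e - mean_e) * (m - mean_m)
              for e, m in zip(estimated, measured))
    var_e = sum((e - mean_e) ** 2 for e in estimated)
    var_m = sum((m - mean_m) ** 2 for m in measured)
    return cov * cov / (var_e * var_m)
```

A perfectly linear estimated-vs-measured relationship gives r^2 = 1; the slide's ~0.8 (East) vs. ~0.5 (West) contrast reflects the rougher Western concentration field noted on the previous slide.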

20 Temporal and Spatial SO4 Characterization
For SO4, the temporal coverage is every 3 or 6 days
The typical 'transport distance' between 3-day samples (at 5 m/sec wind speed) is about 1500 km
On the other hand, the characteristic distance between sites is about 160 km (total area 9x10^6 km^2, 350 sites)
Thus, the spatial sampling provides at least 10-fold more 'characterization' than the every-3rd-day temporal sampling
[Maps: 1-day and 3-day transport distances]
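The order-of-magnitude comparison above can be checked directly. The sketch below assumes straight-line transport at a constant 5 m/s and sites spread evenly over the stated area; under these simplified assumptions the 3-day transport distance comes out near 1300 km, consistent with the slide's rough figure:

```python
# Transport distance covered between successive every-3rd-day samples
wind_m_per_s = 5.0
seconds_per_day = 86_400
transport_km = wind_m_per_s * 3 * seconds_per_day / 1000.0   # ~1300 km

# Characteristic inter-site spacing: one site per (area / N) cell
area_km2 = 9.0e6     # monitoring domain area, from the slide
n_sites = 350
spacing_km = (area_km2 / n_sites) ** 0.5                     # ~160 km

# Spatial sampling resolves roughly an order of magnitude more
# structure than the every-3rd-day temporal sampling does
characterization_ratio = transport_km / spacing_km
```

With these round numbers the ratio is about 8; with the slide's ~1500 km transport figure it is closer to 10, matching the "at least 10-fold" claim in spirit.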

21 Speciation Network Assessment Summary (Initial Draft, November 2004)
Since 2000, speciated aerosol monitoring has grown from 50 to 350 sites
IMPROVE and EPA sites have accumulated 1500 and 400 data points, respectively
By 2003, the spatial coverage for speciated sampling was high throughout the US
For long-term SO4 averages, the estimation error over the East was below 1 ug/m^3
For a specific day with a strong SO4 gradient, the error was up to 5 ug/m^3
The 350 sites provide at least 10-fold more 'characterization' than the every-3rd-day sampling

22 AIRNOW PM25 - ASOS RH-Corrected Bext
[Maps: AIRNOW PM2.5 and ASOS RH-corrected Bext, July 21-23, 2004]

23 Quebec Smoke, July 7, 2002
[Maps: satellite optical depth and surface ASOS RH-corrected Bext]

24 A note to the NCore implementation managers: From Ad Hoc to Continuous Network Assessment
By design, NCore will change in response to evolving conditions
Nudging NCore toward desired goals requires assessment and feedback
FRM mass and speciation monitoring is now ready to be 'monitored'
The indicators can be calculated from the VIEWS integrated database
Many assessment tools (maps, charts, cross-validation) are developed or feasible
… so, it may be time to consider … Automated Network Assessment as a routine part of monitoring

25 Continuous Speciated Network Assessment: A Feasibility Pilot Project
Currently, network assessments are done intermittently with ad hoc tools
Network status and trend monitoring is now possible with web-based tools
A 'pilot feasibility project' could aid the design of operational network assessment
Such automatic feedback would contribute to the agility of the monitoring network
[Diagram: the monitoring networks (IMPROVE, EPA SPEC) feed the CIRA/VIEWS and CAPITA/DataFed databases and analysis tools, producing an automatic assessment web tool and Network Assessment PPT that, together with many other factors, drive network adjustment]

26 Data Life Cycle: Acquisition Phase to Usage Phase
A 'force' is needed to move data from one-shot to reusable form
–External force: contracts
–Internal force: humanitarian motives, benefits

27 The Researcher/Analyst's Challenge
"The researcher cannot get access to the data; if he can, he cannot read them; if he can read them, he does not know how good they are; and if he finds them good he cannot merge them with other data."
Information Technology and the Conduct of Research: The User's View, National Academy Press, 1989

28 Data Flow Resistances
The data flow process is hampered by a number of resistances:
–The user does not know what data are available
–The available data are poorly described (metadata)
–There is a lack of QA/QC information
–Incompatible data cannot be combined and fused
These resistances can be overcome through a distributed system that catalogs and standardizes the data, allowing easy access for data manipulation and analysis.

