
1 National Air Quality Forecasting Capability: Nationwide Prediction and Future Enhancements
Ivanka Stajner 1,5, Daewon Byun 2,†, Pius Lee 2, Jeff McQueen 3, Roland Draxler 2, Phil Dickerson 4, Kyle Wedmark 1,5
1 National Oceanic and Atmospheric Administration (NOAA), National Weather Service; 2 NOAA, Air Resources Laboratory (ARL); 3 NOAA, National Centers for Environmental Prediction (NCEP); 4 Environmental Protection Agency (EPA); 5 Noblis, Inc. († Deceased)
2011 National Air Quality Conference, San Diego, CA, March 9, 2011

2 Overview
– Background
– Recent progress: ozone, smoke, dust, PM2.5
– Feedback and outreach
– Summary and plans

3 National Air Quality Forecast Capability: Current and Planned Capabilities, 3/11
Goals:
– Improving the basis for AQ alerts
– Providing AQ information for people at risk
Prediction capabilities:
– Operations: ozone nationwide, expanded from EUS to CONUS (9/07), AK (9/10), and HI (9/10); smoke nationwide, implemented over CONUS (3/07), AK (9/09), and HI (2/10)
– Experimental testing: ozone upgrades; dust predictions
– Developmental testing: ozone upgrades; components for particulate matter (PM) forecasts
Near-term targets: higher-resolution prediction
Longer range: quantitative PM2.5 prediction; extend air quality forecast range to 48-72 hours; include a broader range of significant pollutants
[Map: implementation timeline. 2005: O3 (EUS); 2007: O3 and smoke (CONUS); 2009: smoke (AK); 2010: ozone (AK, HI) and smoke (HI)]

4 National Air Quality Forecast Capability: End-to-End Operational Capability
Model components: linked numerical prediction system, operationally integrated on NCEP's supercomputer:
– NCEP mesoscale NWP: WRF-NMM
– NOAA/EPA community model for AQ: CMAQ
– NOAA HYSPLIT model for smoke prediction
Observational input:
– NWS weather observations; NESDIS fire locations
– EPA emissions inventory/monitoring data
Gridded forecast guidance products:
– On NWS servers (www.weather.gov/aq and ftp servers) and on EPA servers
– Updated 2x daily
Verification basis, near-real time:
– Ground-level AIRNow observations
– Satellite smoke observations
Customer outreach/feedback:
– State and local AQ forecasters, coordinated with EPA
– Public and private sector AQ constituents

5 Progress in 2010: Ozone and Smoke Operational Nationwide; Dust Testing
Ozone: expanded forecast guidance to Alaska and Hawaii domains in NWS operations (9/10)
– Operations, 2010: updates for CONUS (emissions); new 1-hour and 8-hour daily maximum products
– Developmental testing: changing boundary conditions, dry deposition, and PBL in the CB05 system
Smoke: expanded forecast guidance to Hawaii domain in NWS operations (2/10)
– Operations: CONUS predictions operational since 2007, AK predictions since 2009
– Developmental testing: improvements to verification
Aerosols: developmental testing providing a comprehensive dataset for diagnostic evaluations (CONUS)
– CMAQ (aerosol option): testing CB05 chemical mechanism with AERO-4 aerosol modules; qualitative; summertime underprediction consistent with missing source inputs
– Dust and smoke inputs: testing dust contributions to PM2.5 from global sources; real-time testing of combining smoke inputs with CMAQ-aerosol
– Testing experimental prediction of dust from CONUS sources
– Developing a prototype for assimilation of surface PM2.5 measurements
– R&D efforts continuing in chemical data assimilation, real-time emissions sources, and advanced chemical mechanisms

6 OZONE: Operational predictions at http://www.weather.gov/aq/
[Maps: 1-hr average ozone; 8-hr average ozone]

7 Near-real-time Ozone Verification
Fraction correct with respect to the 75 ppb (code orange) threshold for operational prediction (using the CMAQ CB-IV mechanism) over the 48 contiguous states. The model is compared with observations from ground-level monitors compiled and provided by EPA's AIRNow program.
[Plots: fraction correct for 2009 and 2010; skill target: fraction correct ≥ 0.9]
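The fraction-correct score used on this slide can be sketched as a simple categorical comparison: each forecast/observation pair counts as correct when both fall on the same side of the exceedance threshold. This is a minimal illustration, not NOAA's operational verification code; only the 75 ppb threshold comes from the slide.

```python
# Minimal sketch of a fraction-correct skill score for a categorical
# exceedance forecast. Forecast and observation are classified against the
# same threshold (75 ppb, the code orange level used on the slide).
def fraction_correct(forecasts, observations, threshold=75.0):
    """Share of forecast/observation pairs on the same side of the threshold."""
    pairs = list(zip(forecasts, observations))
    correct = sum((f >= threshold) == (o >= threshold) for f, o in pairs)
    return correct / len(pairs)
```

In this contingency-table view, a "correct" pair is either a hit (both exceed) or a correct negative (neither exceeds); a target of 0.9 means at least 90% of pairs agree categorically.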

8 Chemical Mechanism Sensitivity Analysis
Experimental ozone prediction (with the updated CB05 mechanism) shows larger biases than operational ozone prediction (CB-IV mechanism) in summertime over the Eastern US.
Candidate causes:
– Mechanism differences: ozone production; precursor budget
– Seasonal input differences: emissions; meteorological parameters
Sensitivity studies in progress:
– Chemical speciation
– Indicator reactions in box model studies (organic nitrate, NOx, PAN)
[Plot: seasonal ozone bias for CONUS in 2009, Operational (CB-IV) vs. Experimental (CB05)]

9 Impact of NMM-B Meteorology: August 10, 2010 Example
NMMB-CMAQ reduces overprediction of 8-hr maximum ozone in the coastal region of the northeastern US. Overprediction case highlighted by Michael Giegert, CT DEP.
[Maps: 8-hr maximum ozone (ppb), WRF/NMM-CMAQ vs. NMMB-CMAQ]

10 SMOKE: Operational predictions at http://www.weather.gov/aq/
[Maps: surface smoke; column smoke]

11 Smoke Prediction Example: Four Mile Canyon Fire, September 7-8, 2010
"The blaze broke out Monday morning in Four Mile Canyon northwest of Boulder and rapidly spread across roughly 1,400 hectares. Erratic wind gusts sometimes sent the fire in two directions at once."
"The 11-square-mile blaze had destroyed at least 92 structures and damaged at least eight others by Tuesday night, Boulder County sheriff's Cmdr. Rick Brough said."

12 DUST: Experimental predictions at http://www.weather.gov/aq-expr/
[Maps: surface dust; column dust]

13 CONUS Dust Predictions: Experimental Testing
Developmental testing of standalone prediction of airborne dust from dust storms:
– Wind-driven dust emitted where surface winds exceed thresholds over source regions
– Source regions with emission potential estimated from monthly MODIS Deep Blue climatology (2003-2006)
– HYSPLIT model for transport, dispersion, and deposition
– Updated source map in experimental testing (Draxler et al., JGR, 2010)
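The threshold-based emission idea above can be sketched as follows: a grid cell emits dust only where the surface wind (here expressed as friction velocity) exceeds a cell-specific threshold over a mapped source region. The quartic wind dependence is a common wind-erosion parameterization assumed here for illustration; the actual HYSPLIT dust scheme, its constants, and its source map (Draxler et al., JGR, 2010) may differ.

```python
# Hedged sketch of threshold-based dust emission for one grid cell.
# u_star: surface friction velocity (m/s); u_star_threshold: cell-specific
# threshold from the source map; source_potential: 0 outside source regions.
# The k * potential * u_star**4 form is an assumed illustrative
# parameterization, not the operational HYSPLIT formulation.
def dust_emission_flux(u_star, u_star_threshold, source_potential, k=1.0e-5):
    """Return an emission flux (arbitrary units); zero below threshold."""
    if source_potential <= 0.0 or u_star <= u_star_threshold:
        return 0.0  # outside source region or winds below the threshold
    return k * source_potential * u_star ** 4
```

The key behavior is the on/off threshold: no dust is lofted until winds over a mapped source region exceed the local threshold, after which emission grows steeply with wind speed.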

14 Dust Simulation Compared with IMPROVE Data
June-July 2007 average dust concentration: HYSPLIT-simulated air concentrations are compared with IMPROVE data in southern California and western Arizona (model simulations: contours; IMPROVE dust concentrations: numbers next to "+").
– Variability in model simulation; highest measured value in Phoenix
– Model and IMPROVE comparable in the 3-10 µg/m³ range
– Spotty pattern indicates underprediction; considering ways to improve representation of smaller dust emission sources (Draxler et al., JGR, 2010)

15 PM2.5: Developmental predictions
[Map: 1-hr PM2.5]

16 Developmental Predictions, Summer 2010: Aerosols over CONUS
From National Emissions Inventory (NEI) sources only:
– CMAQ: CB05 gases, AERO-4 aerosols
– Sea salt emissions and reactions
Seasonal bias: winter, overprediction; summer, underprediction
Testing of real-time wildfire smoke emissions in CMAQ
[Map: Summer 2010]

17 Assimilation of Surface PM Data
– Assimilation increases correlation between forecast and observed PM2.5 (control is the forecast without assimilation)
– Assimilation reduces bias and increases the equitable threat score (Pagowski et al., QJRMS, 2010)
– Exploring bias correction approaches, e.g. Djalalova et al., Atmospheric Environment, 2010
[Plots: bias ratio; equitable threat score (%)]
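The equitable threat score cited above is a standard categorical verification measure: the threat score corrected for hits expected by chance. The sketch below computes it from 2x2 contingency-table counts; it is illustrative and does not reproduce the thresholds or stratification used in the Pagowski et al. evaluation.

```python
# Equitable threat score (ETS) from a 2x2 contingency table of a
# categorical exceedance forecast: hits, misses, false alarms, and
# correct negatives. ETS is the threat score adjusted for random hits;
# 1.0 is perfect, 0.0 is no skill beyond chance.
def equitable_threat_score(hits, misses, false_alarms, correct_negatives):
    total = hits + misses + false_alarms + correct_negatives
    hits_random = (hits + misses) * (hits + false_alarms) / total
    denom = hits + misses + false_alarms - hits_random
    return (hits - hits_random) / denom if denom else 0.0
```

The bias ratio shown on the same slide is simpler: (hits + false_alarms) / (hits + misses), i.e. how often the model forecasts an exceedance relative to how often one is observed, with 1.0 meaning an unbiased event frequency.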

18 Partnering with AQ Forecasters
Focus group, state/local AQ forecasters:
– Participate in real-time developmental testing of new capabilities, e.g. aerosol predictions
– Provide feedback on reliability and utility of test products
– Emphasis on local episodes/case studies
– Regular meetings; working together with EPA's AIRNow and NOAA
– Feedback is essential for refining/improving coordination
This year's Air Awareness Week is May 2-6, 2011: http://www.epa.gov/airnow/airaware/

19 AQ Forecaster Feedback Examples
William Ryan, Penn State*:
– Significant, and continuing, reductions in NOx emissions on a regional scale since 2003 have lowered O3 concentrations
– The key relationship that powers statistical forecast models (O3 vs. Tmax) has weakened
– Forecasters must now rely on numerical forecast models, updated yearly with EGU emissions data, for forecast guidance
– NAQFC model guidance outperformed other forecast methods in 2010
– Precursor emissions and O3 concentrations have not been static since 2003 and are not likely to be in the near future
[Table: skill scores (2010) for code orange O3 threshold (≥ 76 ppbv, 8-hour)]
* From Bill Ryan's 2011 AMS presentation, "The Effect of Recent Regional Emissions Reductions on Air Quality Forecasting in the Mid-Atlantic"

20 Summary
US national AQ forecasting capability status:
– Ozone prediction nationwide (AK and HI since September 2010)
– Smoke prediction nationwide (HI since February 2010)
– Experimental testing of dust prediction from CONUS sources (since June 2010)
– Developmental testing of CMAQ aerosol predictions with NEI sources
Near-term plans for improvements and testing:
– Operational dust predictions over CONUS
– Development of a satellite product for verification of dust predictions
– Understanding/reduction of summertime ozone biases in the CB05 system
– Transition to meteorological inputs from the NMM-B regional weather model
– Development of higher-resolution predictions
Target: operational implementation of initial quantitative total PM2.5 forecasts for the northeastern US in 2015

21 Acknowledgments: AQF Implementation Team Members
Special thanks to Paula Davidson, OST chief scientist and former NAQFC Manager.
– NOAA/NWS/OST: Tim McClung, NAQFC Manager; Ken Carey, Kyle Wedmark, program support
– NOAA/OAR: Jim Meagher, NOAA AQ Matrix Manager
– NWS/OCWWS: Jannie Ferrell, outreach, feedback
– NWS/OPS/TOC: Cindy Cromwell, Bob Bunge, data communications
– NWS/OST/MDL: Jerry Gorline, Marc Saccucci, Tim Boyer, Dave Ruth, developmental verification, NDGD product development
– NESDIS/NCDC: Alan Hall, product archiving
– NWS/NCEP: Jeff McQueen, Youhua Tang, Marina Tsidulko, Jianping Huang, AQF model interface development, testing, and integration; *Sarah Lu, Ho-Chun Huang, global data assimilation and feedback testing; *Brad Ferrier, *Dan Johnson, *Eric Rogers, *Hui-Ya Chuang, WRF/NAM coordination; Geoff Manikin, smoke product testing and integration; Dan Starosta, Chris Magee, NCO transition and systems testing; Robert Kelly, Bob Bodner, Andrew Orrison, HPC coordination and AQF webdrawer
– NOAA/OAR/ARL: Pius Lee, Rick Saylor, Daniel Tong, Tianfeng Chai, Fantine Ngan, Yunsoo Choi, Hyun-Cheol Kim, Yunhee Kim, Mo Dan, CMAQ development, adaptation of AQ simulations for AQF; Roland Draxler, Glenn Rolph, Ariel Stein, HYSPLIT adaptations
– NESDIS/STAR: Shobha Kondragunta, Jian Zeng, smoke verification product development
– NESDIS/OSDPD: Matt Seybold, Mark Ruminski, HMS product integration with smoke forecast tool
– EPA/OAQPS partners: Chet Wayland, Phil Dickerson, Scott Jackson, Brad Johns, AIRNow development, coordination with NAQFC
* Guest contributors

22 Operational AQ Forecast Guidance: www.weather.gov/aq
Further information: www.nws.noaa.gov/ost/air_quality
Smoke products: CONUS implemented in March 2007; AK in September 2009; HI in February 2010
Ozone: CONUS expansion implemented in September 2007; AK and HI implemented in September 2010

23 Backup

24 Smoke Forecast Tool: Alaska Example
Real-time objective verification:
– Based on NOAA/NESDIS satellite imagery: GOES Aerosol Smoke Product; smoke from identified fires only
– Filtered for interference: clouds, surface reflectance, solar angle, other aerosol
– "Footprint" comparison: threshold concentration (1 µg/m³) for average smoke in the column; tracking threat scores, or figure-of-merit statistics: FMS = (Area_Pred ∩ Area_Obs) / (Area_Pred ∪ Area_Obs); target skill is 0.08
Very active fire season over Alaska in 2009: more than 2,950,000 acres burned.
Example, 7/13/09, 17-18Z: GOES smoke product confirms areal extent of peak concentrations; FMS = 35% for column-averaged smoke > 1 µg/m³.
[Maps: prediction vs. observation (July 2009), showing Area_Pred and Area_Obs]
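The figure-of-merit statistic on this slide is an intersection-over-union of the predicted and observed smoke footprints. A minimal sketch on boolean grid masks (True where column-averaged smoke exceeds the 1 µg/m³ threshold) follows; the flat-list representation is an assumption for illustration, not the operational gridded implementation.

```python
# Figure-of-merit in space (FMS): overlap of predicted and observed
# exceedance areas divided by their union, per the slide's formula.
# pred_mask and obs_mask are equal-length sequences of booleans, one entry
# per grid cell (True = column-averaged smoke above the threshold).
def figure_of_merit(pred_mask, obs_mask):
    """Return FMS in [0, 1]; 0.0 when neither footprint has any cells."""
    inter = sum(1 for p, o in zip(pred_mask, obs_mask) if p and o)
    union = sum(1 for p, o in zip(pred_mask, obs_mask) if p or o)
    return inter / union if union else 0.0
```

An FMS of 35%, as in the Alaska case, means the predicted and observed footprints overlap over about a third of their combined area, well above the 0.08 target skill.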

