VISTAS Meteorological Modeling
November 6, 2003 – National RPO Meeting, St. Louis, MO
Mike Abraczinskas, North Carolina Division of Air Quality

Contract with Baron Advanced Meteorological Systems (BAMS)
– Formerly known as MCNC
– Don Olerud, BAMS Technical Lead
– Contract initiated January 2003

Meteorological Modeling Goals
Phase I: Test the model to define the appropriate setup for our region
Investigate -> Model -> Evaluate -> Make decisions

Meteorological Modeling Goals – Phase I
Summary of recent and relevant MM5 sensitivity studies
– Draft delivered: January 2003
– Learn from what others have done (inter-RPO collaboration)
– Will serve as a starting point for VISTAS
Recommended set of sensitivity tests
– Draft delivered: January 2003
– Different physics options and inputs proposed for testing

Meteorological Modeling Goals – Phase I
Evaluation methodologies
– Draft delivered: January 2003; updated April 2003
– Assessing model performance:
– Is the conceptual understanding correct (placement and timing of features)?
– Are diurnal features adequately captured?
– Are clouds reasonably well modeled?
– Are precipitation fields reasonable?
– Do wind fields generally match observations?
– Do temperature and moisture fields match observations?
– The million-dollar question: do the meteorological fields produce acceptable air quality model results?

Meteorological Modeling Goals – Phase I
Evaluation products: spatial, spatial aloft, timeseries, sounding, spatial statistics, timeseries statistics, combination, timeseries statistics aloft, statistical tables, profiler, and cross-sensitivity products

Meteorological Modeling Goals
Phase I: Test the model to define the appropriate setup for our region
Investigate -> Model -> Evaluate -> Make decisions
– Which periods are we modeling?
– What is the geographical extent of testing?

Sensitivity episodes
– Episode 1: January 1–20, 2002
– Episode 2: July 13–27, 2001
– Episode 3: July 13–21, 1999
Choice of episode periods was based on:
– Availability of robust AQ databases
– Full AQ cycle (clean-dirty-clean)
– Availability of meteorological data
– Air quality and meteorological regime

[Figure: nested modeling domains – 36 km outer grid, 12 km inner grid]

Sensitivity Tests
– PX_ACM: Pleim-Xiu land-surface model, ACM PBL scheme (base case)
– NOAH_MRF: NOAH land-surface model, MRF PBL scheme
– Multi_Blkdr: multi-layer soil model, Blackadar PBL scheme
– NOAH_ETA-MY: NOAH land-surface model, Eta Mellor-Yamada PBL scheme
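For concreteness, a sketch of how these four runs map to MM5v3 namelist physics options. The slides name only the schemes; the ISOIL/IBLTYP values below follow standard MM5v3 conventions and are an assumption, not something stated in the presentation.

```python
# Hedged mapping of the four sensitivity runs to MM5v3 physics options.
# ISOIL selects the land-surface/soil scheme, IBLTYP the PBL scheme; the
# option numbers are assumed from standard MM5v3 namelist conventions.
SENSITIVITY_RUNS = {
    "PX_ACM":      {"ISOIL": 3, "IBLTYP": 7},  # Pleim-Xiu LSM, ACM (Pleim-Chang) PBL
    "NOAH_MRF":    {"ISOIL": 2, "IBLTYP": 5},  # NOAH LSM, MRF PBL
    "Multi_Blkdr": {"ISOIL": 1, "IBLTYP": 2},  # multi-layer soil, Blackadar PBL
    "NOAH_ETA-MY": {"ISOIL": 2, "IBLTYP": 4},  # NOAH LSM, Eta Mellor-Yamada PBL
}
```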

January 2002 – Episode 1
– PX_ACM case significantly cold-biased
– PX_ACM runs are continuous (i.e., soil moisture values from one modeling segment serve as initial conditions for the following segment)
– Significantly better results obtained by making each P-X run independent (PX_ACM2)
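A minimal sketch of the two segment-initialization strategies (hypothetical driver logic, not the actual BAMS run scripts):

```python
def run_mm5_segment(start_day, soil_state):
    """Stand-in for one MM5 modeling segment; returns (met output, end-of-segment soil state)."""
    return f"met_{start_day}", {"soil_moisture": soil_state["soil_moisture"] * 0.95}

def soil_state_from_analysis(start_day):
    """Stand-in for soil fields initialized fresh from a gridded analysis."""
    return {"soil_moisture": 0.30}

def run_episode(segment_starts, continuous):
    soil = soil_state_from_analysis(segment_starts[0])
    outputs = []
    for day in segment_starts:
        if not continuous:
            # PX_ACM2: reinitialize each segment's soil fields from analysis
            soil = soil_state_from_analysis(day)
        # PX_ACM: the end-of-segment soil state carries into the next segment
        met, soil = run_mm5_segment(day, soil)
        outputs.append(met)
    return outputs

run_episode(["jan01", "jan06", "jan11", "jan16"], continuous=True)   # PX_ACM behavior
run_episode(["jan01", "jan06", "jan11", "jan16"], continuous=False)  # PX_ACM2 behavior
```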

[Figure slides: temperature (T) comparison panels for the sensitivity runs]

1.5 m temperature stats – 12 km domain – all hours – Episode 1
[Table: Run | Bias | Abs. Error | IA, with rows for the PX_ACM and PX_ACM2 runs; numeric values not preserved in the transcript]
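The slides report bias, absolute error, and index of agreement (IA) without defining them; a minimal sketch of the conventional definitions, assuming the Willmott (1981) form of IA:

```python
import numpy as np

def bias(model, obs):
    """Mean bias: average of (model - observation)."""
    return np.mean(model - obs)

def abs_error(model, obs):
    """Mean absolute (gross) error."""
    return np.mean(np.abs(model - obs))

def index_of_agreement(model, obs):
    """Willmott (1981) index of agreement; ranges over [0, 1], 1 = perfect."""
    obar = np.mean(obs)
    num = np.sum((model - obs) ** 2)
    den = np.sum((np.abs(model - obar) + np.abs(obs - obar)) ** 2)
    return 1.0 - num / den

# Illustrative paired 1.5 m temperatures (K), not values from the study:
model = np.array([271.2, 272.8, 274.1, 273.0])
obs   = np.array([272.0, 273.5, 274.0, 273.8])
print(bias(model, obs), abs_error(model, obs), index_of_agreement(model, obs))
```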

[Figure slides: 1.5 m temperature; daytime and nighttime cloud fraction (CFRAC, alternate product) with difference maps; 24-h and daytime precipitation with difference maps; temperature]

PBL Heights – subjective observations
– NOAH_MRF by far the highest and smoothest; probably too high
– PX_ACM2 ~= Multi_Blkdr
– PX_ACM2 subject to some suppressed PBL heights (in areas) during the day
– Some of this may be real (over melting snow, or in the presence of clouds/precipitation)
– Lack of observations makes this nearly impossible to evaluate
– PX_ACM2 very low at night
– NOAH_ETA-MY lowest during the day

Time Series Statistics
3-panel plots:
– Bias, Error, Index of Agreement for T, q, cloud, wind speed, wind direction, RH
– Bias, Accuracy, Equitable Threat Score for precipitation (0.01, 0.05, 0.10, 0.25, 0.5, 1.0 in thresholds)
– Labels sometimes difficult to see, so colors remain consistent: px_acm(2) blue, noah_mrf red, multi_blkdr black, noah_eta-my purple
– Precipitation plots only available for “Full” regions
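The slides list the Equitable Threat Score for precipitation thresholds without giving its formula; a minimal sketch of the standard contingency-table definition (the exact conventions BAMS used are not shown):

```python
import numpy as np

def equitable_threat_score(model_pcp, obs_pcp, threshold):
    """ETS (Gilbert skill score) for exceeding a precipitation threshold.

    model_pcp, obs_pcp: paired accumulations (e.g., 24-h totals in inches).
    """
    fcst = model_pcp >= threshold
    seen = obs_pcp >= threshold
    hits = np.sum(fcst & seen)
    misses = np.sum(~fcst & seen)
    false_alarms = np.sum(fcst & ~seen)
    total = fcst.size
    # Hits expected by chance, given the forecast and observed frequencies
    hits_random = (hits + misses) * (hits + false_alarms) / total
    denom = hits + misses + false_alarms - hits_random
    return (hits - hits_random) / denom if denom else np.nan

# Illustrative 24-h totals at six stations, scored at the 0.25 in threshold:
model = np.array([0.00, 0.30, 0.10, 0.60, 0.02, 0.27])
obs   = np.array([0.00, 0.28, 0.26, 0.55, 0.00, 0.05])
print(equitable_threat_score(model, obs, 0.25))  # 0.2
```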

[Figure slides: time-series statistics plots — temperature (Episode 1), mixing ratio, wind speed, wind direction, cloud fraction (standard and alternate), relative humidity; temperature aloft at ~500 m, ~1600 m, and ~3400 m; moisture (Q) and wind direction (D) aloft; precipitation at the 0.01 in threshold]

Spatial Statistics
– Station-specific statistical ranking of px_acm, noah_mrf, multi_blkdr, noah_eta-my
– Best sensitivity displayed
– Hourly (composite UTC day) and total stats available
– PAVE date label is just a placeholder
– Bias, error, RMSE (total only)
– Warning: possibly little difference between “best” and “worst” sensitivity
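A sketch of what the station-specific ranking amounts to: at each station, pick the sensitivity run with the lowest error. The function and arrays below are illustrative, not the actual BAMS product code.

```python
import numpy as np

RUNS = ["px_acm", "noah_mrf", "multi_blkdr", "noah_eta-my"]

def best_run_per_station(errors):
    """errors: dict mapping run name -> 1-D array of per-station mean absolute
    errors (same station order in every array). Returns the lowest-error run
    at each station."""
    stacked = np.vstack([errors[r] for r in RUNS])  # shape (n_runs, n_stations)
    return [RUNS[i] for i in np.argmin(stacked, axis=0)]

# Made-up errors at three stations:
errors = {
    "px_acm":      np.array([1.1, 0.8, 1.4]),
    "noah_mrf":    np.array([1.3, 0.7, 1.2]),
    "multi_blkdr": np.array([1.2, 0.9, 1.3]),
    "noah_eta-my": np.array([1.0, 1.0, 1.5]),
}
print(best_run_per_station(errors))  # ['noah_eta-my', 'noah_mrf', 'noah_mrf']
```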

[Figure slides: station-ranking maps for temperature (T), mixing ratio (QV), wind speed (SPD), wind direction (DIR), vector wind (UV), cloud fraction (CLD2), and relative humidity (RH)]

Episode 1 summary
– PX_ACM2 seems best overall
– Winds best in NOAH_ETA-MY (but PX_ACM2 not bad)
– Mixing ratio best in NOAH_MRF
– RH/temperature best in PX_ACM2
– Significant differences in PBL heights (NOAH_MRF > PX_ACM2 > NOAH_ETA-MY)

Qualitative Analysis
– Uses only the time series statistics
– Based on overall trend and model performance
– Not based on any quantitative values, although bias and error trends are considered

Qualitative Analysis
[Table slides: qualitative performance rankings for Episode 1 (January 2002), Episode 2 (July 2001), and Episode 3 (July 1999), all on the VISTAS 12 km domain; table contents not preserved in the transcript]

[Figure slides: observed vs. modeled temperature soundings — Peachtree City, GA (00Z); Nashville, TN (00Z); Greensboro, NC (12Z); Tampa, FL (00Z)]

Conclusions
No definite winner… but PX_ACM probably “best” overall:
– No very poor statistical quantity
– PBL behavior a concern
– PX_ACM or PX_ACM2?
More air quality results likely needed before a “best and final” sensitivity is chosen:
– NOAH_ETA-MY likely to show significantly different air quality results due to different PBL behavior
– Wind performance a concern for NOAH_MRF
– Temperature/precipitation performance a concern for NOAH_ETA-MY

Meteorological, Emissions, and Air Quality Modeling Deliverables (draft 10/31/03)
– Aug 2003: Emissions inventory, base year 2002
– Dec 2003: Revised emissions inventory, base year 2002
– Jan 2004: Modeling protocol (EPA-approved)
– Jan 2004: Met, emissions, and AQ model testing, 3 episodes
– Mar 2004: Draft emissions inventory, 2018
– Mar 2004: CART analysis to select sensitivity episodes
– Apr 2004: DDM in CMAQ
– July 2004: Revised state emissions inventory, base year 2002
– Sept 2004: Annual base-year model runs
– Sept 2004: Revised emissions inventory, 2018
– Oct 2004: Sensitivity runs, episodes
– Oct–Dec 2004: Control strategy inventories
– Dec 2004: Annual run, 2018
– Jan 2005: Sensitivity runs, 2018 episodes
– Jan–Jun 2005: Control strategy runs, 2018
– Before Jun 2005: Other inventory (e.g., power plant turnover)
– After Jun 2005: Model runs (e.g., power plant turnover)
– July–Dec 2005: Observations, conclusions, recommendations
State regulatory activities:
– Jan–Mar 2004: Define BART sources
– June 2004 (optional): Identify BART controls

Contact information