Impact of Meteorological Inputs on Surface O3 Prediction. Jianping Huang, 9th CMAS Annual Conference, Oct. 12, 2010, Chapel Hill, NC.



Co-Authors: Jeff McQueen 1, Youhua Tang 1,2, Binbin Zhou 1,2, Marina Tsidulko 1,2, Ho-Chun Huang 1,2, Sarah Lu 1,2, Brad Ferrier 1,2, Bill Lapenta 1, Geoff DiMego 1 (1: NOAA/NCEP/EMC; 2: IMSG); Daewon Byun 3, Pius Lee 3, Daniel Tong 3,4 (3: NOAA/ARL; 4: ERT); Ivanka Stajner (NOAA/NWS/OST)

Motivation and objectives. Motivation: O3 is over-predicted, especially by CB05 and in coastal regions. Objectives: to evaluate the meteorological inputs and to reduce the O3 over-prediction.

Outline: National Air Quality Forecasting Capability; Current issue of O3 forecasting; Verification of meteorological inputs; Sensitivity of O3 prediction to cloud parameters; Summary

National Air Quality Forecasting Capability. Emission model: SMOKE (NEI, BEIS v3). Met model: WRF/NMM (NAM, 12 km/L60) - T, RH, wind, etc.; cloud and PBL re-calculated by PreMAQ. AQ model: CMAQ (12 km/L22) - Oper: CONUS (CB04), AK/HI (CB05/Aero-4); Exper/Dev: CONUS (CB05/Aero-4)

Current issue of O3 forecasting: 8-hr max O3 is significantly over-predicted in the NE coastal region as compared to AIRNOW. [Figure: 8-hr max O3 (ppb), experimental run, Aug 2010]

Current issue of O3 forecasting (cont.): daily 8-hr max O3 (experimental run) is over-predicted over CONUS. Time period: July 1 to August 31, 2010. [Figure: time series of obs, fcst, bias, and rmse of O3 (ppb) by date, 12 UTC cycle]
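The daily 8-hr max O3 metric used throughout these slides is the maximum of the running 8-hour means of the hourly concentrations. A minimal single-day sketch (function name illustrative; the operational metric also uses windows that extend into the next day):

```python
def daily_max_8hr(hourly_o3):
    """Daily maximum 8-hour average O3 (ppb) from 24 hourly values.

    hourly_o3: list of 24 hourly O3 concentrations for one day.
    Returns the maximum of the 17 possible 8-hour running means
    starting at hours 0..16 (single-day simplification).
    """
    windows = [sum(hourly_o3[h:h + 8]) / 8.0 for h in range(17)]
    return max(windows)

# Example: flat 40 ppb background with an afternoon plateau at 80 ppb
hourly = [40.0] * 10 + [80.0] * 8 + [40.0] * 6
print(daily_max_8hr(hourly))  # 80.0 (the window covering hours 10-17)
```

The verification time series on this slide compare this quantity, computed from the forecast, against the same quantity computed from AIRNOW hourly observations.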

Causes of O3 over-prediction: Emissions - NEI 2005; Meteorological inputs - wind, etc.; cloud, PBL height (re-diagnosed in PreMAQ); CMAQ - deposition velocity, etc.; CB05 mechanism; Lateral boundary condition - static

Verification tool and data. Verification tool: Forecast Verification System (FVS) - Grid2obs; Grid2grid; statistics (e.g., rmse, bias) and FHO (e.g., csi, ets, far). Met observational data - T, RH, wind: ANYSFC, ADPUPA, ONLYSF, VADWND; cloud: AFWA (global, 1° x 1°, 1-hr), CLAVR-x (global, 0.5° x 0.5°, 6-hr). O3 data - AIRNOW. Study period - O3 and met verification: Jul. 1 to Aug. 31, 2010; sensitivity testing: Aug. 5-31, 2010
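The continuous scores named above (bias and rmse of paired forecast/observation series) can be sketched as follows; the function name is illustrative, not part of FVS:

```python
import math

def bias_rmse(obs, fcst):
    """Mean bias (fcst - obs) and root-mean-square error for paired series."""
    diffs = [f - o for o, f in zip(obs, fcst)]
    bias = sum(diffs) / len(diffs)
    rmse = math.sqrt(sum(d * d for d in diffs) / len(diffs))
    return bias, rmse

# A forecast that is uniformly 5 ppb high: bias = rmse = 5
print(bias_rmse([50.0, 60.0, 70.0], [55.0, 65.0, 75.0]))  # (5.0, 5.0)
```

A uniform offset gives bias equal to rmse; scatter around the observations raises rmse above the absolute bias, which is why both are plotted on the verification slides.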

FVS statistics parameters. F.H.O. statistics variables: F = grid fraction of forecasted > threshold; O = grid fraction of observed > threshold; H = grid fraction of both forecasted and observed > threshold. In contingency-table counts, with a = hits, b = false alarms, c = misses, d = correct negatives: N = a+b+c+d, F = a+b, O = a+c, H = a. Basic statistics scores: Bias = F/O = (a+b)/(a+c); Critical Success Index CSI = H/(F+O-H) = a/(a+b+c); Probability of Detection POD = H/O = a/(a+c); False Alarm Ratio FAR = 1-H/F = b/(a+b). Thresholds for O3: > 55, 65, 75, 85, 105, 125, 150 ppb
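The contingency-table scores above can be sketched directly from the four counts; a minimal illustration (function name is not from FVS):

```python
def fho_scores(a, b, c, d):
    """Categorical scores from a 2x2 contingency table.

    a: forecast yes / observed yes (hits)
    b: forecast yes / observed no  (false alarms)
    c: forecast no  / observed yes (misses)
    d: forecast no  / observed no  (correct negatives)
    """
    n = a + b + c + d
    F = (a + b) / n  # fraction forecast above threshold
    O = (a + c) / n  # fraction observed above threshold
    H = a / n        # fraction both above threshold
    return {
        "bias": F / O,           # (a+b)/(a+c)
        "csi": H / (F + O - H),  # a/(a+b+c)
        "pod": H / O,            # a/(a+c)
        "far": 1 - H / F,        # b/(a+b)
    }

scores = fho_scores(a=40, b=10, c=20, d=30)
# bias = 50/60, csi = 40/70, pod = 40/60, far = 10/50
```

Note that d (correct negatives) cancels out of all four scores, which is why high thresholds with many quiet grid points rely on CSI rather than simple accuracy.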

Verification of met inputs. Domain: CONUS. [Figure: time series by date of temperature (T, °C) and relative humidity (RH, %): obs mean (black) vs. fcst mean (red), and rmse (black) and bias (red)]

Verification of met inputs (cont.). Domain: CONUS. [Figure: time series by date of wind speed (WS, m/s) and total cloud cover (TCLD, %): obs mean (black) vs. fcst mean (red), and rmse (black) and bias (red)]

How does cloud impact O3 prediction? Photolysis rate: J_cld = J_0 [1 + C_f (1.6 t_r cos(θ) − 1)] below cloud; J_cld = J_0 [1 + C_f α_i (1 − t_r) cos(θ)] above cloud, where J_0 is the clear-sky photolysis rate, C_f is cloud cover, θ is the solar zenith angle, α_i is a reaction-dependent coefficient, and t_r is cloud transmissivity, which is a function of cloud water content and cloud thickness. Cloud parameterization in PreMAQ - Cloud cover: Geleyn et al. (1982) (below PBL); Schumann (1989), Wyngaard and Brost (1984) (above PBL); Liquid water content: Walcek and Taylor (1986), Chang et al. (1987, 1990). NAM cloud: more complicated cloud parameterization schemes (Ferrier et al. 2002)
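The two attenuation formulas above can be sketched as a single function; this is an illustrative transcription of the slide's equations, not the PreMAQ source (name and argument list are assumptions):

```python
import math

def cloud_adjusted_j(j0, cf, tr, zenith_deg, alpha, below_cloud):
    """Cloud-corrected photolysis rate per the slide's formulas.

    j0: clear-sky photolysis rate
    cf: cloud cover fraction (0-1)
    tr: cloud transmissivity (0-1)
    zenith_deg: solar zenith angle in degrees
    alpha: reaction-dependent coefficient (alpha_i)
    below_cloud: True below cloud base, False above cloud top
    """
    mu = math.cos(math.radians(zenith_deg))
    if below_cloud:
        # J_cld = J_0 [1 + C_f (1.6 t_r cos(theta) - 1)]
        return j0 * (1.0 + cf * (1.6 * tr * mu - 1.0))
    # J_cld = J_0 [1 + C_f alpha_i (1 - t_r) cos(theta)]
    return j0 * (1.0 + cf * alpha * (1.0 - tr) * mu)

# Overcast, low-transmissivity cloud: photolysis is suppressed below the
# cloud and enhanced above it (reflection), relative to clear sky.
j_below = cloud_adjusted_j(1e-2, cf=1.0, tr=0.3, zenith_deg=30.0,
                           alpha=1.2, below_cloud=True)
j_above = cloud_adjusted_j(1e-2, cf=1.0, tr=0.3, zenith_deg=30.0,
                           alpha=1.2, below_cloud=False)
```

This makes the mechanism behind the sensitivity runs concrete: errors in C_f and t_r feed directly into photolysis rates, and hence into O3 production.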

Cloud cover FHO statistics, against AFWA for CONUS, Aug 5-31, 2010. [Figure: critical success index and false alarm ratio vs. total cloud cover threshold (%); black: default, red: modified]

Cloud cover FHO statistics (cont.), against CLAVR-x for CONUS, Aug 5-31, 2010. [Figure: critical success index and false alarm ratio vs. total cloud cover threshold (%); black: default, red: modified]

Sensitivity run: default vs. modified PreMAQ. [Figure: hourly-mean O3 difference (modified − default, ppb) at 13 UTC and 19 UTC]

8-hr max O3 verification: CONUS. [Figure: time series by date (12 UTC cycle) of 8-hr max O3 (ppb): obs (solid), default fcst (black dash), modified fcst (red dash); and rmse (solid) and bias (dash) in ppb for default (black) and modified (red)]

8-hr max O3 verification: NEUS. [Figure: time series by date (12 UTC cycle) of 8-hr max O3 (ppb): obs (solid), default fcst (black dash), modified fcst (red dash); and rmse (solid) and bias (dash) in ppb for default (black) and modified (red)]

8-hr max O3 FHO comparison: CONUS. [Figure: critical success index and false alarm ratio vs. 8-hr max O3 threshold (ppb); black: default, red: modified]

8-hr max O3 FHO comparison: NEUS. [Figure: critical success index and false alarm ratio vs. 8-hr max O3 threshold (ppb); black: default, red: modified]

Summary. O3 over-prediction is often observed, especially near the Northeastern coastal region. Met verification shows that while temperature, relative humidity, and total cloud cover simulated by NAM agree well with observations, NAM does not capture the time variability of the observed wind well. The sensitivity study indicates that taking cloud parameters (cloud cover, liquid water content, cloud base and top) directly from NAM output may slightly improve surface O3 prediction, especially over the NE coastal region.

Future work. The role of cloud parameters will be examined further when coupling the new NMMB meteorological model with CMAQ. PBL schemes more suitable for stable atmospheric conditions and the marine boundary layer will be explored.