
Model Performance Evaluation Database and Software - Application to CENRAP
Betty K. Pun, Shu-Yun Chen, Kristen Lohman, Christian Seigneur
PM Model Performance Workshop, Chapel Hill, NC, 11 February 2004

Acknowledgements
Funding for the MPE software is provided by CENRAP under Modeling Umbrella Contract RP-005, Work Order 1; PM modeling and evaluation are funded under Work Order 3.
The CENRAP Modeling Workgroup and outside reviewers provided feedback on the work plan and suggestions on useful features for the MPE database and software.

The Role of Model Performance Evaluation
Diagram: Model Application → Model Evaluation → Model/Data Improvement → (back to Model Application), then Regulatory Application.
The modeling cycle iterates until performance is good enough for use in strategy design; hence the need to streamline and automate model performance evaluation.

MPE Database and Software
Inputs: formatted ambient data from the Ambient Database; model or preprocessor NetCDF data; model/measurement cross-reference tables; user input (control file) specifying model, species, subdomain, temporal processing, spatial processing, statistics, ...
Processing Component: pairs the observations with the model results and writes the data output.
Statistics Component: computes performance statistics (paired and unpaired peak error; gross error and bias; normalized error and bias; root mean square error; coefficient of determination; ...) and produces output for graphics.

How to Consolidate, Store, and Retrieve Ambient Data for MPE?
What we have: many data sources (IMPROVE, CASTNet, AQS, special studies); different formats; supporting information sometimes separate from the data and sometimes difficult to find.
What we need: preferably one data source; a consistent format; supporting information, including
– site location
– sample start time including time zone
– sample duration
– units
Approach: a MySQL database; upload the data (measurement, site), compile other relevant information, and use queries to retrieve data in a consistent format.

Querying the MPE Database for Monitoring Sites and Observations
Site query: BIBE1,IMPROVE, ,
Observation query: BIBE1,2002,10,5,0,CT,24,3.619,"ug/m3"
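
The presentation does not show the SQL behind these queries. The following is a minimal sketch of how an observation query might be issued against the MPE MySQL database from Python; the table name (observations), column names, and connection settings are illustrative assumptions, not the actual schema.

    # Minimal sketch: retrieve 24-h sulfate observations at one IMPROVE
    # site in a consistent format (site, date/time, time zone, duration,
    # value, units). Schema and credentials are assumptions.
    import mysql.connector

    conn = mysql.connector.connect(host="localhost", user="mpe",
                                   password="mpe", database="mpe")
    cur = conn.cursor()
    cur.execute(
        "SELECT site_id, year, month, day, hour, time_zone, duration_h, "
        "value, units FROM observations "
        "WHERE site_id = %s AND species = %s AND duration_h = 24",
        ("BIBE1", "SO4"),
    )
    for row in cur.fetchall():
        print(row)  # e.g., ('BIBE1', 2002, 10, 5, 0, 'CT', 24, 3.619, 'ug/m3')
    conn.close()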

Processing Model Results
Two common formats of output files: binary and NetCDF
The platform-independent NetCDF format was selected as the standard
CMAQ files require no conversion
Fortran binary-to-NetCDF converters were developed for CAMx
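
Once any binary-to-NetCDF conversion is done, the model output can be read with standard NetCDF tools. A minimal Python sketch (the MPE software itself is Fortran 90); the variable name ASO4 is hypothetical, not necessarily what CMAQ or the converter writes:

    # Minimal sketch: read one species field from a NetCDF output file.
    from netCDF4 import Dataset

    ds = Dataset("PM2.5.nc")       # file name taken from the example below
    so4 = ds.variables["ASO4"][:]  # hypothetical variable name
    print(so4.shape)               # typically (time, layer, row, column)
    ds.close()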

MPE Software Processing Component
Read observations:
– perform time zone changes
– average to longer periods
– perform unit conversions
Extract model data:
– calculate the grid cell corresponding to the site latitude/longitude (see the sketch below)
– extract model data at the grid cell(s)
– sum components of species
– perform unit conversions
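
A minimal sketch of the grid-cell lookup step, assuming the site coordinates have already been converted to the model's map-projection coordinates; function and variable names are illustrative:

    # Minimal sketch: locate the 1-based (column, row) of the grid cell
    # containing a site, for a regular grid with origin (x0, y0) and
    # cell sizes dx, dy in the model's projected coordinates.
    def grid_cell(x, y, x0, y0, dx, dy):
        col = int((x - x0) // dx) + 1
        row = int((y - y0) // dy) + 1
        return col, row

    # Example: a site 125.3 km east and 77.8 km north of the grid origin
    # on a 12-km grid falls in cell (11, 7).
    print(grid_cell(125.3, 77.8, 0.0, 0.0, 12.0, 12.0))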

Cross-Reference Tables
Provided for CAMx 4.0, PMCAMx, and CMAQ
Used to look up which model species to extract, based on the model name and the species being evaluated
[Selected entries from the tables were shown on the slide.]

Species Options
PM2.5 and PM10 mass
PM2.5 components: sulfate, nitrate, ammonium, organic material, black carbon
PM fraction:
– the MPE software extracts the relevant PM mass and species concentrations to calculate the PM fraction
– a MySQL query calculates the PM fraction at sites with co-located PM2.5 and speciated PM measurements
Gases: O3, VOC (ppbC and ppb options), NOx, NOy, SO2
Wet deposition: option to evaluate concentration in precipitation or deposition flux

Subdomain Options
All sites included in the monitoring site and data files
A user-provided list of sites in a file (e.g., a metropolitan statistical area)
User-specified minimum and maximum latitude and longitude (see the sketch below)
User-specified minimum and maximum cell numbers in the x and y directions
An RPO performance evaluation zone (J. Underhill/D. Watson, 2003)
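
A minimal sketch of the latitude/longitude subdomain option; site records and coordinates below are approximate and for illustration only:

    # Minimal sketch: keep only monitoring sites inside a user-specified
    # latitude/longitude bounding box. Sites are (name, lat, lon) tuples.
    def in_subdomain(sites, lat_min, lat_max, lon_min, lon_max):
        return [s for s in sites
                if lat_min <= s[1] <= lat_max and lon_min <= s[2] <= lon_max]

    sites = [("BIBE1", 29.3, -103.2), ("BOWA1", 47.9, -91.5)]
    print(in_subdomain(sites, 25.0, 40.0, -110.0, -90.0))  # keeps BIBE1 only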

Temporal Processing Options
The user can select a sub-period within the simulation period for evaluation
The user can specify a comparison interval that is greater than or equal to the measurement interval:
– the MPE software will temporally average both measurements and model results based on the comparison interval
– a completeness criterion is needed when performing temporal averaging (>75% recommended; see the sketch below)
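
A minimal sketch of temporal averaging with the completeness criterion, using the recommended 75% threshold; None marks a missing value:

    # Minimal sketch: average hourly values to a longer comparison
    # interval only if at least 75% of the values are valid.
    def interval_mean(values, completeness=0.75):
        valid = [v for v in values if v is not None]
        if len(valid) < completeness * len(values):
            return None              # interval fails the completeness test
        return sum(valid) / len(valid)

    hourly = [3.2, 3.5, None, 3.1] + [3.3] * 20   # 23 of 24 hours valid
    print(interval_mean(hourly))                  # passes; prints the 24-h mean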

Spatial Processing Options
Extract the model value at the grid cell corresponding to the site location
Linear interpolation using the 4 closest grid cells
Average of all cells within a user-specified window
Best estimate within a user-specified window
Distance-weighting method within a user-specified window (see the sketch below)
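
A minimal sketch of the distance-weighting option, assuming inverse-distance weights (the slide does not state the exact weighting function the software uses):

    # Minimal sketch: inverse-distance-weighted mean of model values at
    # grid cells within the user-specified window around a site.
    # cells is a list of (distance_to_site, model_value) pairs.
    def distance_weighted(cells):
        weights = [1.0 / max(d, 1e-6) for d, _ in cells]  # guard zero distance
        return sum(w * v for w, (_, v) in zip(weights, cells)) / sum(weights)

    # The closest cell dominates the estimate.
    print(distance_weighted([(5.0, 10.0), (12.0, 14.0), (20.0, 18.0)]))  # ~12.2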

Statistical Options
Threshold to be selected by the user
Default output metrics:
– accuracy of peak (unpaired in time; paired and unpaired in space)
– mean observed and modeled values
– gross and normalized bias and error
– coefficient of correlation
– normalized root mean square error
Optional output metrics:
– ratio of means, fractional bias and error, r², index of agreement, site-specific root mean square error, normalized mean bias and error (several of these are sketched below)
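
A minimal sketch of several of the listed metrics for paired observed (o) and modeled (m) values, using the standard definitions these metric names usually denote; the software's exact formulas may differ in detail:

    # Minimal sketch: gross, normalized, and fractional bias and error
    # for paired observations o[i] and model values m[i] above the
    # user-selected threshold.
    def metrics(o, m):
        n = len(o)
        gb = sum(mi - oi for oi, mi in zip(o, m)) / n            # gross bias
        ge = sum(abs(mi - oi) for oi, mi in zip(o, m)) / n       # gross error
        nb = sum((mi - oi) / oi for oi, mi in zip(o, m)) / n     # normalized bias
        ne = sum(abs(mi - oi) / oi for oi, mi in zip(o, m)) / n  # normalized error
        fb = sum(2 * (mi - oi) / (mi + oi) for oi, mi in zip(o, m)) / n       # fractional bias
        fe = sum(2 * abs(mi - oi) / (mi + oi) for oi, mi in zip(o, m)) / n    # fractional error
        return gb, ge, nb, ne, fb, fe

    o = [4.1, 6.0, 2.5, 8.2]
    m = [3.6, 6.8, 2.1, 7.0]
    print(metrics(o, m))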

Outputs
.stat file: a header (model, period, species, units, data files, options) followed by summary statistics, e.g., mean observed value 39.3, mean modeled value 31.6, normalized bias -16.7%, ...
.tser file: a time series table with columns Site, I, J, Year, Month, Day, CST, Obs., Modeled, and one row per site and time (ADPI..., BOWA..., ...)
Both can be read by a graphics package, e.g., Excel.

A Basic PM Model Evaluation

    set inpdir = /usr2/cp179/cmaq/jan2002/out
    set inpfile = PM2.5.nc
    foreach spc (PM2.5_Sulfate PM2.5_Nitrate PM2.5_Organic_Material \
                 PM2.5_Black_Carbon PM2.5)
    ./statp<<ieof
    …
    sitefile | /usr2/cp179/mpe/sitedata/SPECIATION.site
    datafile | /usr2/cp179/mpe/jan2002data/SPECIATION.$spc.dat
    …
    species| $spc
    …
    ieof
    end

Model Performance against Urban and Rural AQS Data (119 Speciation Sites)
[Table: CMAQ (Jan 2002) performance for PM2.5, SO4, OM, BC, NO3, and NH4, reporting mean observed value, mean modeled value, gross bias, normalized bias, fractional bias, gross error, normalized error, fractional error, and coefficient of determination (r²).]

Subset of Sites Selected by Grid Cell Range

    foreach spc (PM2.5_Sulfate PM2.5_Nitrate PM2.5_Organic_Material \
                 PM2.5_Black_Carbon PM2.5)
    ./statp<<ieof
    …
    sitefile | /usr2/cp179/mpe/sitedata/IMPROVE.site
    datafile | /usr2/cp179/mpe/jan2002data/IMPROVE.$spc.dat
    …
    species| $spc
    …
    model | CAMX
    …
    listflag| 3
    …
    minx, miny| 2, 2
    maxx, maxy| 100,
    ieof
    end

The internal boundary cells are excluded from the evaluation.

Model Performance against Urban and Rural AQS Data (119 Speciation Sites)
[Table: CAMx (Jan 2002) performance for PM2.5, SO4, NO3, NH4, BC, and OM, reporting mean observed value, mean modeled value, gross bias, normalized bias, fractional bias, gross error, normalized error, fractional error, and coefficient of determination (r²).]

Comparison of Selected Performance Statistics at Speciation Sites
[Chart: gross bias, normalized bias, gross error, normalized error, and r² for CMAQ vs. CAMx, for PM2.5, sulfate, nitrate, ammonium, organic material, and black carbon; a color box is shown under the model with better performance.]

Comparison of Selected Performance Statistics at IMPROVE Sites
[Chart: gross bias, normalized bias, gross error, normalized error, and r² for CMAQ vs. CAMx, for PM2.5, sulfate, nitrate, organic material, and black carbon; a color box is shown under the model with better performance.]

Subset of Sites Selected by User Input File

    foreach spc (PM2.5_Sulfate PM2.5_Nitrate PM2.5_Organic_Material \
                 PM2.5_Black_Carbon PM2.5)
    ./statp<<ieof
    …
    sitefile | /usr2/cp179/mpe/sitedata/IMPROVE.site
    datafile | /usr2/cp179/mpe/jan2002data/IMPROVE.$spc.dat
    …
    species| $spc
    …
    compintvl|
    listflag| 1
    listfile| BOWA.txt
    ...
    ieof
    end

Contents of the user input file BOWA.txt (one site):

    1
    BOWA1

Time Series at Boundary Waters Canoe Area

Composition of PM2.5 at Boundary Waters Canoe Area (µg/m³)
Observed PM2.5 mass = 4.0 µg/m³; CMAQ predicted PM2.5 = 5.4 µg/m³; CAMx predicted PM2.5 = 4.8 µg/m³.

Ambient Variability vs. Model Variability for Sulfate and Nitrate

An MPE Database and Software Designed for Community Use
Comprehensive processor: database; binary-to-NetCDF converter; software; output compatible with common graphics software
Versatility: PM, PM components, gases, deposition fluxes
User-friendly design: CENRAP review and community input for software features; software engineering standards; based on Fortran 90 and MySQL (free!)
Documentation

Comparison with Ambient Data Does Not Always Tell How Good the Model Is
PM2.5:
– model overprediction (e.g., nitrate, other)
– sampling losses of volatile species
Organic mass:
– model uncertainties in SOA formation
– the factor used to convert organic carbon to organic mass
– OC vs. BC is still an operational definition based on measurements
Ammonium:
– model overprediction (due to nitrate overprediction)
– sampling losses on the nylon filter