
Model Performance Evaluation Database and Software Betty K. Pun, Kristen Lohman, Shu-Yun Chen, and Christian Seigneur AER, San Ramon, CA Presentation at the RPO Workgroup Meeting St. Louis, MO 5 November 2003

Acknowledgement
Funding for this work is provided by CENRAP under Modeling Umbrella Contract RP-005, Work Order 1
Calvin Ku, Missouri DNR, and Matthew Johnson, Iowa DNR, for their continuous support
CENRAP Modeling Workgroup and outside reviewers for feedback on the work plan and suggestions on useful features in the MPE database and software

The Role of Model Performance Evaluation
[Diagram: Model Application → Model Evaluation → Model/Data Improvement → Regulatory Application]
The modeling cycle iterates until performance is good enough for use in strategy design; hence the need to streamline and automate model performance evaluation

Model Performance Evaluation in a Nutshell
Ambient Data + Model Results → Model Evaluation Software + Graphics Package
Performance statistics: paired peak error, unpaired peak error, gross error, gross bias, normalized bias, normalized error, root mean square error, coefficient of determination...
Graphics: time series, scatter plots, pie charts...

Data Available for PM Model Performance Evaluation
Routine Monitors
– IMPROVE: 24-hour PM2.5 and component data, one day in three
– CASTNet: 7-day sulfate, nitrate, ammonium, SO2, and HNO3 (may be problematic)
– AQS: hourly to daily PM10, PM2.5, PM2.5 speciation, O3, NO, NO2, NOy, VOC, SO2
Special Studies
– PM Supersites
– BRAVO
– Others

How to Consolidate, Store, and Retrieve Ambient Data for MPE?
What we have: many data sources; different formats; supporting information sometimes separate from the data and sometimes difficult to find
What we need: preferably one data source; a consistent format; supporting information:
– site location
– sample start time including time zone
– sample duration
– units
MySQL database: upload data (measurement, site); compile other relevant information; use queries to retrieve data in a consistent format
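
The upload-and-query workflow above can be sketched as follows, using Python's built-in SQLite in place of MySQL so the example is self-contained; the table and column names are illustrative, not the actual CENRAP schema:

```python
import sqlite3

# Stand-in for the MySQL database described above; illustrative schema.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE improve_dat (
    site_code TEXT, obs_date TEXT, sf_val REAL)""")

# Upload step: raw network records kept close to their original format.
rows = [("BIBE1", "2002-10-05", 1.2063),
        ("BIBE1", "2002-10-08", 0.9810)]
conn.executemany("INSERT INTO improve_dat VALUES (?, ?, ?)", rows)

# Retrieval step: a query reshapes the data into one consistent format
# (here sulfate = 3 x sulfur, as in the IMPROVE data query shown later).
cur = conn.execute("""SELECT site_code, obs_date, sf_val * 3
                      FROM improve_dat ORDER BY obs_date""")
records = cur.fetchall()
```

The key point is that heterogeneity is absorbed at upload time, so every downstream consumer sees one consistent record layout.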

Database Design Principles
Storage requirements: use a hierarchical design
– Network (e.g., averaging period, frequency)
– Site (e.g., location)
– Parameter (e.g., units)
– Measurement
Ease of data updates
– Each network is stored in separate tables; each table can be updated independently
– Use the original data format to the extent possible
– Scripts are used for adding supplementary information and data screening
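
The four-level hierarchy might be sketched as one table per level, as below; the column names are assumptions for illustration, not the production schema:

```python
import sqlite3

# One illustrative table per level of the hierarchy described above.
ddl = """
CREATE TABLE network     (network_id TEXT PRIMARY KEY, avgtime_hr INTEGER, freq TEXT);
CREATE TABLE site        (site TEXT PRIMARY KEY, network_id TEXT,
                          latitude REAL, longitude REAL, timezone TEXT);
CREATE TABLE parameter   (varname TEXT PRIMARY KEY, network_id TEXT, units TEXT);
CREATE TABLE measurement (site TEXT, obs_date TEXT, varname TEXT, value REAL);
"""
conn = sqlite3.connect(":memory:")
conn.executescript(ddl)

# Because each network gets its own set of tables, a new IMPROVE data
# release can be reloaded without touching CASTNet or AQS tables.
tables = [r[0] for r in conn.execute(
    "SELECT name FROM sqlite_master WHERE type='table' ORDER BY name")]
```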

Querying the MPE Database for Monitoring Sites
Site query:
SELECT site, "IMPROVE", latitude, longitude INTO OUTFILE 'improve.site.dat' FIELDS TERMINATED BY ',' FROM IMPROVE_LOCS ORDER BY site;
Sample result: BIBE1,IMPROVE, ,

Querying the MPE Database for Measurement Data
Data query:
SELECT m.site_code, year(m.obs_date), month(m.obs_date), dayofmonth(m.obs_date), hour(n.starttime), s.timezone, n.avgtime_hr, (m.sf_val * 3), concat('"', p.units, '"') INTO OUTFILE 'improve.dat' FIELDS TERMINATED BY ',' FROM NETWORK_INFO as n, IMPROVE_LOCS as s, IMPROVE_VAR as p, IMPROVE_DAT as m WHERE m.site_code = s.site AND n.network_id = "IMPROVE" AND p.varname = "Sf_val" AND year(m.obs_date) = 2002 ORDER BY m.site_code, m.obs_date;
This query uses all 4 levels in the hierarchy of information organization
Sample result: BIBE1,2002,10,5,0,CT,24,3.619,"ug/m3"

Processing Model Results
Two common formats of output files: binary and NetCDF
The platform-independent NetCDF format was selected as the standard
CMAQ files require no conversion
Fortran binary to NetCDF converters were developed for:
– CAMx concentration data
– CAMx deposition data
– PM-CAMx concentration data
– MM5CAMx meteorology data
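
As an illustration of what such a converter must handle on the input side, the sketch below reads Fortran unformatted sequential records (each payload framed by 4-byte record-length markers, assumed little-endian as on x86). This is an assumption-laden sketch, not AER's converter, and the NetCDF-writing side is omitted:

```python
import struct, io

def read_fortran_records(f):
    """Yield the payload of each record in a Fortran unformatted
    sequential file: 4-byte length, payload, repeated 4-byte length.
    Assumes little-endian markers, the common convention on x86."""
    while True:
        head = f.read(4)
        if len(head) < 4:
            return
        (n,) = struct.unpack("<i", head)
        payload = f.read(n)
        (tail,) = struct.unpack("<i", f.read(4))
        assert tail == n, "corrupt record framing"
        yield payload

# Round-trip check with a fake one-record file: three 4-byte floats.
data = struct.pack("<3f", 1.0, 2.0, 3.0)
fake = io.BytesIO(struct.pack("<i", len(data)) + data
                  + struct.pack("<i", len(data)))
records = [struct.unpack("<3f", r) for r in read_fortran_records(fake)]
```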

Model Performance Evaluation Software
[Diagram: the Processing Component reads formatted ambient data from the database, NetCDF data from the model or a preprocessor, model/measurement cross-reference tables, and a user input control file (model; species options; subdomain options; temporal processing options; spatial processing options; statistics options; ...). The Statistics Component and Graphics Package then produce statistics, graphics, and data output.]

Processing Component
Read observations
– perform time zone changes
– average to longer periods
– perform unit conversions
Extract modeling data
– calculate the grid cell corresponding to the site latitude/longitude
– extract model data at the grid cell(s)
– sum components of species
– average to longer periods
– perform unit conversions
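
The grid-cell lookup step can be sketched for a simple regular latitude/longitude grid; real CENRAP grids are map-projected (e.g., Lambert conformal), so a projection step would precede this in practice, and the function and its arguments are illustrative:

```python
def grid_cell(lat, lon, lat0, lon0, dlat, dlon, ny, nx):
    """Map a monitor location to (row, col) on a regular lat/lon grid.
    (lat0, lon0) is the grid's southwest corner; dlat/dlon are cell sizes."""
    col = int((lon - lon0) / dlon)
    row = int((lat - lat0) / dlat)
    if not (0 <= row < ny and 0 <= col < nx):
        raise ValueError("site falls outside the modeling domain")
    return row, col

# Hypothetical 10x10 one-degree grid with its SW corner at 25N, 110W.
cell = grid_cell(29.3, -103.2, 25.0, -110.0, 1.0, 1.0, 10, 10)
```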

Cross Reference Tables
Provided for CAMx4.0, PMCAMx, and CMAQ
Used to look up which model species to extract based on the model name and the species evaluated
Selected entries: [table not reproduced in transcript]

Species Options
PM2.5 and PM10 mass
PM2.5 components: sulfate, nitrate, ammonium, organic material, black carbon
PM fraction
– MPE software extracts the relevant PM mass and species concentrations to calculate the PM fraction
– MySQL query to calculate the PM fraction at sites with co-located PM2.5 and speciated PM measurements
Gases: O3, VOC (ppbC and ppb options), NOx, NOy, SO2
Wet deposition
– Option to evaluate concentration in precipitation or deposition flux

Subdomain Options
All sites included in the monitoring site and data files
A user-provided list in a file (e.g., a metropolitan statistical area)
User-specified minimum and maximum latitude and longitude
User-specified minimum and maximum cell numbers in the x and y directions
An RPO performance evaluation zone (J. Underhill/D. Watson, 2003)
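
A minimal sketch of the latitude/longitude bounding-box option (the site-tuple layout is an assumption for illustration):

```python
def in_subdomain(sites, lat_min, lat_max, lon_min, lon_max):
    """Keep only sites inside a user-specified lat/lon box;
    sites is a list of (site_code, lat, lon) tuples."""
    return [s for s in sites
            if lat_min <= s[1] <= lat_max and lon_min <= s[2] <= lon_max]

sites = [("BIBE1", 29.30, -103.18), ("CHE185", 36.0, -97.0)]
subset = in_subdomain(sites, 28.0, 32.0, -106.0, -100.0)
```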

Temporal Processing Options
The user can select a sub-period within the simulation period for evaluation
The user can specify a comparison interval that is greater than or equal to the measurement interval
– MPE software will temporally average both measurements and model results based on the comparison interval
– A completeness criterion is needed when performing temporal averaging (>75% recommended)
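
Temporal averaging with the completeness criterion can be sketched as follows, assuming missing values are coded as None and using the recommended 75% cutoff:

```python
def average_with_completeness(values, min_completeness=0.75):
    """Average values up to a longer comparison interval, returning
    None when fewer than min_completeness of the values are valid."""
    valid = [v for v in values if v is not None]
    if len(valid) < min_completeness * len(values):
        return None
    return sum(valid) / len(valid)

# 24 hourly values averaged to a daily value:
day_ok  = average_with_completeness([10.0] * 20 + [None] * 4)   # 83% complete
day_bad = average_with_completeness([10.0] * 12 + [None] * 12)  # 50% complete
```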

Spatial Processing Options
Extract the model value at the grid cell corresponding to the site location
Linear interpolation using the 4 closest grid cells
Average of all cells within a user-specified window
Best estimate within a user-specified window
Distance-weighting method within a user-specified window
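
The distance-weighting option, for example, can be sketched as inverse-distance weighting over the cells in the window (the data layout is illustrative, not the software's actual interface):

```python
import math

def distance_weighted(site, cells):
    """Inverse-distance-weighted model value at a monitor location.
    `cells` is a list of ((x, y), value) pairs for the grid cells
    inside the user-specified window."""
    num = den = 0.0
    for (x, y), value in cells:
        d = math.hypot(x - site[0], y - site[1])
        if d == 0.0:            # site coincides with a cell center
            return value
        w = 1.0 / d
        num += w * value
        den += w
    return num / den

# Site at the center of four equidistant cells -> simple mean.
est = distance_weighted((0.5, 0.5),
                        [((0, 0), 10.0), ((1, 0), 10.0),
                         ((0, 1), 20.0), ((1, 1), 20.0)])
```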

Statistical Options
Threshold to be selected by the user
Default output metrics
– accuracy of peak (unpaired in time): paired and unpaired in space
– mean observed and modeled values
– gross and normalized bias and error
– coefficient of correlation
– normalized root mean square error
Optional output metrics
– ratio of means, fractional bias and error, r2, index of agreement, site-specific root mean square error, normalized mean bias and error
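
A sketch of a few of the default metrics, with the user-selected observation threshold applied; the formulas follow common model-evaluation conventions and are not necessarily AER's exact implementation:

```python
def mpe_stats(obs, mod, threshold=0.0):
    """Normalized bias, normalized error, and RMSE over obs/model pairs,
    excluding pairs whose observation falls below the threshold."""
    pairs = [(o, m) for o, m in zip(obs, mod) if o >= threshold]
    n = len(pairs)
    nbias = sum((m - o) / o for o, m in pairs) / n
    nerr  = sum(abs(m - o) / o for o, m in pairs) / n
    rmse  = (sum((m - o) ** 2 for o, m in pairs) / n) ** 0.5
    return nbias, nerr, rmse

nbias, nerr, rmse = mpe_stats([40.0, 20.0], [30.0, 25.0], threshold=1.0)
```

Note how normalized bias can cancel to zero while normalized error stays large, which is why both are reported.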

Outputs
Header (model, period, species, units, data files, options)
mean observed value 39.3
mean modeled value 31.6
normalized bias -16.7%
...
Site, year, month, day, time, obs, sim
CHE185,1998,7,2,20,32.2,39.9
CHE185,1998,7,3,20,40.8,38.1
...
BBE401,1998,7,2,20,42.0,24.7
BBE401,1998,7,3,20,41.7,23.7
...
Output files (.stat, .tser) feed a graphics package, e.g., Excel

An MPE Database and Software Designed for Community Use
Comprehensive Processor
– database; binary-to-NetCDF converters; software
– output compatible with common graphics software
Versatility
– PM, PM components, gases, deposition fluxes
User-Friendly Design
– CENRAP review and community input for software features
– software engineering standards
– based on Fortran 90 and MySQL (free!)
Documentation