

GOES-R AWG Product Validation Tool Development
AWG GRAFIIR Team
June 16, 2011
Presented by: Mat Gunshor (CIMSS)
with Ray Garcia, Allen Huang, Graeme Martin, Eva Schiffer, Hong Zhang and others (CIMSS/UW-Madison)
1

Products
Baseline products GRAFIIR can currently run (L1 and L2+):
– Radiances: validation of radiance data at the pixel level is a core GRAFIIR capability.
– Clouds: Clear Sky Mask; Cloud Optical Depth; Cloud Particle Size; Cloud Top Phase; Cloud Top Height; Cloud Top Pressure; Cloud Top Temperature
– Soundings: Legacy Vertical Moisture Profile; Legacy Vertical Temperature Profile; Derived Stability Indices (CAPE, LI, etc.); Total Precipitable Water
– Fire Hot Spot Characterization
– Imagery
– Derived Motion Winds
– Land Surface/Skin Temperature
– Hurricane Intensity
– Volcanic Ash Detection
Baseline algorithms are currently produced here in GEOCAT.
2

Products
In the future, GRAFIIR expects to be able to run all of the AWG ABI baseline products by employing the AIT Framework as the processing end of the system. We expect that eventually all ABI Baseline and Option 2 products will be available to GRAFIIR via the AIT Framework.
3

Products
[Table: baseline products vs. ABI bands (wavelength in micrometers / channel ID); the X marks indicating which bands each product uses did not survive transcription. Baseline products listed: Aerosol Detection; Suspended Matter/Optical Depth; Clear Sky Masks; Cloud & Moisture Imagery (all 16 bands); Cloud Optical Depth; Cloud Particle Size; Cloud Top Phase; Cloud Top Height; Cloud Top Pressure; Cloud Top Temperature; Hurricane Intensity; Rainfall Rate/QPE; Legacy Vertical Moisture Profile; Legacy Vertical Temperature Profile; Derived Stability Indices; Total Precipitable Water; Downward Solar Insolation (Surface); Reflected Solar Insolation (TOA); Derived Motion Winds; Fire Hot Spot Characterization; Land Surface Temperature; Snow Cover; Sea Surface Temperature.]
Bands may also be used by needed “upstream” products, such as the cloud mask.
4

Products
[Table: Option 2 products vs. ABI bands (wavelength in micrometers / channel ID); the X marks indicating band usage did not survive transcription. Option 2 products listed: Cloud Layer/Heights; Cloud Ice Water Path; Cloud Liquid Water; Cloud Type; Convective Initiation; Turbulence; Low Cloud and Fog; Enhanced-V/Overshooting Top; Aircraft Icing Threat; SO2 Detection; Visibility (no direct use of ABI bands); Upward Longwave Radiation (TOA); Downward Longwave Radiation (SFC); Upward Longwave Radiation (SFC); Total Ozone; Aerosol Particle Size; Surface Emissivity; Surface Albedo; Vegetation Index; Vegetation Green Fraction; Flood/Standing Water; Rainfall Potential (no direct use of ABI bands); Rainfall Probability (no direct use of ABI bands); Snow Depth (no direct use of ABI bands); Sea & Lake Ice: Age (no direct use of ABI bands); Sea & Lake Ice: Concentration; Sea & Lake Ice: Motion; Ocean Currents; Ocean Currents: Offshore.]
5

Validation Strategies
GRAFIIR seeks to validate all of the ABI L1 data and L2+ products in the context of analyzing ABI instrument waiver requests from the vendor.
– By manipulating ABI proxy data to reflect instrument effects, GRAFIIR compares algorithm results “before” and “after” the instrument effects are introduced into the proxy data. The objective is to assess the effect of an instrument waiver on performance for products that require the affected band(s).
Current capability: compare product output, provide statistical analysis, and generate reports automatically through Glance.
Future strategy: obtain the AIT Framework in order to gain the capability of generating any ABI L2 product.
– The Framework copy must be maintained and kept in sync with the AIT version.
– It will remain important to maintain synergy between NESDIS scientists and the algorithm developers (at cooperative institutes, for example) by employing the same development environment (the AIT Framework).
6

Routine Validation Tools
GRAFIIR has developed validation tools as part of its mission to assess instrument effects on ABI data and products.
– The label “routine” perhaps does not fit: tool development has grown naturally to fit needs, and the tools in use now are more of the deep-dive variety.
GRAFIIR vision: make the current tools more easily automatable.
– An automatable version of GEOCAT/Framework, paired with Glance and the collocation tools, would let many product algorithm teams easily validate their products against a variety of “truth” datasets.
7

“Deep-Dive” Validation Tools
The validation tools used for GRAFIIR vary with the nature of the instrument waiver rather than with product type:
1. ABI instrument effects that can be applied to simulated ABI proxy data from the WRF model.
2. ABI instrument effects that cannot be (easily/quickly) applied to simulated ABI proxy data from the WRF model.
The AWG GRAFIIR Team has responded to ABI waivers in both situations and has developed tools accordingly.
The best validation tool GRAFIIR has is Glance.
– It is the most easily applicable, cross-cutting tool we have available to other AWG teams and the AIT.
– Glance can be used with both L1 data and L2+ products.
8

9 Glance could make this easier!

10 “Deep-Dive” Validation Tools
Can you see a difference?

11 “Deep-Dive” Validation Tools

“Deep-Dive” Validation Tools
The validation tools used for GRAFIIR vary with the nature of the instrument waiver rather than with product type:
1. ABI instrument effects that can be applied to simulated ABI proxy data from the WRF model.
2. ABI instrument effects that cannot be (easily/quickly) applied to simulated ABI proxy data from the WRF model.
12

“Deep-Dive” Validation Tools
1. ABI instrument effects that can be applied to simulated ABI proxy data from the WRF model.
– Example: striping in one or more spectral bands.
– Example: increased noise in one or more spectral bands.
– Example: navigation errors in one or more spectral bands.
Note: ABI specifications exist for all of these parameters, and a waiver is only required when the expected instrument performance will be worse than the specs.
When the effect is relatively easy to simulate in the existing simulated ABI proxy data sets, the process is fairly straightforward.
13

“Deep-Dive” Validation Tools
1. ABI instrument effects that can be applied to simulated ABI proxy data from the WRF model.
There are 4 primary steps to analyzing such a waiver:
1. Simulate the instrument effect in the proxy data (MATLAB). This is the least straightforward part of the process, depending on what the waiver is for and how it is deemed to affect the radiance data.
2. Produce products that rely on the affected spectral bands using data from before and after the instrument effect was introduced (GEOCAT). This could also be done in the Framework; it is a straightforward step for an analyst familiar with the software.
3. Compare the “before” and “after” products and analyze the differences (Glance). Glance can read multiple file types and provide a variety of types of analysis.
4. Obtain expert analysis of the results. Typically we get input from the algorithm scientists and generate a PowerPoint presentation that also serves as a report.
14

“Deep-Dive” Validation Tools
The following slides are an example of this first type of validation analysis.
– Existing proxy data are altered to reflect the effects of some out-of-spec component of the instrument. In this case, we pretend that one line of the detector array in one spectral band is noisier than it should be.
First, the instrument effect is simulated in the proxy data (MATLAB). The validation shown here is visual, but we do statistical validation in this step as well; for instance, the generated random noise is tested to confirm it is normally distributed with the expected standard deviation.
Second, products that use this spectral band are generated. This step is done in GEOCAT, or could be done in the Framework.
Third, we analyze the differences between the products generated from the “control” and “waiver” data, using Glance. Only cloud top height is shown here.
15
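The stripe-noise simulation and its statistical check can be sketched in Python (standing in for the MATLAB the team actually uses); the array size, affected line index, and NEdN value below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(42)  # seeded for reproducibility

def add_stripe_noise(radiance, line, nedn):
    """Return a copy of `radiance` with zero-mean Gaussian noise of
    standard deviation `nedn` added to a single detector line."""
    noisy = radiance.copy()
    noisy[line, :] += rng.normal(0.0, nedn, size=radiance.shape[1])
    return noisy

# Hypothetical flat scene; one out-of-spec detector line at row 250
nedn = 0.1
control = np.full((500, 500), 100.0)
waiver = add_stripe_noise(control, line=250, nedn=nedn)

# Validate the simulated noise before using it downstream:
noise = (waiver - control)[250, :]
assert abs(noise.mean()) < 0.05          # approximately zero-mean
assert abs(noise.std() - nedn) < 0.02    # std close to the intended NEdN
```

The control and waiver arrays would then be fed through GEOCAT or the Framework to produce the products compared in Glance.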

16 “Deep-Dive” Validation Tools
The Control Case: magnified by 3x and focused on the Texas/Oklahoma border area convection where one of the out-of-spec lines passes through.

17 “Deep-Dive” Validation Tools
The Waiver Case: magnified by 3x and focused on the Texas/Oklahoma border area convection where one of the out-of-spec lines passes through.

18 “Deep-Dive” Validation Tools
The Difference: magnified by 3x, the out-of-spec line is evident in the brightness temperature difference image.

“Deep-Dive” Validation Tools
HTML report generated by Glance with statistics and images. The “Zero Tolerance” analysis shows all the absolute changes introduced by the out-of-spec line noise.
Follow the link for the statistical report (click on a product variable name to see the report for each one):
– Cloud Top Height
– Cloud Top Pressure
– Cloud Top Temperature
– Cloud Mask (unaffected)
– Cloud Phase (unaffected)
– Cloud Type (unaffected)
19

20 “Deep-Dive” Validation Tools
Difference Image: Most of the image is a difference of 0. This is Cloud Top Height, but the image looks similar for Cloud Top Pressure and Cloud Top Temperature; from Glance.

21 “Deep-Dive” Validation Tools
“Trouble Points”: Trouble points are marked for any pixel in the two output files whose difference exceeds epsilon (500 m in this case). Cloud Top Height shown, from Glance.
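The trouble-point test described above reduces to a thresholded absolute difference; a minimal sketch (not Glance itself; the two small arrays follow the slide's 500 m epsilon for cloud top height):

```python
import numpy as np

def trouble_points(control, waiver, epsilon):
    """Flag pixels whose control/waiver difference exceeds epsilon,
    ignoring pixels that are fill (NaN) in either file; return the
    boolean mask and the trouble-point fraction over valid pixels."""
    diff = waiver - control
    valid = ~np.isnan(diff)
    trouble = valid & (np.abs(diff) > epsilon)
    return trouble, float(trouble.sum()) / float(valid.sum())

# Hypothetical 2x2 cloud top height fields (meters), one fill pixel
control = np.array([[9000.0, 9100.0], [np.nan, 8800.0]])
waiver  = np.array([[9000.0, 9800.0], [np.nan, 8800.0]])
mask, fraction = trouble_points(control, waiver, epsilon=500.0)
print(int(mask.sum()), round(fraction, 3))  # 1 0.333
```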

22 “Deep-Dive” Validation Tools
The statistics are from Glance:

Product | Trouble Points | Trouble Point Fraction | Max Difference | Mean Difference
Height (500 m) | … | … | … | …
Pressure (50 hPa) | … | … | … | …
Temperature (3 K) | … | … | … | …
Cloud Mask | 0 | 0 | 0 | 0
Cloud Phase | 0 | 0 | 0 | 0
Cloud Type | 0 | 0 | 0 | 0

(Numeric values for the Height, Pressure, and Temperature rows did not survive transcription.)

“Deep-Dive” Validation Tools
2. ABI instrument effects that cannot be (easily/quickly) applied to simulated ABI proxy data from the WRF model.
– Example: out-of-spec Spectral Response Functions (SRFs).
If the effect cannot be easily or accurately replicated in the simulated data, we cannot generate products in GEOCAT and compare the outputs in Glance.
– SRF changes are generally too time-consuming to get into the proxy data because they involve altering the forward model. Proxy data generated from forward-model calculations on forecast-model atmospheric profiles take a long time to produce, and we typically have only 1-2 weeks to respond to a waiver.
23

“Deep-Dive” Validation Tools
2. ABI instrument effects that cannot be (easily/quickly) applied to simulated ABI proxy data from the WRF model.
– Example: out-of-spec Spectral Response Functions.
There are 3 primary steps to analyzing such a waiver:
1. Simulate the instrument effect in proxy data (MATLAB). For example, convolve high-spectral-resolution data with the before/after SRFs.
2. Since products cannot be generated, use alternatives (MATLAB). In the case of SRFs, compute brightness temperatures from convolved high-spectral-resolution data (e.g. IASI) and compare the differences to the spec noise to understand their significance. We must also be sure we can still measure key components (e.g. SO2). The products we can run in GEOCAT have all been analyzed previously with “pure” proxy data compared to data with spec noise added.
3. Obtain expert analysis of the results. Typically we get input from the algorithm scientists and generate a PowerPoint presentation that also serves as a report.
24
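Step 1, convolving high-spectral-resolution data with an SRF, amounts to an SRF-weighted average of the measured radiances on the instrument's wavenumber grid. A minimal sketch, with a made-up triangular SRF standing in for the real (ITAR-restricted) compliant and non-compliant ones:

```python
import numpy as np

def band_radiance(srf, radiance):
    """SRF-weighted mean of a high-spectral-resolution radiance
    spectrum (e.g. IASI) sampled on the same uniform wavenumber grid."""
    return float(np.sum(srf * radiance) / np.sum(srf))

# Hypothetical wavenumber grid around the 8.5 um band (~1176 cm^-1)
wn = np.linspace(1100.0, 1250.0, 601)
spectrum = np.full_like(wn, 80.0)  # flat test spectrum
# Made-up triangular SRF centered at 1176 cm^-1, half-width 30 cm^-1
srf = np.clip(1.0 - np.abs(wn - 1176.0) / 30.0, 0.0, None)

# Sanity check: a flat spectrum must come back unchanged for any SRF
print(round(band_radiance(srf, spectrum), 6))  # 80.0
```

Repeating this with the compliant and non-compliant SRFs on real IASI spectra gives the radiance pairs whose differences are compared to the spec noise on the following slides.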

25 “Deep-Dive” Validation Tools
The following slides are an example of this second type of validation analysis.
– The 8.5 um band SRF may be slightly out of spec. Will we still be able to see SO2? How will the radiances be affected?
First, SRFs must be obtained and altered.
– We were given SRFs that were “compliant” and “non-compliant” with the specs. These are not shown here to avoid ITAR designation.
Second, radiances and brightness temperatures are generated from both calculated and measured high-spectral-resolution data.
– We had calculated spectra available with various amounts of SO2.
Third, we analyze the differences between the radiances from the compliant and non-compliant SRFs.
– These are compared to the spec noise for perspective.

26 “Deep-Dive” Validation Tools
Non-compliant 8.5 um SRF convolved with IASI; compliant 8.5 um SRF convolved with IASI; brightness temperature difference image.

27 “Deep-Dive” Validation Tools
Ratio of the radiance difference (spec-compliant minus non-spec-compliant) to the spec noise (NEdN) in this band (0.1303). When this ratio is less than 1, the difference is less than the spec NEdN.
Spec-Compliant minus Non-Spec-Compliant 8.5 um Radiance Difference to NEdN Ratio
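The ratio shown here is simply the SRF-induced radiance difference divided by the band's spec NEdN; a short sketch (the 0.1303 NEdN value is from the slide, while the sample radiances are made up):

```python
import numpy as np

NEDN = 0.1303  # spec noise (NEdN) for the 8.5 um band, from the slide

def ratio_to_nedn(r_compliant, r_noncompliant, nedn=NEDN):
    """Absolute radiance difference (compliant minus non-compliant SRF)
    expressed as a fraction of the spec NEdN; values below 1 mean the
    SRF change is smaller than the allowed instrument noise."""
    diff = np.asarray(r_compliant) - np.asarray(r_noncompliant)
    return np.abs(diff) / nedn

# Hypothetical convolved radiances for three IASI scenes
r_comp = np.array([80.00, 81.50, 79.20])
r_nonc = np.array([80.05, 81.44, 79.21])
print(bool(np.all(ratio_to_nedn(r_comp, r_nonc) < 1.0)))  # True
```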

“Deep-Dive” Validation Tools
Using Glance to compare L2+ product output to a “truth” dataset that was itself generated as product output.
– Most of GRAFIIR's waiver tasks measure the effects of a change on product output, but many algorithm teams also need to validate their product against another type of measured data to quantify product performance.
Glance can be used to do more of this validation.
– As teams learn what their needs are and develop capabilities, we hope to merge these ideas.
– Ideally, scientists should be doing more analysis and not have to worry about the traditionally difficult tasks of collocating and processing data.
28
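The collocation chore mentioned above can be as simple as nearest-neighbor matching between two pixel grids; a brute-force sketch (the flattened lat/lon arrays and the 0.05-degree tolerance are illustrative assumptions, not the GRAFIIR collocation tools):

```python
import numpy as np

def nearest_neighbor_collocate(lat_a, lon_a, lat_b, lon_b, max_deg=0.05):
    """For each pixel in dataset A, find the nearest pixel in dataset B
    (brute force, fine for small grids); keep pairs within a tolerance."""
    pairs = []
    for i, (la, lo) in enumerate(zip(lat_a, lon_a)):
        d2 = (lat_b - la) ** 2 + (lon_b - lo) ** 2
        j = int(np.argmin(d2))
        if d2[j] <= max_deg ** 2:
            pairs.append((i, j))
    return pairs

# Hypothetical flattened pixel locations from two files
lat_a = np.array([30.00, 30.10]); lon_a = np.array([-100.00, -100.10])
lat_b = np.array([30.01, 30.09, 31.00]); lon_b = np.array([-100.01, -100.09, -101.00])
print(nearest_neighbor_collocate(lat_a, lon_a, lat_b, lon_b))  # [(0, 0), (1, 1)]
```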

“Deep-Dive” Validation Tools
Example: using Glance to compare WRF model output cloud top temperature to the AWG product algorithm cloud top temperature generated from the simulated ABI proxy data (which were themselves generated from the WRF model output).
– The WRF model cloud top information is treated as “truth”.
– We expect differences because the model reports a cloud top even when the cloud is too optically thin to be detected by ABI, so WRF model cloud tops should be higher and colder than those retrieved from the proxy data.
– Note: one file is an HDF file output from GEOCAT and the other is a netCDF file generated from WRF model output.
29

30 “Deep-Dive” Validation Tools
Cloud Top Temperature from ABI cloud algorithm (from Glance)

31 “Deep-Dive” Validation Tools
Cloud Top Temperature from WRF (from Glance)

32 “Deep-Dive” Validation Tools
Difference image, WRF output minus proxy L2 (from Glance)

33 “Deep-Dive” Validation Tools
Statistics from Glance: Numerical Comparison Statistics (most numeric values did not survive transcription):
correlation*: …
diff_outside_epsilon_count*: …
diff_outside_epsilon_fraction*: 1
max_diff*: …
mean_diff*: …
median_diff*: …
mismatch_points_count*: …
mismatch_points_fraction*: …
perfect_match_count*: 0
perfect_match_fraction*: 0
r-squared correlation*: …
rms_diff*: …
std_diff*: 14.11
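The report fields above can be reproduced with a short routine; the names mirror the Glance report, but the implementation here is an assumption (a sketch, not Glance itself), and the small test arrays are made up:

```python
import numpy as np

def numerical_comparison(a, b, epsilon):
    """Glance-style numerical comparison statistics between a product
    field `a` and a "truth" field `b`, ignoring non-finite pixels."""
    both = np.isfinite(a) & np.isfinite(b)
    d = (b - a)[both]
    r = float(np.corrcoef(a[both], b[both])[0, 1])
    return {
        "correlation": r,
        "r-squared correlation": r ** 2,
        "diff_outside_epsilon_fraction": float((np.abs(d) > epsilon).mean()),
        "perfect_match_fraction": float((d == 0).mean()),
        "max_diff": float(np.abs(d).max()),
        "mean_diff": float(d.mean()),
        "median_diff": float(np.median(d)),
        "rms_diff": float(np.sqrt(np.mean(d ** 2))),
        "std_diff": float(d.std()),
    }

# Hypothetical cloud top temperatures (K): product vs. WRF "truth"
product = np.array([210.0, 220.0, 230.0, 240.0])
truth = np.array([205.0, 216.0, 229.0, 235.0])
stats = numerical_comparison(product, truth, epsilon=2.0)
print(round(stats["rms_diff"], 3))  # 4.093
```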

Ideas for the Further Enhancement and Utility of Validation Tools
GRAVA (GOES-R Advanced Validation Automation)
– Greater automation of validation tasks.
– An extension of GRAFIIR to optimize field campaign work for GOES-R.
– Beginning with a planned analysis of field campaigns to assess how best to utilize them for GOES-R cal/val.
AIT Framework at CIMSS to expand our access to more products.
– Converge on file types for inputs and outputs (avoid reliance on McIDAS AREA files for input).
– Converge on calibration methods (adopt the Imagery Team file format).
– Converge on navigation (Fixed Grid Format).
Merging the collocation capabilities with Glance should make validation easier for a host of algorithm teams.
The GRAFIIR Team's use of Glance thus far has been fairly limited, in that comparisons are normally a before/after look at instrument effects on product performance. But Glance can also be used to compare against a “truth” dataset that is not a prior run of the product.
34

35 Summary
Glance
– The GRAFIIR team has helped to develop a validation tool that can be used both for routine validation and as a deep-dive tool.
– In analyzing multiple ABI waivers to date, the GRAFIIR team has already done both L1 radiance and L2 product validation using Glance.
– Glance can meet the needs of many product algorithm teams.
GRAVA
– A future extension of GRAFIIR for greater automation.
– Coordination of collocation, calibration/validation, field campaign and other data, L1 and L2+ ABI data and products, visualization, and Glance.
GOES-R AWG
– The GRAFIIR toolset is not a replacement for science expertise.
– Scientists should spend less time worrying about file formats and collocation.
– The GRAFIIR team can help!

36 More Information
– How to Install Glance
– Glance Documentation
– Contacts: Eva Schiffer, Ray Garcia, Mat Gunshor