GHRSST XI Science Team Meeting, ST-VAL, 21-25 June 2010, Lima, Peru: Recent developments to the SST Quality Monitor (SQUAM) and SST validation with in situ data

Presentation transcript:

Recent developments to the SST Quality Monitor (SQUAM) and SST validation with in situ SST Quality Monitor (iQuam) data

Alexander Ignatov 1, Prasanjit Dash 1,2 and Pierre LeBorgne 3
1 NOAA/NESDIS, Center for Satellite Applications & Research (STAR)
2 Colorado State Univ., Cooperative Institute for Research in the Atmosphere (CIRA)
3 Météo-France, Satellite Meteorology Centre

Initial Objectives of SQUAM

 Monitor NESDIS operational AVHRR SST products in near-real time (NRT):
  - heritage Main Unit Task (MUT; 2001-present)
  - Advanced Clear-Sky Processor for Oceans (ACSPO; 2008-present)
  for stability, self-consistency, and cross-platform & cross-product consistency
 Evaluate satellite SST products daily in the global domain, against global L4 fields (Reynolds, RTG, OSTIA, ODYSSEA)
 Quickly identify anomalies & facilitate product diagnostics (e.g., due to sensor malfunction, cloud mask, or SST algorithm)

Recent new additions

 Worked with NCEP to add NRT inter-comparison of daily L4 SSTs:
  - two Reynolds (AVHRR-only and AVHRR+AMSR)
  - two RTG (low- and high-resolution)
  - OSTIA
  - ODYSSEA
 Collaborated with NAVOCEANO to include SEATEMP GAC SSTs:
  - platforms/sensors: AVHRRs onboard NOAA-14 through NOAA-19 and MetOp-A
  - time period: 2000 to recent
 Added MetOp-A AVHRR ~1 km FRAC SST products:
  - NESDIS ACSPO FRAC
  - O&SI SAF MGR SST (in collaboration with O&SI SAF)

Premises of validating against L4

 In situ measurements have limitations:
  - sparse and geographically biased
  - non-uniform and suboptimal quality
  - not available in NRT in sufficient numbers
 SQUAM complements heritage validation against in situ data:
  - calculates ΔT_S = satellite SST (T_S) − L4 SST (T_R)
 Tabs for ΔT_S in SQUAM:
  - maps
  - histograms
  - statistical time series
  - dependencies
  - Hovmöller time series (selected products)
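The ΔT_S computation described on this slide amounts to a per-pixel difference between collocated satellite and L4 fields, with statistics taken over clear-sky match-ups only. A minimal sketch (all array names and values are illustrative, not SQUAM data):

```python
import numpy as np

# Hypothetical collocated fields (K): satellite SST with NaN marking
# cloudy/invalid pixels, and an L4 reference on the same grid.
t_s = np.array([[290.1, np.nan, 288.4],
                [291.0, 289.7, np.nan]])
t_r = np.array([[290.0, 289.5, 288.6],
                [290.8, 289.9, 287.9]])

dts = t_s - t_r                  # deltaT_S = satellite SST minus L4 SST
valid = ~np.isnan(dts)           # statistics use clear-sky match-ups only
median_bias = np.nanmedian(dts)  # global median of deltaT_S
```

Maps, histograms, and time series in SQUAM are then different reductions of this same ΔT_S field.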

Maps of ΔT_S (T_S − T_R)

Maps are used to assess satellite SST globally "at a glance".
MetOp-A minus OSTIA, NESDIS ACSPO 1 km FRAC, 17 Jan 2010, night

Histograms of ΔT_S (T_S − T_R); reference SST: in situ (from Cal/Val system)

MUT satellite SST minus quality-controlled in situ SST (iQuam)**, night, Mar 2009
30 days of data: ~7,000 match-ups with in situ SST
Median = … K; robust STD = 0.27 K
** Quality-controlled monthly in situ data available at:

Histograms of ΔT_S (T_S − T_R); reference SST: OSTIA (from MUT SQUAM system)

MUT satellite night SST minus OSTIA, 11 May to 20 May
10 days of data: ~500,000 match-ups with OSTIA
Median = 0.00 K; robust STD = 0.30 K
More at the SQUAM web page
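The "Median" and "Robust STD" quoted on these histogram slides are outlier-resistant statistics. The slides do not state the exact estimator, so as an assumption this sketch uses a common choice: the median together with the Gaussian-scaled median absolute deviation (MAD):

```python
import numpy as np

def robust_stats(dts):
    """Median and robust STD of SST differences.

    Robust STD is estimated as 1.4826 * MAD (median absolute deviation),
    the scaling that makes MAD consistent with the Gaussian sigma.
    This estimator is an assumption; SQUAM's exact choice may differ.
    """
    x = np.asarray(dts, dtype=float)
    x = x[~np.isnan(x)]                        # drop invalid match-ups
    med = np.median(x)
    rstd = 1.4826 * np.median(np.abs(x - med))
    return med, rstd

# A single cloud-leakage outlier (5.0 K) barely moves the robust numbers,
# whereas it would inflate an ordinary standard deviation.
med, rstd = robust_stats([0.1, -0.1, 0.0, 0.2, -0.2, 5.0])
```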

Time series (NESDIS MUT SST); reference SST: in situ SST

MUT satellite SST minus quality-controlled in situ SST (iQuam), night
1 data point = 1 month of match-ups with in situ data
Median bias within ~0.1 K (except NOAA-16, due to sensor problems)

Time series (NESDIS MUT SST); reference SST: daily Reynolds

MUT satellite night SST minus daily Reynolds
1 data point = 1 week of match-ups
Patterns are reproducible yet crisper (finer temporal resolution)
Short-term noise in the time series reflects artifacts in the Reynolds SST
More at the SQUAM web page

Artificial dependencies: view zenith angle

Such "retrieval-space" dependent biases are difficult to uncover and quantify using customary validation against in situ data, which do not fully cover the retrieval space. The SQUAM diagnostics helped uncover a bug in the MUT SST that was causing an across-swath bias of >0.7 K. After correction, the bias was reduced to ~0.2 K and is symmetric with respect to nadir.

Time series of zonal dependences (MetOp-A FRAC minus OSTIA: O&SI SAF, ACSPO)

Time-series plots of mean ΔT_S are used to detect persistent areas of cloud contamination. More combinations are available on the SQUAM web page.
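The zonal-dependence panels reduce ΔT_S to a mean per latitude band at each time step. A sketch of one such step (the 10° band width and all numbers are assumptions for illustration):

```python
import numpy as np

# Match-up latitudes (deg) and their deltaT_S values (K); illustrative only.
lat = np.array([-55.0, -52.0, 3.0, 7.0, 48.0, 51.0])
dts = np.array([-0.4, -0.6, 0.1, -0.1, 0.3, 0.5])

edges = np.arange(-90, 91, 10)              # 10-degree latitude bands
idx = np.digitize(lat, edges) - 1           # band index of each match-up
zonal_mean = np.full(len(edges) - 1, np.nan)
for b in np.unique(idx):
    zonal_mean[b] = dts[idx == b].mean()    # mean deltaT_S per band
```

Stacking such zonal-mean vectors over time yields the Hovmöller-style panels shown on the slide, where persistent cloud contamination appears as a latitude band with a lasting negative bias.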

Summary and Future Work

 SQUAM currently monitors:
  - three NESDIS AVHRR SSTs: MUT GAC (NOAA-16 to NOAA-19, MetOp-A), ACSPO FRAC, and ACSPO GAC
  - O&SI SAF FRAC (together with NESDIS FRAC)
  - NAVOCEANO SEATEMP GAC (NOAA-14 to NOAA-19, MetOp-A)
  - inter-comparison of six daily Level-4 (L4) products
 Products show a high degree of cross-platform and day-night consistency, but there is room for improvement
 Validation with quality-controlled buoy data from the In situ Quality Monitor (iQuam) is ongoing (more results will be shown next time)
 Future plans:
  - reconcile NESDIS AVHRR SSTs (different platforms, day-night): improve AVHRR sensor calibration (NESDIS); adjust T_REF for the diurnal cycle (e.g., Gentemann model); improve NESDIS SST products (cloud screening, SST algorithms)
  - include more L4 SSTs in the L4-SQUAM prototype (K10, GHRSST GMPE); work with L4 producers to reconcile the different L4 and satellite SSTs

THANK YOU!

BACK-UP SLIDES: L4 intercomparisons in SQUAM; using double differences for cross-platform consistency

L4 comparisons

L4 intercomparisons in SQUAM

"The best" L4 SST is not easy to identify; for diagnostics and monitoring, any L4 may be used. Currently, the L4 SSTs show large differences, especially at high latitudes. This should be coordinated among developers and resolved.
Reynolds minus OSTIA; L4 SST minus RTG (low resolution), mean
More L4 analyses at the SQUAM web page

Cross-Platform Consistency Using Double Differences (T_SAT − T_SAT_REF)

 Cross-platform consistency of T_SAT can be evaluated from time series of T_SAT − T_REF overlaid for different platforms
 For more quantitative analyses, one "reference" platform can be selected and its (T_SAT − T_REF) subtracted from all others
 NOAA-17 was selected as the reference because it is available for the full SQUAM period and its AVHRR is stable
 Double differences (DD) were calculated as DD = (T_SAT − T_REF) − (T_N17 − T_REF) for SAT = NOAA-16, NOAA-18, and MetOp-A
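The double-difference algebra on this slide can be sketched directly: subtracting the reference platform's (T_SAT − T_REF) series cancels the common L4 reference T_REF, leaving a platform-vs-platform comparison. The bias numbers below are made up for illustration; only the formula follows the slide:

```python
import numpy as np

# Monthly (T_SAT - T_REF) biases (K) per platform against the same L4
# reference; values are illustrative, not SQUAM results.
bias = {
    "N16":     np.array([0.25, 0.30, 0.28]),
    "N17":     np.array([0.05, 0.07, 0.06]),   # transfer standard
    "N18":     np.array([0.10, 0.12, 0.11]),
    "MetOp-A": np.array([-0.02, 0.00, 0.01]),
}

# DD = (T_SAT - T_REF) - (T_N17 - T_REF): T_REF cancels, leaving a
# direct N17-relative comparison for each platform.
dd = {sat: b - bias["N17"] for sat, b in bias.items() if sat != "N17"}
```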

Cross-Platform Consistency in MUT Using Double Differences (T_SAT − T_SAT_REF)

Quantitative evaluation of cross-platform consistency uses double differences (DD): DD = (T_SAT − T_REF) − (T_N17 − T_REF). The choice of the third transfer standard is not critical. More analyses at the SQUAM web page.