
Slide 1 (4 June 2009, GHRSST-X STM)
The SST Quality Monitor (SQUAM)
10th GHRSST Science Team Meeting, 1-5 June 2009, Santa Rosa, CA
Alexander "Sasha" Ignatov*, Prasanjit Dash*, John Sapper**, Yury Kihai*
NOAA/NESDIS
* Center for Satellite Applications & Research (STAR)
** Office of Satellite Data Processing & Distribution (OSDPD)

Slide 2: Objective of the SST Quality Monitor (SQUAM)
NESDIS operational AVHRR SST products:
- Heritage Main Unit Task (MUT), in operation through the present (McClain et al., 1985; Walton et al., 1998)
- New Advanced Clear-Sky Processor for Oceans (ACSPO), May 2008 to present
Employ L4 SSTs (Reynolds, RTG, OSTIA, ODYSSEA, ...) to:
- Evaluate MUT and ACSPO SST products in near-real time for self-, cross-platform, and cross-product consistency
- Identify product anomalies and help diagnose their causes (e.g., sensor malfunction, cloud mask, or SST algorithm)

Slide 3
Customarily, satellite SSTs are validated against in situ SSTs. However, in situ SSTs have limitations:
- They are sparse and geographically biased (they cover the retrieval domain incompletely and non-uniformly).
- Their quality is non-uniform and suboptimal (often comparable to, or worse than, that of satellite SSTs).
- They are not available in near-real time in sufficient numbers to cover the full geographical domain and retrieval space.

Slide 4: AVHRR SST, MetOp-A GAC, 3 January 2008 (daytime)
[Maps: heritage MUT SST product vs. ACSPO SST product]
SST imagery is often inspected visually for quality and artifacts. The large-scale SST background dominates, making it difficult to discern "signal" from "noise".

Slide 5: Removing the large-scale SST background (daily 0.25° Reynolds) emphasizes "noise"
[Maps of SST deviations: heritage MUT SST product vs. ACSPO SST product]
Mapping deviations from a global reference field constrains the SST "signal" and emphasizes "noise". This helps reveal artifacts in the SST product (e.g., cold stripes at swath edges).

Slide 6: View-angle dependence of "MUT minus daily Reynolds SST" (NOAA-17)
Such retrieval-space-dependent biases are difficult to uncover and quantify with customary validation against in situ data, which do not fully cover the retrieval space. The SQUAM diagnostics helped uncover a bug in the MUT SST that was causing an across-swath bias >0.7 K. After correction, the bias was reduced to ~0.2 K and is symmetric with respect to nadir.
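The across-swath diagnostic on this slide amounts to binning the residual dT = T_SAT - T_REF by view zenith angle and tracking the per-bin median. A minimal sketch with synthetic data (the function name and the injected swath-edge artifact are illustrative, not actual SQUAM code):

```python
import numpy as np

def bin_bias_by_view_angle(delta_t, vza, bin_width=5.0):
    """Median of SST residuals (T_sat - T_ref) per view-zenith-angle bin.

    delta_t : residuals in K; vza : signed view zenith angle in degrees
    (negative on one side of nadir). Returns bin centers and medians.
    """
    edges = np.arange(-70.0, 70.0 + bin_width, bin_width)
    centers = 0.5 * (edges[:-1] + edges[1:])
    medians = np.full(centers.size, np.nan)
    idx = np.digitize(vza, edges) - 1
    for i in range(centers.size):
        in_bin = delta_t[idx == i]
        if in_bin.size:
            medians[i] = np.median(in_bin)
    return centers, medians

# Synthetic demo: well-behaved residuals plus a warm swath-edge artifact.
rng = np.random.default_rng(0)
vza = rng.uniform(-68.0, 68.0, 100_000)
delta_t = rng.normal(0.0, 0.4, vza.size) + 0.7 * (vza > 55)
centers, medians = bin_bias_by_view_angle(delta_t, vza)
```

Plotting `medians` against `centers` makes an across-swath bias of this magnitude stand out immediately, while it is invisible in a single global mean.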

Slide 7: Use global L4 SST products to quantitatively evaluate satellite SST
Satellite and reference SSTs are subject to near-Gaussian errors:
T_SAT = T_TRUE + eps_SAT; eps_SAT ~ N(mu_sat, sigma_sat^2)
T_REF = T_TRUE + eps_REF; eps_REF ~ N(mu_ref, sigma_ref^2)
where the mu's and sigma's are the global means and standard deviations of the eps's.
The residual is distributed near-normally:
dT = T_SAT - T_REF = eps_SAT - eps_REF; eps_dT ~ N(mu_dT, sigma_dT^2)
where mu_dT = mu_sat - mu_ref and sigma_dT^2 = sigma_sat^2 + sigma_ref^2 (if eps_SAT and eps_REF are independent).
If T_REF = T_in situ, this is customary "validation". If (mu_ref, sigma_ref) are comparable to (mu_in situ, sigma_in situ), and if eps_SAT and eps_REF are not too strongly correlated, then T_REF can be used to monitor T_SAT.
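The error model above can be checked numerically: with independent Gaussian errors on a common true field, the residual mean should equal mu_sat - mu_ref and its standard deviation should add in quadrature. A Monte Carlo sketch (the error magnitudes are illustrative assumptions, not SQUAM values):

```python
import numpy as np

# T_SAT = T_TRUE + eps_sat, T_REF = T_TRUE + eps_ref, with independent
# Gaussian errors; then mean(dT) = mu_sat - mu_ref and
# Var(dT) = sigma_sat^2 + sigma_ref^2. Numbers below are illustrative.
rng = np.random.default_rng(42)
n = 1_000_000
t_true = rng.uniform(271.0, 303.0, n)        # background SST field, K
mu_sat, sigma_sat = 0.05, 0.35               # assumed satellite error, K
mu_ref, sigma_ref = 0.00, 0.30               # assumed reference error, K
t_sat = t_true + rng.normal(mu_sat, sigma_sat, n)
t_ref = t_true + rng.normal(mu_ref, sigma_ref, n)

dt = t_sat - t_ref
expected_sigma = np.hypot(sigma_sat, sigma_ref)   # quadrature sum
print(dt.mean(), dt.std(), expected_sigma)
```

Note that the true SST field cancels exactly in the difference, which is why a noisy-but-unbiased L4 field can still serve as a useful monitoring reference.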

Slide 8: Global histograms of T_SAT - T_REF (nighttime MUT)

Slide 9: Histogram of SST residual. Reference SST: in situ
30 days of data: ~6,500 match-ups with in situ SST
Median = K; robust standard deviation = 0.27 K

Slide 10: Histogram of SST residual. Reference SST: OSTIA
8 days of data: ~483,500 match-ups with OSTIA SST
Median = 0.00 K; robust standard deviation = 0.30 K

Slide 11: Histogram of SST residual. Reference SST: daily Reynolds
8 days of data: ~483,700 match-ups with daily Reynolds SST
Median = K; robust standard deviation = 0.44 K

Slide 12: Observations from global histogram analyses
- Global histograms of T_SAT - T_REF are close to Gaussian against all T_REF, including T_in situ.
- A normal distribution is characterized by location (median) and scale (robust standard deviation, RSD).
- The number and magnitude of outliers are reduced with respect to L4 T_REF compared to T_in situ.
- For some T_REF (e.g., OSTIA), validation statistics are closer to those against T_in situ than for others (e.g., Reynolds).
More histograms (ACSPO/MUT, day/night, other platforms and reference SSTs) are available on the SQUAM page.
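The median/RSD statistics quoted on these slides can be computed from the median absolute deviation (MAD); a sketch with synthetic residuals (the 1.4826 factor is the standard constant that makes the MAD consistent with sigma for a Gaussian; the contamination fraction below is invented for illustration):

```python
import numpy as np

def robust_stats(residuals):
    """Location and scale of near-Gaussian residuals, resistant to outliers.

    The median estimates location; RSD = 1.4826 * MAD estimates the
    standard deviation of the Gaussian core.
    """
    residuals = np.asarray(residuals, dtype=float)
    med = np.median(residuals)
    rsd = 1.4826 * np.median(np.abs(residuals - med))
    return med, rsd

# Gaussian core with 2% gross cold outliers (e.g., residual cloud leakage):
rng = np.random.default_rng(1)
core = rng.normal(0.0, 0.30, 98_000)
outliers = rng.normal(-5.0, 2.0, 2_000)
dt = np.concatenate([core, outliers])

med, rsd = robust_stats(dt)   # med stays near 0 K, rsd near 0.30 K
print(med, rsd, dt.std())     # the plain std is inflated by the outliers
```

This is why the slides report median and RSD rather than mean and standard deviation: a small fraction of cloud-contaminated match-ups would otherwise dominate the scale estimate.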

Slide 13: Time series of global median biases of (T_SAT - T_REF)

Slide 14: Global median biases, T_SAT - T_in situ
- 1 data point = 1 month of match-ups with in situ data
- Median bias is within ~0.1 K (except for N16: sensor problems)
- MetOp-A and N17 fly close orbits but show a cross-platform bias of ~0.1 K

Slide 15: Global median biases, T_SAT - T_OSTIA
- 1 data point = 1 week of match-ups with OSTIA SST
- Patterns are reproducible yet crisper (finer temporal resolution)
- Cross-platform biases differ slightly from validation (diurnal cycle)
- OSTIA artifacts are observed in the early period

Slide 16: Global median biases, T_SAT - T_Reynolds
- 1 data point = 1 week of match-ups with Reynolds SST
- Patterns are reproducible but noisier than with respect to OSTIA
- Artifacts are also observed, but they differ from OSTIA's

Slide 17: Observations from time series of global biases
- The number of match-ups is more than two orders of magnitude larger against L4 T_REF than against T_in situ.
- Major trends and anomalies in T_SAT are captured well against all T_REF, and in more detail and crisper than against T_in situ.
- Some T_REF are "noisier" for validation purposes than others. Different artifacts are seen in different T_REF.
- Nevertheless, time series of (T_SAT - T_REF) can be used to monitor T_SAT for cross-platform and cross-product consistency.
More time series (ACSPO/MUT, other reference SSTs) are available from the SQUAM page.

Slide 18: Cross-platform consistency using double differences (T_SAT - T_SAT_REF)
- Cross-platform consistency of T_SAT can be evaluated from time series of T_SAT - T_REF overlaid for different platforms.
- For more quantitative analyses, one "reference" platform can be selected and subtracted from all other (T_SAT - T_REF).
- N17 was selected as the "reference" because it is available for the full SQUAM period and its AVHRR is stable.
- Double differences (DD) were calculated as DD = (T_SAT - T_REF) - (T_N17 - T_REF) for SAT = N16, N18, and MetOp-A.
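The DD recipe above is a simple per-time-step subtraction in which the common T_REF term cancels; a sketch (the weekly bias series below are invented for illustration, not real SQUAM numbers):

```python
import numpy as np

def double_difference(bias_sat, bias_ref_platform):
    """DD = (T_SAT - T_REF) - (T_N17 - T_REF), per time step.

    bias_sat, bias_ref_platform: median-bias time series of two platforms
    against the same L4 reference field. The shared T_REF term, with its
    biases and noise, cancels in the difference.
    """
    return np.asarray(bias_sat) - np.asarray(bias_ref_platform)

# Illustrative weekly median biases (K) vs. some T_REF:
bias_n17 = np.array([0.05, 0.04, 0.06, 0.05])   # 'reference' platform
bias_n18 = np.array([0.15, 0.14, 0.17, 0.15])
dd_n18 = double_difference(bias_n18, bias_n17)   # ~ +0.10 K relative bias
print(dd_n18)
```

Because the reference field cancels, DDs computed against different T_REF (in situ, OSTIA, Reynolds) should largely agree, which is exactly what slides 20-22 show.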

Slide 19: Global median biases, T_SAT - T_in situ (same as slide 14)

Slide 20: In situ double differences, (T_SAT - T_in situ) - (T_N17 - T_in situ)
- Biases are due to errors in T_SAT and to T_SAT/T_in situ skin/bulk differences
- Before mid-2006, all SSTs agree to within ~0.01 K
- In 2006, N16 develops a low bias of up to ~-0.7 K, while N18 and MetOp-A develop a warm bias of up to ~+0.1 K

Slide 21: OSTIA double differences, (T_SAT - T_OSTIA) - (T_N17 - T_OSTIA)
- DDs with respect to global reference fields reflect errors in T_SAT plus the diurnal signal missing in T_REF (T_REF do not resolve the diurnal cycle)
- N16: sensor problems; MetOp-A: suboptimal regression coefficients
- A diurnal correction to T_REF is needed to rectify inconsistencies in T_SAT

Slide 22: Reynolds double differences, (T_SAT - T_Reynolds) - (T_N17 - T_Reynolds)
DDs are consistent for different T_REF (biases and noise in T_REF largely cancel out when calculating DDs)

Slide 23: Observations from satellite-to-satellite double differences
- In situ DDs are close to the "true" cross-platform bias in T_SAT (bulk T_in situ partially accounts for the diurnal cycle in skin T_SAT).
- DDs with respect to global T_REF additionally include a diurnal signal (current L4 T_REF do not resolve the diurnal cycle).
- Employing diurnal-cycle-resolving T_REF in the DDs (or adding a diurnal correction on top of existing T_REF) should recover the "true" cross-platform inconsistency in T_SAT.
- The DDs provide a quick global "validation" of the diurnal cycle model (e.g., Gentemann et al., 2003; Kennedy et al., 2007; Filipiak and Merchant, 2009).

Slide 24: Day-night consistency using double differences, T_DAY - T_NIGHT
Day-night consistency of T_SAT can be evaluated as DD = (T_DAY - T_REF) - (T_NIGHT - T_REF).

Slide 25: In situ day-night double differences, (T_DAY - T_in situ) - (T_NIGHT - T_in situ)
- During daytime, all platforms show a warmer ~+(0.1±0.1) K bias (except for N16: sensor problem)
- Seasonal structure is seen in the DDs
- Skin T_SAT and bulk T_in situ capture the diurnal cycle differently

Slide 26: OSTIA day-night double differences, (T_DAY - T_OSTIA) - (T_NIGHT - T_OSTIA)
- Day-night DDs with respect to OSTIA show biases due to diurnal warming
- Seasonal variability is seen in all DDs
- For N17 and MetOp-A (~10 am/pm overpasses), the diurnal signal is +(0.1±0.1) K
- For N18 (~2 am/pm overpasses), the diurnal signal is +(0.3±0.1) K

Slide 27: Reynolds day-night double differences, (T_DAY - T_Reynolds) - (T_NIGHT - T_Reynolds)
DDs are closely reproducible for all T_REF (biases and noise in T_REF largely cancel out when calculating DDs)

Slide 28: Observations from day-night double differences
- DDs with respect to in situ data more closely represent cross-platform inconsistencies in T_SAT, with less of a diurnal difference.
- If a global T_REF is used, the DDs additionally include a diurnal signal (currently, T_REF do not resolve the diurnal cycle).
- Employing diurnal-cycle-resolving T_REF in the DDs is expected to improve cross-platform consistency.
- The DDs provide a quick global "validation" of the diurnal cycle model (e.g., Gentemann et al., 2003; Kennedy et al., 2007; Filipiak and Merchant, 2009).

Slide 29: Summary and future work
- Validation against global reference fields is currently employed in SQUAM to monitor two NESDIS operational AVHRR SST products in near-real time.
- It helps quickly uncover SST product anomalies, diagnose their root causes (SST algorithm, cloud mask, or sensor performance), and leads to corrections.
- Work is underway to reconcile AVHRR and reference SSTs:
  - Improve AVHRR sensor calibration
  - Adjust T_REF for the diurnal cycle (e.g., Kennedy et al., 2007)
  - Improve the SST product (cloud screening, SST algorithms)
  - Provide feedback to T_REF producers
  The objective is to have a single "benchmark" SST in the NPOESS era.
- Add NOAA-19, and eventually MetOp-B, -C and VIIRS, to SQUAM.
- We are open to integration with GHRSST and to collaboration (to test other satellite and reference SSTs, diurnal corrections, ...).

Slide 30: NESDIS NRT SST analyses on the web
- SQUAM page: real-time maps, histograms, time series (including double differences), and dependencies
- CALVAL page: Cal/Val of MUT and ACSPO data against in situ SST (currently password-protected, but will be opened in 2-3 months)
- MICROS page (Monitoring of IR Clear-sky Radiances over Oceans for SST): validation of SST radiances against RTM calculations with Reynolds SST and NCEP GFS input