Towards Community Consensus SSTs and Clear-Sky Radiances from AVHRR
SST Science Team Meeting, 8-10 November 2010


Slide 1: Towards Community Consensus SSTs and Clear-Sky Radiances from AVHRR
SST Science Team Meeting, 8-10 November 2010, Seattle, WA
Sasha Ignatov (NOAA/NESDIS); Prasanjit Dash, Xingming Liang, Feng Xu (NOAA/NESDIS and CSU/CIRA)

Slide 2: Contributions

* AVHRR Level 2/3 SST products
  - J. Sapper, Y. Kihai, B. Petrenko, J. Stroup: ACSPO (GAC: 5 platforms; FRAC: MetOp-A)
  - P. LeBorgne: O&SI SAF MetOp-A FRAC
  - D. May, B. McKenzie: NAVO SEATEMP
  - K. Casey, T. Brandon, R. Evans, J. Vazquez, E. Armstrong: Pathfinder v5.0
* Level 4 SST products (additional L4 SSTs are being tested)
  - R. Grumbine, Xu Li, B. Katz: RTG (Low-Res & Hi-Res), GSI
  - R. Reynolds: OISST (AVHRR & AVHRR+AMSR-E)
  - M. Martin: OSTIA foundation, GHRSST Median Product Ensemble
  - D. May, B. McKenzie: NAVO K10
  - E. Autret, J.-F. Piollé: ODYSSEA
  - E. Maturi, A. Harris, W. Meng: POES-GOES blended
  - B. Brasnett: Canadian Met. Centre 0.2° foundation
* AVHRR Radiances
  - C. Cao, X. Wu, J. Mittaz, A. Harris, A. Heidinger, L. Wang: AVHRR calibration
  - C. Cao, T. Hewison, M. König: GSICS (Global Space-based Inter-Calibration System)
  - Y. Han, M. Liu, Y. Chen, P. Van Delst, D. Groff, F. Weng: CRTM

Slide 3: SST data come from various sources, sensors, and processing algorithms

L4:
* Reynolds (AVHRR; +AMSR-E)
* RTG (Low, High Resolution), GSI
* OSTIA (UKMO)
* ODYSSEA (France)
* GMPE (GHRSST)
* NAVO K10
* NESDIS POES-GOES Blended
* JPL G1SST
* CMC 0.2° (Canada)
* GAMSSA (Australia)
* JAXA (Japan)
* RSS (MW, MW+MODIS)

In situ:
* Sources: GTS, ICOADS, GODAE/FNMOC
* Platforms: drifters, moorings, ships, ARGO floats
* Quality control: may be unavailable or non-uniform

L2/L3:
* POES
  - AVHRR (NESDIS, NAVO, O&SI SAF, U. Miami, NODC)
  - MODIS, ATSR, VIIRS, ...
  - Microwave
* GOES
  - GOES (NESDIS, NAVO, O&SI SAF)
  - SEVIRI (NESDIS, O&SI SAF)
  - MTSAT (NESDIS, JAXA)

Are these products self-consistent? Cross-consistent? Processed using community consensus algorithms?

Slide 4: Initial Focus: AVHRR

* Past: Climate Data Record
  - Pathfinder Ocean: 1981-present
* Present: Initial Joint Polar System (IJPS), a NOAA/EUMETSAT cooperation
  - NOAA: GAC (4 km), NOAA-18 & -19 (2 pm/am) (NOAA-16 and -17 are also in orbit but degraded)
  - EUMETSAT: FRAC (1 km), 3 mid-morning birds (9:30 am/pm): Metop-A (19 Oct 2006), -B (2012), -C (2016)
* Future: Joint Polar Satellite System (JPSS)
  - Europe: AVHRR/3 onboard Metop-B (~2012) & -C (2016); AVHRR will serve the data through at least
  - USA: Visible Infrared Imaging Radiometer Suite (VIIRS) on NPP & NPOESS; need to ensure VIIRS/AVHRR synergy

Slide 5: Operational POES SST at NESDIS

* Heritage: Main Unit Task (MUT) (MCSST: McClain et al., 1985; NLSST: Walton et al., 1998)
  - Re-hosted to NAVO ("Shared Processing Agreement"). Robust end-to-end system.
  - No fundamental redesign since 1981: data sub-sampling (e.g., 2×2); no RTM; no reprocessing capability.
* New: Advanced Clear-Sky Processor for Oceans (ACSPO)
  - Development started in late 2005 (IJPS); operational since May 2008.
  - Processes all AVHRR pixels (GAC, FRAC). RTM & reprocessing capability.
* Joint Polar Satellite System (JPSS)
  - Generate AVHRR-like ACSPO products from VIIRS radiances: fall-back for NPOESS SST, benchmark to measure VIIRS improvements, smooth transition for SST users.
  - Cross-evaluate against VIIRS SST generated by the NPOESS contractor (contractor's cloud mask + VIIRS instrument + heritage NLSST algorithm).

Slide 6: Current AVHRR SST Products

GAC (~4 km, Global Area Coverage):
* NESDIS ACSPO (new); MUT (heritage)
* NAVO SEATEMP
* U. Miami + NODC Pathfinder Ocean (L3)

MetOp FRAC (~1 km, Full Resolution Area Coverage):
* NESDIS ACSPO (new)
* EUMETSAT O&SI SAF

Cross-evaluate ACSPO against heritage MUT SST, and against other available L2 AVHRR SST products.

Slide 7: L2 vs. in situ: Customary Validation

Slide 8: MUT SST versus in situ SST (no QC applied to in situ SST)
(Panels: Global Mean "SAT - In Situ"; Global Std Dev "SAT - In Situ")
Outliers in in situ data critically affect the validation statistics.

Slide 9: MUT SST versus in situ SST (after QC of match-up data)
(Panels: Global Mean "SAT - In Situ"; Global Std Dev "SAT - In Situ")
QC improves both moments (Mean, Std Dev); most dramatically, it affects the Std Dev. Absolute values are sensitive to QC.
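The QC effect summarized on this slide can be illustrated with a simple robust screen. The sketch below is not the actual iQuam/UK Met Office QC; the MAD-based filter, the 4-sigma cutoff, and the synthetic match-ups are all illustrative assumptions.

```python
import numpy as np

def qc_matchups(dt, k=4.0):
    """Keep match-up differences within k robust sigmas of the median.
    RSD = 1.4826 * MAD approximates the Std Dev for Gaussian data
    while remaining resistant to outliers."""
    med = np.median(dt)
    rsd = 1.4826 * np.median(np.abs(dt - med))
    return np.abs(dt - med) <= k * rsd

# Synthetic "SAT - in situ" match-ups: a 0.2 K warm bias with 0.5 K noise,
# plus 2% gross outliers mimicking bad in situ reports
rng = np.random.default_rng(1)
dt = np.concatenate([rng.normal(0.2, 0.5, 980), rng.normal(0.0, 8.0, 20)])

keep = qc_matchups(dt)
print(f"before QC: mean={dt.mean():+.3f} K, std={dt.std():.3f} K")
print(f"after  QC: mean={dt[keep].mean():+.3f} K, std={dt[keep].std():.3f} K")
```

Because the median and MAD are insensitive to the gross outliers, the filter removes them without being skewed by them; the post-QC Std Dev drops far more than the mean shifts, matching the slide's observation that QC affects the Std Dev most dramatically.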

Slide 10: QC & Monitoring of in situ SST: Need for community consensus

* GHRSST coordinates efforts towards community consensus
  - Quality control of in situ SST
  - Match-up procedures and practices
* NESDIS contribution
  - Implement in situ QC consistent with the UK Met Office
  - Set up a near-real time, web-based in situ SST Quality Monitor (iQuam)
  - Monitor QC'ed data against global L4 SST (currently, daily Reynolds)
  - Serve QC'ed in situ data to the SST community in near-real time
* Work in progress
  - Validate L2/3/4 products against in situ SSTs using consistent QC and match-up procedures
  - Evaluate match-up criteria and iQuam QC through sensitivity studies; adjust as needed; seek community consensus
  - Add ARGO floats to iQuam

Slide 11: iQuam webpage

Slide 12: L2/3 vs. L4: Validation against global L4

Slide 13: Validation against global L4 SSTs was first considered because in situ SSTs have limitations

* Non-uniform / suboptimal quality (may be worse than that of satellite SSTs)
* Sparse and geographically biased; do not cover the retrieval domain fully and uniformly
* Not available in sufficient numbers in near-real time (limited statistics); diagnostics are delayed

Using L4 products provides a global snapshot (map) of T_SAT performance in near-real time, with uniform global coverage. QC and smoothing are done at the L4 production stage, so errors of L4 SST are more uniform in space and time than those of in situ data.

Slide 14: ΔT_S = T_SAT - T_REF is mapped to identify issues in the data, in near-real time (e.g., cold biases likely due to residual cloud)
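A minimal sketch of this ΔT_S check; the -1 K cold-bias threshold and the toy scene are illustrative choices, not the operational SQUAM settings.

```python
import numpy as np

def delta_t_map(t_sat, t_ref, cold_thresh=-1.0):
    """Return dT = T_SAT - T_REF and a flag for strongly negative
    differences, which often indicate residual cloud in the SST field."""
    dts = t_sat - t_ref
    return dts, dts < cold_thresh

# Tiny synthetic scene (K): one pixel contaminated by cloud (too cold)
t_ref = np.full((2, 3), 293.0)
t_sat = t_ref + np.array([[0.1, -0.2, -3.5],
                          [0.0,  0.3, -0.1]])

dts, cloudy = delta_t_map(t_sat, t_ref)
print(np.argwhere(cloudy))  # -> [[0 2]]
```

Mapping `dts` rather than thresholding it is what the slide describes; the flag here just shows how a cold-bias screen would pick out the suspect pixel.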

Slide 15: O&SI SAF SST minus OSTIA (Metop-A FRAC, 16-Oct-2010; night and day maps)

Slide 16: NESDIS ACSPO SST minus OSTIA (Metop-A FRAC, 16-Oct-2010; night and day maps)

Slide 17: Statistics of ΔT_S for O&SI SAF and ACSPO (Metop-A FRAC minus OSTIA)
(Panels, night and day: Number of Observations (×1.0e7); Median Difference; Robust Standard Deviation)

* O&SI SAF: ~40 M clear pixels/night and ~40 M/day.
* At night, ACSPO produces ~15% more clear-sky ocean pixels than O&SI SAF; global median biases and Robust Std Dev (RSD) wrt. OSTIA are comparable.
* During daytime, the number of clear-sky ocean pixels is comparable between ACSPO and O&SI SAF; global median biases are comparable, and RSD is slightly smaller for ACSPO.

Slide 18: Pathfinder v5.0 (Day) - Reynolds SST
Reynolds OISST uses Pathfinder as input; still, the two products may differ by several tenths of a kelvin.

Slide 19: "Pathfinder v5.0 (Day) - OISST" vs. Wind Speed
(Axes: Year; Wind Speed (m s-1); stratified by platform)
NOAA-17 is a mid-morning platform: diurnal warming is suppressed.
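The wind-speed stratification behind this figure can be sketched as a simple binned average. The bin edges and the synthetic warming signal below are assumptions for illustration, not the deck's actual data.

```python
import numpy as np

def mean_by_wind_bin(dsst, wind, edges):
    """Average SST differences within wind-speed bins defined by edges.
    Daytime diurnal warming typically appears as positive differences
    at low wind speeds and vanishes at high winds."""
    idx = np.digitize(wind, edges)  # bins 1..len(edges)-1 inside range
    return np.array([dsst[idx == i].mean() if np.any(idx == i) else np.nan
                     for i in range(1, len(edges))])

rng = np.random.default_rng(2)
wind = rng.uniform(0.0, 12.0, 5000)
# Synthetic daytime signal: ~0.5 K warming at calm winds, decaying with wind
dsst = 0.5 * np.exp(-wind / 3.0) + rng.normal(0.0, 0.1, wind.size)

means = mean_by_wind_bin(dsst, wind, edges=np.array([0.0, 2.0, 6.0, 12.0]))
print(means)  # binned means decrease with wind speed
```

For a mid-morning platform such as NOAA-17, the low-wind bins would show little of this dependence, which is the point the slide makes.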

Slide 20: L2-L4 statistics for various L2s (L4 = OSTIA)+

Avg. number of retrievals/24 hr (Day / Night); avg. Median Diff. (K) and avg. Std Dev (K; robust, conventional) are tabulated per system on the slide:
* NESDIS MUT: 73,000 / 54,
* NAVO SEATEMP: 163, /
* ACSPO GAC: 2,500,000 / 2,200,
* Pathfinder v5 (L3P): 2,700,000 (N_L2: 6,060,000) / 1,843,000 (N_L2: 4,100,000)
* ACSPO FRAC: 42,500,000 / 44,500,
* O&SI SAF FRAC: 42,900,000 / 37,000,

+ Average NOAA-18 is shown. Outliers not removed.

Slide 21: L2-L4 statistics for various L2s (L4 = Reynolds "AVHRR")+

Avg. number of retrievals/24 hr (Day / Night); avg. Median Diff. (K) and avg. Std Dev (K; robust, conventional) are tabulated per system on the slide:
* NESDIS MUT: 73,000 / 54,
* NAVO SEATEMP: 163, / (0.39**)
* ACSPO GAC: 2,500,000 / 2,200,
* Pathfinder v5 (L3P): 2,700,000 (N_L2: 6,060,000) / 1,843,000 (N_L2: 4,100,000)
* ACSPO FRAC: 42,500,000 / 44,500,
* O&SI SAF FRAC: 42,900,000 / 37,000,

+ Average NOAA-18 is shown. Outliers not removed.
** NB: the 2006-present OISST uses NAVO SEATEMP as input.

Slide 22: Observations from validation against global L4 SSTs

* The choice of L4 affects the Mean and Std Dev statistics of L2-L4: absolute values are L4-specific.
* However, the relative performance of L2 products can be evaluated using (L2-L4) analyses with any L4.
* Main advantage over validation against in situ data: product performance is monitored in near-real time, using a global instantaneous view.

Work in progress:
* Add MODIS and VIIRS SSTs to SQUAM.
* Sensitivity to the reference L4 calls for "L4 vs. L4" evaluation.
* Although beyond our initial plans, our NCEP RTG colleagues suggested adding L4 comparisons to the SST Quality Monitor.

Slide 23: L4 vs. L4: Cross-evaluation of various L4s

Slide 24: Contribution to the GHRSST Inter-Comparison Technical Advisory Group

Slide 25: "OSTIA - GMPE" mean zonal difference (axes: Year vs. Latitude)
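A zonal-mean difference field like "OSTIA - GMPE" can be computed roughly as below; the 1-degree toy grid and the two latitude bands are illustrative assumptions, not the actual SQUAM gridding.

```python
import numpy as np

def zonal_mean_diff(field_a, field_b, lats, band_edges):
    """Zonal (longitude) mean of field_a - field_b, then averaged into
    latitude bands; NaNs (e.g., land or ice) are ignored."""
    diff = field_a - field_b                 # shape (nlat, nlon)
    zonal = np.nanmean(diff, axis=1)         # mean over longitude
    idx = np.digitize(lats, band_edges)
    return np.array([np.nanmean(zonal[idx == i]) if np.any(idx == i) else np.nan
                     for i in range(1, len(band_edges))])

# Toy 1-degree global grid with a +0.2 K difference north of the equator
lats = np.arange(-89.5, 90.0, 1.0)
a = np.zeros((180, 360))
b = np.zeros((180, 360))
a[lats > 0, :] += 0.2

bands = zonal_mean_diff(a, b, lats, band_edges=np.array([-90.0, 0.0, 90.0]))
print(bands)  # southern band ~0.0, northern band ~0.2
```

Repeating this for each day and stacking the zonal means by time produces the Year-vs-Latitude picture the slide shows.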

Slide 26: "L4s - GMPE": time-series statistics
(Panels: Mean and Std Dev vs. Year, wrt GHRSST GMPE. More time-series at:)

Products:
* DOI_AV: Reynolds OISST (AVHRR)
* DOI_AA: Reynolds OISST (+ AMSR-E)
* RTG_HR: Real Time Global high resolution
* RTG_LR: RTG low resolution
* GOESPOES: Blended POES and GOES
* OSTIA: Operational SST and Sea Ice Analysis
* CMC 0.2: Canadian Met. Centre 0.2 degree
* NAVO K10: NAVOCEANO 1/10 degree
* ODYSSEA: MERSEA IFREMER/CERSAT*
* GMPE: GHRSST Median Ensemble Product

Roughly, the L4 products form 3 major groups:
1. DOI_AV, DOI_AA, RTG_LR, NAVO K10
2. RTG_HR, GOES-POES blended (with seasonal variation between RTG_HR and RTG_LR)
3. OSTIA, CMC, & GMPE

* ODYSSEA: production temporarily halted

Slide 27: Match-up statistics in SQUAM for various daily L4 products wrt. GMPE

Attributes: type/resolution; inputs; avg. Median Diff. (K); avg. Std Dev (K; robust*, conventional), tabulated on the slide:
* Reynolds (AVHRR): bulk, 0.25°; inputs: IR (Pathfinder till '05, then NAVO) & in situ
* Reynolds (+AMSR-E): bulk, 0.25°; inputs: AVHRR IR, AMSR-E MW, & in situ
* RTG low resolution: bulk, 0.5°; inputs: AVHRR IR & in situ
* NAVO K10: bulk, 0.1°; inputs: AVHRR, VISSR, AMSR-E, JPL cli
* POES-GOES blended: bulk, 0.1°; inputs: AVHRR, GOES
* RTG high resolution: bulk, 1/12°; inputs: AVHRR IR physical retrievals & in situ
* OSTIA: foundation, 0.05°; inputs: IR (AVHRR, AATSR, SEVIRI), MW (AMSR-E, TMI), SSMI ice, & in situ
* CMC: foundation, 0.2°; inputs: IR (AVHRR, AATSR), MW (AMSR-E), in situ, & CMC sea ice
* ODYSSEA: subskin, 0.1°; inputs: IR (AATSR, AVHRR, VISSR, SEVIRI), MW (AMSR-E, TMI)

* Robust parameters are resistant to outliers but may hide local issues; they should be used with additional diagnostics.

Slide 28: Conclusion to L4 vs. L4 Comparisons

* Currently, nine daily L4 products are continuously monitored in SQUAM wrt. each other.
* Foundation SST products (OSTIA, CMC) appear more stable in time, less noisy in space, and more consistent with satellite data.
* G1SST is being evaluated; RSS MW is in the pipeline.

Future plans:
* Validate all L4 SSTs against independently QC'ed in situ data (iQuam).
* Explore a DV model: this should reduce cross-platform biases and spurious noise in "L2-L4", and will help to globally validate the DV model.
* Cooperate with L4 producers to add missing L4s.

Slide 29: Radiances: Monitoring Clear-Sky Radiances

Slide 30: Monitoring of IR Clear-sky Radiances over Oceans for SST (MICROS)

* Web-based NRT tool to monitor the M-O bias
  - M (Model) = Community Radiative Transfer Model (CRTM), used in ACSPO to simulate TOA brightness temperatures
  - O (Observation) = AVHRR clear-sky BTs in Ch3B, 4 & 5
* Key objectives
  - Fully understand and reconcile CRTM and AVHRR BTs
  - Minimize cross-platform biases
* Users/Applications
  - Test and improve ACSPO products
  - Validate and improve CRTM performance
  - Contribute to sensor characterization and inter-calibration within the Global Space-based Inter-Calibration System (GSICS)

Slide 31: MICROS Overview
MICROS is an end-to-end system.

Slide 32: M-O Biases and Double Differences

* Warm M-O biases are due to the combined effect of an incomplete model (aerosols not included; bulk SST used instead of skin; daily mean Reynolds SST used to represent nighttime SST) and biased satellite sensor radiances (residual cloud).
* Double differences (DDs) cancel out many possible systematic errors in CRTM and its input (SST and GFS fields, missing aerosol, etc.).
* Non-zero DDs are mainly due to errors in sensor calibration and spectral response functions. The largest systematic errors are in N18 (Ch4) and N19 (all bands).
* NOAA-16 is unstable over the whole monitoring period.
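The double-difference logic can be sketched numerically: DD = (M-O)_sat - (M-O)_ref, so errors shared by both sensors' simulations cancel, leaving sensor-specific calibration/SRF effects. The 0.3 K shared model error and 0.15 K calibration bias below are made-up numbers, chosen only to show the cancellation.

```python
def double_difference(mo_sat: float, mo_ref: float) -> float:
    """DD = (M-O)_sat - (M-O)_ref. Systematic CRTM/input errors shared
    by both sensors cancel in the subtraction."""
    return mo_sat - mo_ref

# Illustrative M-O biases (K): a +0.3 K shared model error (e.g., missing
# aerosol) plus a sensor-specific calibration bias for sensor A only
shared_model_error = 0.3
cal_bias_a = 0.15

mo_a = shared_model_error + cal_bias_a   # M-O for sensor A
mo_ref = shared_model_error              # M-O for the reference sensor

dd = double_difference(mo_a, mo_ref)
print(f"DD = {dd:+.2f} K")  # the shared 0.3 K cancels, leaving +0.15 K
```

This is why a non-zero DD points at sensor calibration or spectral response functions rather than at the CRTM or its SST/GFS inputs.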

Slide 33: The effect of CRTM input on the M-O bias (Reynolds SST used as CRTM input)
(μ and σ: median and RSD over time)
Spurious variations are found when Reynolds SST is used as CRTM input.

Slide 34: The effect of reference SST on the M-O bias (OSTIA SST used as CRTM input)
Spurious time variability is reduced when OSTIA is used as CRTM input. The Std Devs are also dramatically reduced.

Slide 35: Conclusion to Monitoring Radiances

* SST is an unresolved combination of 2-3 sensor bands; monitoring individual bands is needed to unscramble SST anomalies.
* A web-based, near-real time Monitoring of IR Clear-sky Radiances over Oceans for SST (MICROS) has been established.
* Currently, three user groups actively use MICROS:
  - ACSPO SST developers
  - CRTM developers
  - Sensor calibration scientists

Future plans:
* Minimize M-O biases by adding aerosol to CRTM, improving AVHRR sensor characterization, and improving CRTM accuracy.

Slide 36: Conclusion: Community Consensus SSTs & Radiances

Slide 37: Conclusion

* Community consensus methodologies and tools for SST evaluation and validation are needed.
* Prototypes have been established at NESDIS:
  - SST Quality Monitor (SQUAM) for L2 & L4 products
  - In situ SST Quality Monitor (iQuam)
* Satellite radiances in individual sensor bands should be monitored:
  - Monitoring of IR Clear-sky Radiances over Oceans for SST (MICROS)
* We are open to any suggestions to make SQUAM, iQuam, and MICROS the "community tools".

Slide 38: Questions? Thank you!

Slide 39: Back-up Slides

Slide 40: Effect of QC on NCEP GTS SST (in situ minus daily Reynolds)
(Panels: Std Dev of "In Situ - Daily Reynolds SST", stratified by in situ data type: Ships, Drifters, Tropical Moorings, Coastal Moorings)
All SST types are strongly contaminated by outliers; drifters are affected most.

Slide 41: Percent of NCEP GTS data excluded by QC
(Panels stratified by in situ data type: Ships, Drifters, Tropical Moorings, Coastal Moorings)
The percentage of outliers is largest in drifter SSTs; there, QC is needed most.

Slide 42: Pathfinder v5.0 (Night) - Reynolds SST
Pathfinder v5 was recently added to SQUAM.

Slide 43: Pathfinder v5.0 (Day) - Reynolds SST
Pathfinder v5 was recently added to SQUAM.