S-NPP EDR Validated Maturity Readiness Review, 3-4 September 2014, College Park, MD
Validated Stage 3 SST Maturity Review
Alexander Ignatov and the STAR JPSS SST Team: Prasanjit Dash, Yury Kihai, John Stroup, Boris Petrenko, Xingming Liang, Irina Gladkova, Marouan Bouali, Karlis Mikelsons, John Sapper, Feng Xu, Xinjia Zhou (NOAA; CIRA; GST Inc; CUNY)
Bruce Brasnett (Canadian Met Centre)

Outline
- JPSS SST Team & Acknowledgements
- Product Requirements & VAL Maturity Stages
- Evaluate SST performance against spec requirements
  – VIIRS SST Products: IDPS, ACSPO, NAVO
  – JPSS SST – ACSPO; Processing Environment – NDE
  – VIIRS SST Performance
    1. One-day analysis
    2. Time series, Jan 2012 – present
    3. Annual statistics (1 Jun 2013 – 31 May 2014)
    4. Examples of VIIRS SST imagery
- Documentation
- Users & Feedback
- Conclusion & Path Forward

JPSS SST Team

Name | Affiliation | Funding | Tasks
- Ignatov | STAR | NOAA | Lead, JPSS Algorithm & Cal/Val
- Stroup, Kihai, Dash, Liang, Petrenko, Xu, Bouali, Zhou, Gladkova, Mikelsons | STAR/CIRA, STAR/STG, STAR/GST | JPO, NOAA ORS, GOES-R, NASA | Quality monitoring of VIIRS SSTs (SQUAM), radiances (MICROS), and in situ SSTs (iQuam); data support; IDPS SST code, match-ups, cloud mask, SST retrievals; destriping of L1b & SST
- May, Cayula, McKenzie, Willis | NAVO | Navy, NJO | NAVO SEATEMP SST & Cal/Val; VIIRS Cloud Mask evaluation
- Minnett, Kilpatrick | U. Miami | JPO, U. Miami | Uncertainty & instrument analyses; RTM; VAL vs. drifters & radiometers; skin-to-subskin conversion
- Arnone, Fargion | USM/NRL, UCSD | NJO, USM | SST algorithm analyses; SST improvements at slant view zenith angles/swath edge
- LeBorgne, Roquet | Meteo France | EUMETSAT | Processing VIIRS and Cal/Val using O&SI SAF heritage; comparisons with AVHRR/SEVIRI

Acknowledgements
- JPSS Program – Mitch Goldberg, Kathryn Schontz, Bill Sjoberg
- NASA SNPP Project Scientist – Jim Gleason
- NOAA NDE Team – Tom Schott, Dylan Powell, Bonnie Reed
- JPSS DPA – Eric Gottshall, Janna Feeley, Bruce Gunther
- VIIRS SDR & GSICS – Changyong Cao, Fuzhong Weng, Mark Liu, Frank DeLuccia, Jack Xiong
- NOAA STAR JPSS Team – Lihang Zhou, Paul DiGiacomo, Ivan Csiszar, many others
- NOAA CRTM Team – Yong Chen, Mark Liu, Yong Han
- U. Wisconsin – Liam Gumley, Steve Dutcher, Jim Davies

JPSS Skin SST Requirements

EDR Attribute | Threshold | Objective
- a. Horizontal Cell Size (Res) | 1.6 km | … km
- b. Mapping Uncertainty, 3σ | 2 km (1) | 0.1 km
- c. Measurement Range | 271 K to 313 K | 271 K to 318 K
- d. Measurement Accuracy (2) | 0.2 K | 0.05 K
- e. Measurement Precision (2) | 0.6 K | 0.2 K (<55° VZA)
- f. Refresh Rate | 12 hrs | 3 hrs
- g. Latency | 90 min | 15 min
- h. Geographic Coverage | Global cloud- and ice-free ocean, excluding lakes and rivers | Global cloud- and ice-free ocean, plus large lakes and wide rivers

(1) Worst-case scenarios corresponding to swath edge; both numbers are ~1 km at nadir.
(2) Represent global mean BIAS and STDV validation statistics against QCed drifting buoys (for day & night, in the full VIIRS swath & range of atmospheric conditions). Uncertainty = square root of accuracy squared plus precision squared. Better performance is expected against ship radiometers.
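Footnote (2) defines the total uncertainty as the root-sum-square of accuracy and precision; at the threshold values this works out to:

    Uncertainty = sqrt(Accuracy^2 + Precision^2) = sqrt(0.2^2 + 0.6^2) K ≈ 0.63 K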

SNPP Validation Maturity Stages & SST Schedule
- Validated Stage 1 (Jul 2014): Using a limited set of samples, the algorithm output is shown to meet the threshold performance attributes identified in the JPSS Level 1 Requirements Supplement, with the exception of the S-NPP Performance Exclusions.
- Validated Stage 2 (Oct 2014): Using a moderate set of samples, the algorithm output is shown to meet the threshold performance attributes identified in the JPSS Level 1 Requirements Supplement, with the exception of the S-NPP Performance Exclusions.
- Validated Stage 3 (Apr 2015): Using a large set of samples representing global conditions over four seasons, the algorithm output is shown to meet the threshold performance attributes identified in the JPSS Level 1 Requirements Supplement, with the exception of the S-NPP Performance Exclusions.

The SST Team recommends declaring JPSS SST "Validated Stage 3". We are ahead of schedule due to the use of the mature ACSPO product.

JPSS-1 Product Maturity Definition
JPSS/GOES-R Data Product Validation Maturity Stages – COMMON DEFINITIONS (Nominal Mission)

1. Beta
  o Product is minimally validated, and may still contain significant identified and unidentified errors.
  o Information/data from validation efforts can be used to make initial qualitative or very limited quantitative assessments regarding product fitness-for-purpose.
  o Documentation of product performance and identified product performance anomalies, including recommended remediation strategies, exists.
2. Provisional
  o Product performance has been demonstrated through analysis of a large, but still limited (i.e., not necessarily globally or seasonally representative) number of independent measurements obtained from selected locations, time periods, or field campaign efforts.
  o Product analyses are sufficient for qualitative, and limited quantitative, determination of product fitness-for-purpose.
  o Documentation of product performance, testing involving product fixes, and identified product performance anomalies, including recommended remediation strategies, exists.
  o Product is recommended for operational use (user decision) and in scientific publications.
3. Validated
  o Product performance has been demonstrated over a large and wide range of representative conditions (i.e., global, seasonal).
  o Comprehensive documentation of product performance exists that includes all known product anomalies and their recommended remediation strategies for a full range of retrieval conditions and severity levels.
  o Product analyses are sufficient for full qualitative and quantitative determination of product fitness-for-purpose.
  o Product is ready for operational use based on documented validation findings and user feedback.
  o Product validation, quality assurance, and algorithm stewardship continue through the lifetime of the instrument.

VIIRS SST Products

IDPS – NOAA Interface Data Processing Segment (IDPS)
- Official NPOESS SST EDR; ownership transferred to the NOAA JPSS PO
- Developed by NGAS; operational at Raytheon; archived at NOAA CLASS
- As of this review, meets specs at night, but not during the daytime
- Jan 2014: JPO recommends "discontinue IDPS EDR support, concentrate on ACSPO"
- IDPS SST EDR to phase out as soon as ACSPO SST is archived at JPL/NODC

ACSPO – NOAA Advanced Clear-Sky Processor for Ocean (ACSPO)
- Jul 2014: JPO approves reallocation of SST requirements to NDE ACSPO
- NOAA heritage SST system (building on AVHRR GAC and FRAC)
- Terra/Aqua MODIS experimentally processed at STAR since Jan 2012
- NDE ACSPO: operational (Mar 2014); GDS2 archived at JPL/NODC (May 2014): ftp://podaac-ftp.jpl.nasa.gov/allData/ghrsst/data/GDS2/L2P/VIIRS_NPP/OSPO/
- Meets/exceeds APU specs; provides complete global coverage

NAVO – SEATEMP
- Builds on NOAA (pre-ACSPO) / NAVO AVHRR heritage
- VIIRS operational (Mar 2013); GDS2 archived at JPL/NODC (May 2013)
- Meets/exceeds APU specs; coverage is ~3× smaller compared to ACSPO
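Since NDE ACSPO is distributed in the GHRSST GDS2 L2P format at the JPL PO.DAAC archive (URL above), a minimal reading sketch is shown below. It assumes the standard GDS2 variable names (sea_surface_temperature in Kelvin, quality_level with 5 = best) and a hypothetical local file name; it is illustrative only, not the operational reader.

```python
# Minimal sketch: read one ACSPO VIIRS L2P granule and keep best-quality pixels.
# Variable names follow the GHRSST GDS2 L2P convention; the file name is hypothetical.
import numpy as np
from netCDF4 import Dataset

path = "ACSPO_VIIRS_NPP_L2P_granule.nc"  # hypothetical local copy of a PO.DAAC granule

with Dataset(path) as nc:
    sst = np.ma.filled(nc["sea_surface_temperature"][0], np.nan)  # sub-skin SST, Kelvin
    ql = np.ma.filled(nc["quality_level"][0], 0)                  # GDS2 convention: 0..5, 5 = best

clear = ql == 5                                                   # best-quality (clear-sky) pixels
print("best-quality pixels:", int(clear.sum()))
print("mean best-quality SST [K]:", float(np.nanmean(np.where(clear, sst, np.nan))))
```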

Evaluation of VIIRS SST Products
1. One-day global analysis – 23 April 2014

We first compare the SST domain and performance of IDPS, ACSPO, and NAVO against two global reference SSTs:
- L4 SST – Canadian Met Centre CMC 0.2° analysis
- In situ SST – QCed drifting buoys in iQuam
using one day of globally representative data – 23 April 2014 – in the SST Quality Monitor (SQUAM), and then discuss the corresponding time series.

All products are continuously monitored online.
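The slides below summarize these "satellite minus reference" comparisons as NOBS, Min/Max, Mean/STD, and Median/robust STD. A minimal sketch of such matchup statistics is given below; the robust spread here is a scaled median absolute deviation, which may differ from SQUAM's exact estimator, and the inputs are assumed to be already-matched satellite and reference SSTs.

```python
# Minimal sketch of "satellite minus reference" (L4 or in situ) matchup statistics.
# RSD is approximated by a scaled median absolute deviation; SQUAM's exact robust
# estimator may differ, so treat the numbers as illustrative only.
import numpy as np

def delta_sst_stats(sat_sst_k, ref_sst_k):
    """Summarize dSST = satellite minus reference SST (Kelvin) over matched pairs."""
    d = np.asarray(sat_sst_k, dtype=float) - np.asarray(ref_sst_k, dtype=float)
    d = d[np.isfinite(d)]
    med = np.median(d)
    rsd = 1.4826 * np.median(np.abs(d - med))   # equals STD for Gaussian errors
    return {"nobs": d.size, "min": d.min(), "max": d.max(),
            "mean": d.mean(), "std": d.std(ddof=1),
            "median": med, "rsd": rsd}

# Synthetic example: ~2,000 night matchups with a +0.05 K bias and 0.3 K spread
rng = np.random.default_rng(1)
ref = 290.0 + 5.0 * rng.standard_normal(2000)
print(delta_sst_stats(ref + 0.05 + 0.3 * rng.standard_normal(2000), ref))
```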

NIGHT: IDPS L2 minus CMC L4, 23 April 2014
- Impressive global coverage
- Delta close to zero, as expected
- Cold spots – residual cloud/aerosol leakages

NIGHT: ACSPO L2 minus CMC L4, 23 April 2014
- ACSPO coverage comparable with IDPS
- Delta on average closer to zero
- Fewer cold spots (cloud/aerosol leakages)

NIGHT: NAVO L2 minus OSTIA L4, 23 April 2014
- Retrievals limited to VZA < 54°
- Fewer data in the Tropics and at high latitudes

NIGHT: IDPS L2 minus CMC L4, 23 April 2014 (histogram)
- Shape close to Gaussian
- Long tail on the left: residual cloud/aerosol

NIGHT: ACSPO L2 minus CMC L4, 23 April 2014 (histogram)
- ACSPO: shape closer to Gaussian than IDPS
- Smaller tail on the left: less cloud/aerosol

NIGHT: NAVO L2 minus CMC L4, 23 April 2014 (histogram)
- Shape close to Gaussian
- Domain smaller, but STD slightly better

QCed in situ data come from the iQuam system
- The current validation standard is drifting buoys
- Argo floats and tropical moorings are being evaluated

[Figure: number of unique in situ platform IDs in iQuam]
- The current JPSS in situ validation standard is drifting buoys; Argo floats and tropical moorings are being evaluated
- Argo: rapid deployment after …

[Figure: number of in situ observations in iQuam]
- Argo floats take relatively infrequent measurements (~3 per month per float)
- However, they are high quality and provide the most uniform global coverage
- Overall, they complement data from drifters very well

NIGHT: IDPS L2 minus in situ drifter SST, 23 April 2014
- Much sparser coverage than against L4
- Not fully representative of the global ocean

NIGHT: ACSPO L2 minus in situ drifter SST, 23 April 2014
- ACSPO match-ups: comparable with IDPS
- Fewer cold ΔSSTs than in IDPS

NIGHT: NAVO L2 minus in situ drifter SST, 23 April 2014
- NAVO match-ups much sparser than IDPS/ACSPO
- Similarly to ACSPO, very few cold outliers

NIGHT: IDPS L2 minus in situ SST, 23 April 2014 (histogram)
- Shape close to Gaussian, except cold tail and positive outliers
- Performance stats within specs (bias < 0.2 K, STD < 0.6 K)

NIGHT: ACSPO L2 minus in situ SST, 23 April 2014 (histogram)
- ACSPO: shape closer to Gaussian than IDPS
- Performance stats well within specs (bias < 0.2 K, STD < 0.6 K)

NIGHT: NAVO L2 minus in situ SST, 23 April 2014 (histogram)
- NAVO: shape close to Gaussian; performance stats within specs
- ~2.5× fewer matchups compared to ACSPO/IDPS

NIGHT, 23 Apr 2014 – Summary

ΔT = "VIIRS minus in situ" SST (expected ~0):
  Processor | NOBS (%ACSPO) | Min/Max | Mean/STD | Med/RSD
  IDPS  | 2,082 (113%) | -2.9/… | …/… | …/0.26
  ACSPO | 1,846 (100%) | -1.7/… | …/… | …/0.24
  NAVO  |   678 ( 37%) | -2.3/… | …/… | …/0.24

ΔT = "VIIRS minus CMC" SST (expected ~0):
  Processor | NOBS (%ACSPO) | Min/Max | Mean/STD | Med/RSD
  IDPS  | 116.8M (101%) | -13.1/… | …/… | …/0.31
  ACSPO | 115.9M (100%) |  -4.6/… | …/… | …/0.30
  NAVO  |  39.5M ( 34%) |  -8.9/… | …/… | …/0.28

Vs. in situ: IDPS SST domain is +13% larger than ACSPO; all stats degraded. NAVO SST domain is a factor of ~3 smaller than ACSPO; stats comparable.
Vs. L4: IDPS SST domain is +1% larger than ACSPO; all stats degraded. NAVO SST domain is a factor of ~3 smaller than ACSPO; stats improved.

DAY, 23 Apr 2014 – Summary

ΔT = "VIIRS minus in situ" SST (expected ~>0):
  Processor | NOBS (%ACSPO) | Min/Max | Mean/STD | Med/RSD
  IDPS  | 1,758 (105%) | -5.3/… | …/… | …/0.48
  ACSPO | 1,680 (100%) | -1.4/… | …/… | …/0.37
  NAVO  |   510 ( 30%) | -1.2/… | …/… | …/0.35

ΔT = "VIIRS minus CMC" SST (expected ~>0):
  Processor | NOBS (%ACSPO) | Min/Max | Mean/STD | Med/RSD
  IDPS  | 120.4M (100%) | -28.7/… | …/… | …/0.45
  ACSPO | 121.0M (100%) |  -5.4/… | …/… | …/0.41
  NAVO  |  41.3M ( 34%) |  -8.2/… | …/… | …/0.40

Vs. in situ: IDPS SST domain is +5% larger than ACSPO; all stats degraded. NAVO SST domain is a factor of ~3 smaller than ACSPO; stats improved.
Vs. L4: IDPS SST domain is comparable with ACSPO; all stats degraded. NAVO SST domain is a factor of ~3 smaller than ACSPO; stats comparable.

Evaluation of VIIRS SST Products
2. Time series of daily statistics, Jan 2012 – present

NPP SST Daily Validation: Mean "VIIRS minus Drifters" SST
[SQUAM time series, night and day panels]
- ACSPO SST (adjusted by +0.14 K for the skin-bulk difference) meets the accuracy spec of ±0.2 K
- Bias is closer to 0 K at night and slightly positive during daytime
- This is expected (satellite skin SST is more subject to diurnal warming than in situ bulk SST)
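A minimal sketch of this daily spec check is given below, assuming the +0.14 K skin-to-bulk adjustment is simply added to the satellite skin SST before differencing against drifters; the helper function and its inputs are illustrative, not the SQUAM implementation.

```python
# Minimal sketch of the daily accuracy/precision check described above.
# The +0.14 K skin-to-bulk adjustment and the 0.2 K / 0.6 K limits are taken from
# the slides; the helper itself is illustrative, not the SQUAM implementation.
import numpy as np

SKIN_TO_BULK_K = 0.14    # adjustment applied to satellite skin SST
ACCURACY_SPEC_K = 0.2    # |global mean bias| threshold
PRECISION_SPEC_K = 0.6   # standard deviation threshold

def daily_spec_check(skin_sst_k, drifter_sst_k):
    """Return (bias, std, within_spec) for one day of VIIRS-minus-drifter matchups."""
    d = (np.asarray(skin_sst_k, dtype=float) + SKIN_TO_BULK_K) - np.asarray(drifter_sst_k, dtype=float)
    bias, std = d.mean(), d.std(ddof=1)
    return bias, std, (abs(bias) <= ACCURACY_SPEC_K) and (std <= PRECISION_SPEC_K)
```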

NPP SST Daily Validation: Std Dev "VIIRS minus Drifters" SST
[SQUAM time series, night and day panels]
- ACSPO SST meets the precision spec of 0.6 K
- STD is smaller at night and larger during daytime; this is expected because in situ SST is bulk and retrieved SST is skin
- In fact, the nighttime STD is deemed to better represent the performance of daytime SST

NPP SST Daily Validation: Number of matchups with Drifters
[SQUAM time series, night and day panels]
- The number of matchups with in situ SST is ~2,000/day and ~2,000/night for ACSPO & IDPS
- For NAVO, the number of matchups is ~3× smaller
- The daily number of matchups is sufficient for daily VAL, but residual day-to-day noise remains

NPP SST Daily Validation: Number of matchups with OSTIA
[SQUAM time series, night and day panels]
- The number of VIIRS SST retrievals is >100M/day and >100M/night for ACSPO & IDPS
- For NAVO, the number of SST retrievals is ~3× smaller
- Drop-outs in the number of observations are due to VIIRS outages & processing at STAR (improved in NDE)

Evaluation of VIIRS SST Products
3. Yearly statistics, 1 June 2013 – 31 May 2014

NPP SST Annual Validation: Night IDPS
IDPS VIIRS minus Drifters, 1-Jun-2013 to 31-May-2014 [matchup map]
- IDPS match-ups cover most of the global ocean, except a few "white spots"
- Cold speckles and suppression off Africa: due to residual cloud/aerosols

NPP SST Annual Validation: Night ACSPO
ACSPO VIIRS minus Drifters, 1-Jun-2013 to 31-May-2014 [matchup map]
- ACSPO: similar coverage to IDPS
- Fewer speckles, less suppression: less residual cloud/aerosols

NPP SST Annual Validation: Night NAVO
NAVO VIIRS minus Drifters, 1-Jun-2013 to 31-May-2014 [matchup map]
- NAVO coverage sparser than IDPS/ACSPO
- Fewer speckles, less suppression: less residual cloud/aerosols

NPP SST Annual Validation: Night IDPS (histogram)
- IDPS histogram: biased and skewed negatively
- IDPS product meets specs at night

NPP SST Annual Validation: Night ACSPO (histogram)
- ACSPO: fewer matchups; reduced bias/skew/STD
- Meets specs by a wider margin than IDPS

NPP SST Annual Validation: Night NAVO (histogram)
- NAVO: fewer matchups; reduced bias/skew/STD
- Meets specs by an even wider margin, but in a ~2.5× smaller sample

NPP SST Validation: Day IDPS
IDPS VIIRS minus Drifters, 1-Jun-2013 to 31-May-2014 [matchup map]
- IDPS match-ups cover most of the global ocean, except a few "white spots"
- Cold speckles and suppression off Africa: due to residual cloud/aerosols

NPP SST Validation: Day ACSPO
ACSPO VIIRS minus Drifters, 1-Jun-2013 to 31-May-2014 [matchup map]
- ACSPO coverage: similar to IDPS
- Fewer speckles, less suppression: less residual cloud/aerosols

NPP SST Validation: Day NAVO
NAVO VIIRS minus Drifters, 1-Jun-2013 to 31-May-2014 [matchup map]
- NAVO coverage sparser than IDPS/ACSPO; warm bias at high latitudes
- Fewer speckles, less suppression: less residual cloud/aerosols

NPP SST Validation: Day IDPS (histogram)
- IDPS histogram: biased and skewed negatively
- STD ~0.78 K: IDPS product out of spec

NPP SST Validation: Day ACSPO (histogram)
- ACSPO: fewer matchups; reduced bias/skew/STD
- Meets JPSS specs by a wide margin

NPP SST Validation: Day NAVO (histogram)
- NAVO: ~2.5× fewer matchups; reduced bias/skew/STD
- Meets specs by an even wider margin, but in a smaller domain

Annual validation statistics against drifters: summary for 01-JUN-2013 to 31-MAY-2014

Night:
  Processor | NOBS (%ACSPO) | Min/Max | Mean/STD | Med/RSD
  IDPS  | 685,614 (118%) | …/… | …/… | …/0.27
  ACSPO | 578,851 (100%) | …/… | …/… | …/0.25
  NAVO  | 254,045 ( 44%) | …/… | …/… | …/0.22

Day:
  Processor | NOBS (%ACSPO) | Min/Max | Mean/STD | Med/RSD
  IDPS  | 655,336 (115%) | …/… | …/… | …/0.46
  ACSPO | 568,610 (100%) | …/… | …/… | …/0.37
  NAVO  | 219,725 ( 39%) | …/… | …/… | …/0.34

Evaluation of VIIRS SST Products
4. Examples of SST imagery
Note: the NAVO images shown below are from the then-operational product. Following STAR feedback, NAVO has fixed the imagery artifacts.

[NAVO SST imagery, Africa – rectangular shapes? missed lines? File: ACSPO_V2.30b01_NPP_VIIRS_ _ _ _NAVO]

[ACSPO SST imagery, Africa]

[NAVO SST imagery, Florida – triangular shape? File: ACSPO_V2.30b01_NPP_VIIRS_ _ _ _NAVO]

[ACSPO SST imagery, Florida]

[NAVO SST imagery, India – too-regular shapes? File: ACSPO_V2.30b01_NPP_VIIRS_ _ _ _NAVO]

[ACSPO SST imagery, India]

[NAVO SEATEMP SST imagery, China/Korea – triangular shape? missed lines? File: ACSPO_V2.30b01_NPP_VIIRS_ _ _ _NAVO]

[ACSPO SST imagery, Korea/China]

Documentation

JPSS SST Documentation
- Liang, X., and A. Ignatov, 2013: AVHRR, MODIS and VIIRS Radiometric Stability and Consistency in SST Bands. JGR, 118(6), doi: /jgrc.
- Bouali, M., and A. Ignatov, 2014: Adaptive Reduction of Striping for Improved SST Imagery from S-NPP VIIRS. JTech, 31, doi: /JTECH-D.
- Xu, F., and A. Ignatov, 2014: In situ SST Quality Monitor (iQuam). JTech, 31, doi: /JTECH-D.
- Petrenko, B., A. Ignatov, Y. Kihai, J. Stroup, and P. Dash, 2014: Evaluation and Selection of SST Regression Algorithms for JPSS VIIRS. JGR, 119(8), doi: /2013JD.
- Liu, Q., A. Ignatov, F. Weng, and X. Liang, 2014: Removing Solar Radiative Effect from the VIIRS M12 Band at 3.7 µm for Daytime SST Retrievals. JTech, in press, doi: /JTECH-D.
- Gladkova, I., Y. Kihai, A. Ignatov, F. Shahriar, and B. Petrenko, 2014: SST Pattern Test in ACSPO Clear-Sky Mask for VIIRS. RSE, in review.
- ACSPO SST ATBD; ACSPO SST External Users Manual (NDE)

ACSPO VIIRS SST Users

NASA and NOAA JPSS SST Users
Current and/or interested users:
- NOAA STAR (GEO/POLAR Blended L4) – Eileen Maturi
- NOAA STAR (Coral Reef Watch) – Mark Eakin
- NOS (Chesapeake Bay ecosystem analysis) – Chris Brown
- NCDC (Reynolds SST L4) – Viva Banzon
- NASA JPL (JPL MUR L4) – Mike Chin
- Work also underway with NCEP/CPC/OPC – Bob Grumbine, Avichal Mehra, Joe Sienkiewicz
- NASA GMAO (MERRA) – Ricardo Todling
- Coast Watch – John Sapper
- NMFS – Cara Wilson / John Sapper
- URI – Peter Cornillon

International JPSS SST Users
Current and/or interested users:
- Canadian Met Centre (CMC L4) – Bruce Brasnett
- Australian Bureau of Meteorology (GAMSSA L4) – Helen Beggs
- UK Met Office (OSTIA L4) – Emma Fiedler
- Japanese Met Agency (MGD L4) – Shiro Ishizaki
- DMI, Denmark (DMI L4) – Jacob L. Høyer
- EUMETSAT (EUMETCAST) – Simon Elliott
- JPL PO.DAAC (Archive) – Ed Armstrong
- IFREMER, France (Odyssea L4) – Jean-François Piolle, Emmanuelle Autret
To be polled: other users, at the GHRSST Meeting, June 2014, Cape Town

Users’ Feedback

Harris – NOAA Geo-Polar Blended L4 SST
- VIIRS successfully incorporated into the Geo-Polar Blended 5-km global SST analysis
[Maps: super-ob'd VIIRS SST data; final SST analysis]
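The maps on this slide show "super-ob'd" VIIRS SSTs. The sketch below is only a generic illustration of super-obbing (averaging many L2 pixels into coarser analysis cells before assimilation); the actual Geo-Polar Blended procedure is not described in these slides, and the cell size and function name are assumptions.

```python
# Generic super-obbing illustration (NOT the actual Geo-Polar Blended procedure):
# average clear-sky L2 SST pixels into coarse lat/lon cells before assimilation.
# The 0.05-degree cell size (~5 km) and the function name are assumptions.
import numpy as np

def super_ob(lat_deg, lon_deg, sst_k, cell_deg=0.05):
    """Average SST pixels falling into each cell_deg x cell_deg bin."""
    i = np.floor((np.asarray(lat_deg) + 90.0) / cell_deg).astype(int)
    j = np.floor((np.asarray(lon_deg) + 180.0) / cell_deg).astype(int)
    sums, counts = {}, {}
    for key, t in zip(zip(i.tolist(), j.tolist()), np.asarray(sst_k, dtype=float)):
        if np.isfinite(t):
            sums[key] = sums.get(key, 0.0) + t
            counts[key] = counts.get(key, 0) + 1
    return {key: sums[key] / counts[key] for key in sums}  # cell index -> mean SST [K]

# Example: three pixels, the first two sharing one 0.05-degree cell
print(super_ob([10.01, 10.02, 10.10], [20.01, 20.02, 20.10], [300.1, 300.3, 301.0]))
```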

Results of Assimilating ACSPO VIIRS L2P Datasets
Bruce Brasnett, Canadian Meteorological Centre, May 2014

ACSPO VIIRS L2P Datasets
- Received courtesy of colleagues at STAR
- Two periods: 1 Jan – 31 Mar 2014 & 15 Aug – 9 Sep 2013
- Daily coverage is excellent with this product
- Experiments carried out assimilating VIIRS data only, and VIIRS data in combination with other satellite products
- Rely on independent data from Argo floats to verify results
- Argo floats do not sample coastal regions or marginal seas

Assessing the relative value of two VIIRS datasets: NAVO vs. ACSPO
- Using ACSPO instead of NAVO improves the CMC assimilation

CMC Summary
- ACSPO VIIRS L2P is an excellent product
- Based on the Jan – Mar 2014 sample, VIIRS contains more information than either NAVO VIIRS, OSI-SAF Metop-A, or the RSS AMSR2 datasets
- L2P ancillary information (quality level flags and wind speeds) is useful
- CMC started assimilating the ACSPO VIIRS L2P dataset on 29 May 2014

CMC Feedback – Bruce Brasnett
- 7 July 2014: "We are now running with ACSPO VIIRS and are very happy with the results. You and your team have done an outstanding job with the VIIRS product. Any idea when the backfilled data to Jan will be available?"
- 28 July 2014: "For years, we at CMC have wanted more retrievals of lake temperatures. Unfortunately, all we were getting from NAVO was retrievals for the largest lakes... nothing for medium and small lakes. This is one reason why we are excited about the ACSPO VIIRS product (many other reasons to be excited about the ACSPO product, by the way)."
- 19 August 2014: "We have been assimilating ACSPO VIIRS data (downloaded from JPL) for a couple of months now and I am very happy with the results. Excellent global coverage and good performance at high latitudes."

Summary
- The JPSS ACSPO SST product has been validated online in NRT, globally and continuously, since the opening of the VIIRS cryoradiator doors on 18 Jan 2012. It meets the JPSS L1RD specs and the Validated Stage 3 requirements.
- The IDPS SST domain is comparable with ACSPO. At night, IDPS VAL stats are degraded compared to ACSPO but meet specs. During the daytime, IDPS SST does not meet specs.
- NAVO SST meets the L1RD specs, by a slightly wider margin than ACSPO. However, its retrieval domain is ~2.5× smaller.
- ACSPO SST is assimilated into two global L4 analyses: the NOAA Geo-Polar Blended and the Canadian Met Centre CMC 0.2° analyses. Positive feedback has been received on the excellent global coverage, in particular at high latitudes and over internal waters.
- The SST Team recommends declaring S-NPP SST Validated Stage 3.

Near-Term ACSPO Priorities & Plans
- The priority of the SST Team is to focus on ACSPO users: work individually, customize the product, and ensure it meets their needs
- Continue monitoring ACSPO SST in SQUAM, cross-evaluate against IDPS and NAVO, and validate all against iQuam
- Generate a Level 3 SST product (multiple user requests)
- Establish VIIRS SST reprocessing capability (in conjunction with UW); back-fill ACSPO L2/L3 SSTs to Jan 2012
- Archive ACSPO L2/L3 at JPL/NODC; discontinue IDPS
- Focus on dynamic, coastal, and high-latitude areas
- Enhance ACSPO clear-sky and ice masks, using a pattern-recognition approach and the day-night band
- Improve Single Scanner Error Statistics (SSES; part of GDS2)
- Implement destriping operationally
- Explore optional VIIRS bands for aerosol correction/flagging