© Crown copyright Met Office IQuOD Workshop 2, 5th June 2014. Chris Atkinson, Simon Good and Nick Rayner. Assigning Uncertainties to Individual Observations: Experience from Building HadIOD


© Crown copyright Met Office IQuOD Workshop 2, 5 th June 2014 Chris Atkinson, Simon Good and Nick Rayner Assigning Uncertainties to Individual Observations: Experience from Building HadIOD

© Crown copyright Met Office Presentation Overview HadIOD – the Hadley Centre Integrated Ocean Database:
- What is HadIOD?
- The HadIOD observation error model
- Allocating uncertainties to sub-surface profile observations
- How well do we know sub-surface observation uncertainties?
- What next?
- Conclusions

© Crown copyright Met Office What is HadIOD? HadIOD is the Met Office Hadley Centre Integrated Ocean Database. It was created as part of the EU-FP7 project ERA-CLIM (European Reanalysis of Global Climate Observations), an aim of which was to prepare observations for future fully-coupled ocean-atmosphere reanalyses. It is a relational database of historical ocean temperature and salinity observations. The database is ‘integrated’ in that it brings together surface and sub-surface components of the ocean observing system, which have traditionally been treated separately for climate monitoring purposes but are both needed for coupled reanalysis. ERA-CLIM ended in 2013, but development of HadIOD will continue as part of the follow-on project ERA-CLIM2.

© Crown copyright Met Office What is HadIOD? At present, HadIOD brings together surface temperature observations from ICOADS release 2.5 and sub-surface temperature and salinity profile observations from EN4. An observation is a measurement of a single geophysical variable at some time, position and depth. Currently there are ~1.2 billion observations in HadIOD.
[Figure: from Atkinson et al., submitted to J. Geophys. Res.]

© Crown copyright Met Office What is HadIOD? HadIOD comprises basic observation information such as:
- platform ID, position, time, depth, platform & instrument type, observed temperature & salinity, provenance information, unique ID
Each observation also receives quality flags:
- ICOADS surface obs receive Met Office Hadley Centre basic QC flags
- EN4 sub-surface obs preserve EN4 QC flags
- a duplicate flag for duplicates introduced by merging ICOADS and EN4
For assimilation, reanalyses want unbiased observations with an estimate of random measurement uncertainty, so observations in HadIOD are also allocated bias corrections (where available) and uncertainties.

© Crown copyright Met Office The HadIOD observation error model The error model used for assigning bias corrections and uncertainties to individual observations is as follows:
observation value = true value + platform-type bias + platform-specific bias + random measurement error
This is the error model used by HadSST3, the latest Met Office Hadley Centre dataset of bias-corrected, gridded in situ SST anomalies. HadSST3 has a sophisticated treatment of observation error.
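The additive model above can be sketched in code. The following is an illustrative simulation only, not HadIOD code: the function name and every bias and uncertainty value are invented for the example.

```python
import random

# Illustrative sketch of the additive error model described above:
#   observation = truth + platform-type bias + platform-specific bias + random error
# All numeric values here are invented placeholders, not HadIOD's.

def simulate_observation(truth, type_bias, platform_bias, random_sd, rng):
    """Generate one observed value under the additive error model."""
    return truth + type_bias + platform_bias + rng.gauss(0.0, random_sd)

rng = random.Random(42)
truth = 15.0                                   # hypothetical "true" SST (deg C)
obs = simulate_observation(truth, type_bias=0.2, platform_bias=-0.05,
                           random_sd=0.15, rng=rng)

# A bias correction aims to remove the two systematic terms; what remains
# after correcting is (one realisation of) the random measurement error.
corrected = obs - (0.2 + -0.05)
residual = corrected - truth
```

Repeating such a simulation many times and examining the spread of `residual` recovers the random measurement uncertainty assumed above.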

© Crown copyright Met Office The HadIOD observation error model The platform-type bias applies at the level of platform type, e.g. for measurements made by XBTs or ship buckets. The platform-specific bias applies at the individual platform level, e.g. a bias in the measurements from a particular ship or float. Bias corrections are (where possible) assigned to observations and can be added to the observed value to correct for observational biases.
[Figure: an example of drifter SST discrepancies (pink) relative to the satellite-based analysis OSTIA (blue) and ATSR (green). Atkinson et al., J. Geophys. Res., 2013.]
[Figure: global average temperature anomaly for m. Each time series has had a different set of XBT bias adjustments applied, except the black line. The grey area is an estimate of the 95% confidence range of the unadjusted data due to uncertainties caused by the limited sampling. All time series are relative to a climatology based on the data with the Levitus et al. (2009) bias adjustments applied. Courtesy S. Good.]

© Crown copyright Met Office The HadIOD observation error model The error model used for assigning bias corrections and uncertainties to individual observations is as follows:
observation value = true value + platform-type bias + platform-specific bias + random measurement error
Bias corrections are (where possible) assigned an associated bias correction uncertainty, which represents the uncertainty in our knowledge of the true correction value. Observations are also assigned a random measurement uncertainty, which is the uncertainty due to an unknowable random error. Reanalyses do not presently make use of correlated uncertainties, and so a combined (correction & random) uncertainty is provided for these users.
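Assuming the correction error and the random measurement error are independent, a combined uncertainty of the kind described above is naturally formed in quadrature. A minimal sketch with invented values:

```python
import math

# Combine bias-correction uncertainty and random measurement uncertainty
# in quadrature, assuming the two error sources are independent.
# The numeric inputs below are illustrative, not HadIOD values.
def combined_uncertainty(correction_unc, random_unc):
    return math.sqrt(correction_unc ** 2 + random_unc ** 2)

u = combined_uncertainty(0.1, 0.15)  # deg C
```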

© Crown copyright Met Office The HadIOD observation error model The error model used for assigning bias corrections and uncertainties to individual observations is as follows:
observation value = true value + platform-type bias + platform-specific bias + random measurement error
A further source of uncertainty is the ‘structural uncertainty’, which arises from the choices made during dataset creation. An example would be the different XBT bias correction schemes that are available: for each XBT observation, multiple realisations of the platform-type bias correction could be assigned (not yet done in HadIOD). These could be applied in turn to explore the structural uncertainty when bias correcting XBT observations.
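One way to realise this is to carry every available correction scheme alongside each observation and apply them in turn; the spread of the resulting values indicates the structural uncertainty. Scheme names and correction values below are invented placeholders:

```python
# Sketch: explore structural uncertainty by applying each available XBT
# bias-correction scheme in turn. Names and offsets are illustrative only.
xbt_obs = 14.8  # an uncorrected XBT temperature (deg C)
correction_schemes = {
    "scheme_A": -0.10,   # hypothetical correction values (deg C)
    "scheme_B": -0.05,
    "scheme_C": -0.15,
}
realisations = {name: xbt_obs + corr for name, corr in correction_schemes.items()}
spread = max(realisations.values()) - min(realisations.values())
```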

© Crown copyright Met Office Allocating uncertainties to sub-surface profile observations For sub-surface profile observations in HadIOD, the uncertainties assigned to observations are largely based on descriptions of measurement accuracy found in WOD 2013 documentation and the recent review by Abraham et al. [2014] of global ocean temperature observations. These sources generally provide manufacturer product specification accuracies sorted by WOD probe type (e.g. CTD, MBT, profiling float) and sensor type (e.g. YSI thermistor, Seabird-41 CTD). EN4 (the source of sub-surface observations in HadIOD) combines WOD, GTSPP, Argo GDAC and ASBO observations. To assign uncertainties in HadIOD, non-WOD ‘probe-type’ metadata are first coerced to a WOD probe type, and literature-derived uncertainties are then applied at (mostly) probe-type level. Literature accuracies are treated as random measurement uncertainties. One set of XBT/MBT bias corrections is also applied (an updated set of the Gouretski and Reseghetti [2010] corrections).
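The assignment step described above amounts to a two-stage lookup: coerce source metadata to a WOD probe type, then map probe type to a literature-derived uncertainty. A minimal sketch, in which all mappings and values are illustrative rather than the actual HadIOD/WOD tables:

```python
# Two-stage lookup sketch. The metadata strings, the probe-type mapping and
# the uncertainty values are all invented placeholders for illustration.
TO_WOD_PROBE_TYPE = {
    "ARGO_FLOAT": "profiling float",   # hypothetical source metadata values
    "SHIP_CTD": "CTD",
}
PROBE_TYPE_UNCERTAINTY_C = {           # illustrative 1-sigma values, deg C
    "profiling float": 0.002,
    "CTD": 0.002,
    "XBT": 0.15,
    "MBT": 0.2,
}

def uncertainty_for(source_probe_type):
    """Coerce to a WOD probe type, then look up a probe-type uncertainty."""
    wod_type = TO_WOD_PROBE_TYPE.get(source_probe_type, source_probe_type)
    return PROBE_TYPE_UNCERTAINTY_C.get(wod_type)

u = uncertainty_for("ARGO_FLOAT")
```

A real implementation would need to handle unrecognised probe types explicitly; here the function simply returns None for them.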

© Crown copyright Met Office Allocating uncertainties to sub-surface profile observations
[Figures: surface and sub-surface observation uncertainties; from Atkinson et al., submitted to J. Geophys. Res.]

© Crown copyright Met Office How well do we know sub-surface observation uncertainties? What do literature-sourced accuracies refer to? Is this a random measurement error uncertainty, or an uncertainty due to systematic measurement errors which is correlated somehow amongst observations (i.e. a bias), or both? For example, XBTs:
- Typical literature-sourced XBT accuracy is ~0.15°C; currently HadIOD treats this as an estimate of random measurement uncertainty
- However, this seems, at least in part, to refer to uncertainties that are correlated amongst particular types of XBT (e.g. T-4, T-6, etc.)
- This uncertainty of 0.15°C needs decomposing into its constituents (due to random and systematic effects)
- When an XBT bias correction is provided, which is intended to remove systematic error, an associated correction uncertainty should be given
- For XBTs, the uncertainty in any depth correction also needs quantifying
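If the quoted ~0.15°C were interpreted as a total (quadrature) uncertainty, the decomposition called for above is a subtraction in quadrature; the 0.10°C systematic component below is purely a placeholder to illustrate the arithmetic:

```python
import math

# Worked example of the decomposition: if the quoted XBT "accuracy" is a
# total uncertainty and part of it is systematic (correlated within an
# XBT type), the random residual follows by subtracting in quadrature.
# The systematic figure is an invented placeholder, not a published value.
total_unc = 0.15                           # quoted XBT accuracy (deg C)
systematic_unc = 0.10                      # hypothetical correlated part
random_unc = math.sqrt(total_unc ** 2 - systematic_unc ** 2)
# random_unc is then ~0.112 deg C under these assumptions
```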

© Crown copyright Met Office How well do we know sub-surface observation uncertainties? Sensor accuracies taken from the literature tend to be overly optimistic relative to assessments of measurement error uncertainty in quality-controlled data recovered from deployed instruments, for example:
- Drifting buoy (sea surface temperature): specified accuracy 0.05°C (1σ-level accuracy recommended by WMO for operational drifting buoys; DBCP website documentation) vs. assessed uncertainty ~ °C / 0.25°C (Kennedy et al. [2013], Table 2, review of in situ SST uncertainty / WMO guide to typical drifting buoy uncertainty; DBCP website documentation)
- GTMBA moored buoy (sea surface temperature): specified accuracy 0.02°C (next-generation ATLAS mooring sensor specification; PMEL TAO website) vs. assessed uncertainty 0.12°C (Kennedy et al. [2011] assessment of tropical mooring measurement error uncertainty using AATSR)
- Animal-mounted CTD (sub-surface temperature): specified accuracy 0.005°C (southern elephant seal Sea Mammal Research Unit CTD-SRDL; WOD13 documentation) vs. assessed uncertainty 0.05°C (Roquet et al. [2013] estimated accuracy of post-processed CTD-SRDL (built after 2007) seal measurements in the MEOP-CTD database)
- Argo CTD (salinity): specified accuracy psu (Seabird-41 CTD for ALACE float specifications; WOD13 documentation) vs. assessed uncertainty ~0.01 psu? (Thadathil et al. [2012] suggest biases up to ~0.02 psu in delayed-mode Argo vs. CTD)

© Crown copyright Met Office What next? How can we refine our uncertainty estimates?
- By further collating community expertise and knowledge
- Difficult to do unilaterally!
[Figure: annual drift of SALD (representing salinity drift in float CTD) for the four selected floats, with linear fits. Data points are plotted only for the years that have CTD matchups. From Thadathil et al. [2012], J. Atmos. Oceanic Technol.]

© Crown copyright Met Office What next? How can we refine our uncertainty estimates?
- By considering finer metadata subdivisions (e.g. Argo sensor type)
- By considering less prevalent observation types if important at some time or place (e.g. bottle data, profiling drifting buoys)
[Figure: from Atkinson et al., submitted to J. Geophys. Res.; excerpt from the WOD13 instrument type code table, v_5_instrument.txt]

© Crown copyright Met Office What next? How can we refine our uncertainty estimates?
- By inter-comparing different observation types (e.g. XBT ↔ CTD, surface ↔ near-surface profile obs, satellite ↔ near-surface profile obs)
- By using feedback information from (re)analysis?
[Figure: distribution of Argo float and XBT temperatures (shallowest ob in the depth range 4-6 m) minus near-coincident drifting buoy temperatures.]
[Figure: global temperature anomaly time series relative to the average, calculated for the sea surface (red) and the near-surface (0-20 m) layer (blue). Time series are obtained by area-weighted averaging of 5x5-degree box anomalies. Inset maps show spatial distribution and number of observations/profiles for selected time periods. From Gouretski et al. [2012], Geophys. Res. Lett. 39:L19606.]
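Inter-comparison of collocated observations can be summarised with simple statistics: the mean of the pairwise differences estimates the relative bias between two observation types, and their scatter bounds the combined random uncertainty of the pair. The pairs below are invented numbers for illustration:

```python
import statistics

# Sketch of an inter-comparison diagnostic for near-coincident pairs,
# e.g. (shallowest XBT ob, drifting-buoy SST). Values are made up.
pairs = [(15.1, 15.0), (14.9, 14.7), (15.3, 15.2), (15.0, 15.1)]

diffs = [a - b for a, b in pairs]
mean_diff = statistics.mean(diffs)   # estimate of relative bias (deg C)
sd_diff = statistics.stdev(diffs)    # combined random scatter of the pair
```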

© Crown copyright Met Office What next? Quantifying and refining uncertainties in sub-surface observations will require significant effort from the ocean observation community. Efforts should probably be prioritised based on user requirements, but who are the users and what might their needs be?

© Crown copyright Met Office What next? Efforts should probably be prioritised based on user requirements, but who are the users and what might their needs be?
1. Ocean model data assimilation (e.g. coupled and ocean reanalyses, monthly-to-decadal forecasting) → the ERA-CLIM2 project may help
Requirements? Unbiased observations with an estimate of random measurement uncertainty
Priorities? Multiple realisations of sub-surface data that explore the uncertainties in existing bias correction schemes (structural and perhaps non-structural), and the creation of new bias correction schemes where needed; refined estimates of random uncertainty, provided that this is comparable in magnitude to representativeness error

© Crown copyright Met Office What next? Efforts should probably be prioritised based on user requirements, but who are the users and what might their needs be?
2. Climate monitoring
Requirements? Multi-decadal time series of key climate variables with bias adjustments as appropriate and carefully quantified uncertainties
Priorities? Multiple realisations of sub-surface data that explore the uncertainties in existing bias correction schemes (structural and perhaps non-structural), and the creation of new bias correction schemes where needed; decomposed and refined estimates of uncertainty terms, including correlated uncertainties

© Crown copyright Met Office What next? 2. Climate monitoring
[Figure: global average temperature anomaly for m. Each time series has had a different set of XBT bias adjustments applied, except the black line. The grey area is an estimate of the 95% confidence range of the unadjusted data due to uncertainties caused by the limited sampling. All time series are relative to a climatology based on the data with the Levitus et al. (2009) bias adjustments applied. Courtesy S. Good.]
[Figure: time series of estimated uncertainties arising from various sources in the global annual sea surface temperature area average (from Figure 8 of Kennedy [2013], Rev. Geophys.).]

© Crown copyright Met Office What next? 3. Satellite validation
Requirements? High-quality near-surface temperature measurements suitable as reference data for validating satellite SST retrievals and analyses
Priorities? Observations made in the near-surface ocean (<10 m) with small, well-characterised uncertainties and significant coverage in time and space over the satellite era, e.g. Argo, CTD
[Figure: positions of Argo array floats that have delivered data in the last 30 days (taken from 27/05/14)]

© Crown copyright Met Office Summary Assigning Uncertainties to Individual Observations: Experience from Building HadIOD
As part of creating the Met Office Hadley Centre Integrated Ocean Database (HadIOD), and with a reanalysis motivation in mind, we have attempted to allocate bias adjustments and uncertainties to sub-surface profile observations. This was done largely at platform-type level. In HadIOD, uncertainty is decomposed into random measurement uncertainty and bias correction uncertainties. For sub-surface profile observations, our knowledge of these uncertainties is perhaps more limited than for surface-only observations. Structural uncertainties can be explored by attaching multiple realisations of bias corrections and uncertainties to the observations. Refining our understanding of uncertainties will likely require effort from across the observations community, and should probably be driven by understanding user needs.