S5P Ozone Profile (including Troposphere) Verification: RAL Algorithm
R. Siddans, G. Miles, B. Latter
S5P Verification Workshop, MPIC, Mainz, 19-20 May 2015
Outline
- RAL Algorithm
- Initial Results
- Verification plans and next steps
RAL UV Ozone Scheme
ESA Climate Change Initiative - Essential Climate Variable (ECV):
- Producing climate-quality, long-term datasets from satellite measurements
- The RAL scheme provides the O3 ECV UV nadir profile product for:
  - GOME (1996-2011)
  - SCIAMACHY (2002-2012)
  - OMI (2005-2015)
  - GOME-2A & B (2007-present)
RAL currently produces NRT profiles for GOME-2, as part of a trial assimilation into the ECMWF analysis.
7-year GOME-2A lower-tropospheric ozone climatology (2007-2013)
[Figure: monthly maps, January through December]
RAL GOME-2 Ozone Scheme Overview
Three-step retrieval: band 1a, surface albedo, band 2b.
- Uses sun-normalised radiance in the Hartley and Huggins bands to measure ozone in the Earth's atmosphere
- Forward model includes Rayleigh and cloud scattering and the surface
- The Huggins bands carry information on tropospheric ozone, but require a fit precision better than 0.1%
- For band 1, absolute calibration is important, especially for stratospheric ozone; for band 2, a good estimate of the noise is important for the fit precision needed for tropospheric ozone
- Achieved fit residuals < 0.1%
[Figure: measured spectra in the Huggins bands showing ozone absorption, cloud-free vs. cloudy]
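The sub-0.1% fit precision needed in the Huggins bands is pursued by separating the fine ozone absorption structure from the smooth spectral background. As a rough illustration only (not the RAL code; the function name and the synthetic spectrum are invented for this sketch), a low-order polynomial baseline can be removed from the log of the sun-normalised radiance, leaving the differential absorption signal:

```python
import numpy as np

# Hedged sketch, not the RAL implementation: isolate the differential ozone
# absorption structure in the Huggins bands by removing a low-order polynomial
# baseline (smooth scattering/albedo variation) from the log sun-normalised
# radiance.

def differential_signal(wavelength_nm, radiance, solar_irradiance, poly_order=3):
    """Differential (baseline-removed) log sun-normalised radiance."""
    log_r = np.log(radiance / solar_irradiance)   # log sun-normalised radiance
    x = wavelength_nm - wavelength_nm.mean()      # centre for conditioning
    baseline = np.polyval(np.polyfit(x, log_r, poly_order), x)
    return log_r - baseline

# Synthetic spectrum: smooth baseline plus a small, fast "absorption" ripple
wl = np.linspace(325.0, 335.0, 200)               # Huggins-band region, nm
signal = 0.002 * np.sin(2 * np.pi * (wl - 325.0) / 0.6)   # 0.2% structure
log_sun_norm = -2.0 + 0.05 * (wl - 330.0) + signal
radiance = np.exp(log_sun_norm)                   # take solar irradiance = 1

diff = differential_signal(wl, radiance, np.ones_like(wl))
# 'diff' retains the fast absorption structure; the smooth baseline is removed
```

The point of the sketch is that the smooth part of the spectrum (here a linear baseline) is absorbed by the polynomial, so only the high-frequency absorption structure, which carries the tropospheric information, is left to be fitted.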
Main differences to the prototype algorithm
- Sequential fit, rather than a global fit to all measurements
- Wavelength range: uses channels 1-3 (not just 1-2)
- In the standard configuration, a different prior covariance (to allow the measurements to be fitted to sufficient precision in the troposphere)
- Differential fit (rather than a direct fit of the sun-normalised radiance) in the Huggins bands (see above)
- Other smaller differences (e.g. number of streams, number of retrieval levels, number of FM levels, use of a noise floor in some bands, ...)
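For a linear Gaussian problem, the sequential fit listed above is exactly equivalent to a global fit to all measurements: the posterior from the first band simply becomes the prior for the next. A minimal optimal-estimation sketch with hypothetical matrices (not the RAL processor) illustrating this equivalence:

```python
import numpy as np

# Hedged sketch with invented matrices, not the RAL processor: for a linear
# Gaussian retrieval, updating sequentially band-by-band (each band's
# posterior used as the prior for the next) reproduces the global fit exactly.

def oe_update(xa, Sa, K, Se, y):
    """One linear optimal-estimation (maximum a posteriori) update."""
    S_post = np.linalg.inv(K.T @ np.linalg.inv(Se) @ K + np.linalg.inv(Sa))
    x_post = xa + S_post @ K.T @ np.linalg.inv(Se) @ (y - K @ xa)
    return x_post, S_post

rng = np.random.default_rng(0)
n, m1, m2 = 4, 6, 5                       # state size, band-1/band-2 sizes
K1 = rng.normal(size=(m1, n))             # band-1 Jacobian (hypothetical)
K2 = rng.normal(size=(m2, n))             # band-2 Jacobian (hypothetical)
Sa = 2.0 * np.eye(n)                      # prior covariance
Se1, Se2 = 0.1 * np.eye(m1), 0.1 * np.eye(m2)
xa = np.zeros(n)
x_true = rng.normal(size=n)
y1, y2 = K1 @ x_true, K2 @ x_true         # noise-free for clarity

# Sequential: band-1 posterior becomes the prior for band 2
x1, S1 = oe_update(xa, Sa, K1, Se1, y1)
x_seq, S_seq = oe_update(x1, S1, K2, Se2, y2)

# Global: stack both bands and fit once
K = np.vstack([K1, K2])
Se = np.block([[Se1, np.zeros((m1, m2))], [np.zeros((m2, m1)), Se2]])
x_glob, S_glob = oe_update(xa, Sa, K, Se, np.concatenate([y1, y2]))
# x_seq/S_seq and x_glob/S_glob agree to numerical precision
```

In the real, non-linear scheme the two approaches can differ through linearisation and iteration details, which is one reason the settings still need to be unified before comparing against the prototype.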
Ozone Profile (inc. Troposphere) Verification Approach
Aim: ensure the prototype algorithm performs within the requirements (from the verification report).
Progress/status
- Linear simulations based on a simplified version of the RAL algorithm completed, but not yet directly compared with the Prototype or Bremen algorithm results
- Updated the RAL processor to work with S5P-like L1b files (also the potential to ingest OMI)
- Non-linear, iterative simulations mechanically working with the full RAL processor
  - Some adjustments still to be made
  - Settings still need to be fully unified with the Prototype/Bremen algorithms before finalising and comparing results
Linear simulations - error mapping
Estimated error of the retrieved subcolumn (using S5 noise and ISRF)
Linear simulations - error mapping
Estimated noise-error contribution to the retrieved subcolumn (using S5 noise and ISRF)
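In linear optimal-estimation terms, the two quantities mapped on these slides correspond to the posterior covariance and its noise-only component: the measurement-noise covariance is propagated through the gain matrix, and a subcolumn operator collapses the level-by-level covariance to a single error number. A sketch with hypothetical matrices (not the actual RAL error budget):

```python
import numpy as np

# Hedged sketch with invented matrices: in linear optimal estimation the
# posterior covariance splits exactly into a noise part (measurement noise
# mapped through the gain matrix G) and a smoothing part; a subcolumn
# operator w reduces either covariance to a scalar error estimate.

rng = np.random.default_rng(1)
n_lev, n_wl = 8, 30
K = 0.5 * rng.normal(size=(n_wl, n_lev))      # Jacobian (hypothetical)
Sa = 4.0 * np.eye(n_lev)                      # prior covariance
Se = 0.01 * np.eye(n_wl)                      # measurement-noise covariance

Se_inv = np.linalg.inv(Se)
S_hat = np.linalg.inv(K.T @ Se_inv @ K + np.linalg.inv(Sa))  # posterior cov.
G = S_hat @ K.T @ Se_inv                      # gain matrix
A = G @ K                                     # averaging-kernel matrix

S_noise = G @ Se @ G.T                        # noise-only contribution
I = np.eye(n_lev)
S_smooth = (A - I) @ Sa @ (A - I).T           # smoothing-error contribution

w = np.zeros(n_lev)
w[:3] = 1.0                                   # e.g. lowest three levels
sigma_total = np.sqrt(w @ S_hat @ w)          # estimated subcolumn error
sigma_noise = np.sqrt(w @ S_noise @ w)        # noise-error contribution
```

The identity S_hat = S_noise + S_smooth makes the decomposition shown on these two slides well defined: the noise-error map is always bounded by the total estimated error.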
Non-linear simulations - first effort!
- Simulations based on CAMELOT radiances generated by the Prototype algorithm
- Based on the minimum measurement-noise threshold requirement
- Integration time and slit-function shape not quite correct
- RAL algorithm not yet fully optimised for S5P-like data
CAMELOT Simulation Statistics
- Some biases evident throughout the profile; more levels potentially required in the FM (as indicated in the previous FM comparison).
- Improvement expected when algorithm settings are corrected (e.g. slit functions).
Next Steps
- Direct comparison of the results of the linear simulations (update with S5P rather than S5 settings?)
- Finalise settings for the non-linear retrieval simulations and compare results
- Comparison of convolved spectra (cf. the monochromatic FM comparison)
- Apply all algorithms to existing real data (that can be validated to some degree) and compare (which data? how much?)
- Comparison of some basic retrieval diagnostics based on some of the above (e.g. AKs, DFS)
  - Some comparison of estimated errors has been attempted for the linear simulations
- Agree a timeline to progress the work
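Of the basic retrieval diagnostics listed above, the degrees of freedom for signal (DFS) is the trace of the averaging-kernel matrix A = GK, and its diagonal indicates roughly where in the profile the retrieved information resides. A sketch with hypothetical matrices (not any of the algorithms being compared):

```python
import numpy as np

# Hedged sketch with invented matrices: the DFS diagnostic is the trace of
# the averaging-kernel matrix A = G K; its diagonal shows how the independent
# pieces of information are distributed over the retrieval levels.

rng = np.random.default_rng(2)
n_lev, n_wl = 8, 30
K = 0.5 * rng.normal(size=(n_wl, n_lev))      # Jacobian (hypothetical)
Sa = 4.0 * np.eye(n_lev)                      # prior covariance
Se = 0.01 * np.eye(n_wl)                      # noise covariance

S_hat = np.linalg.inv(K.T @ np.linalg.inv(Se) @ K + np.linalg.inv(Sa))
A = S_hat @ K.T @ np.linalg.inv(Se) @ K       # averaging-kernel matrix

dfs_total = np.trace(A)                       # degrees of freedom for signal
dfs_per_level = np.diag(A)                    # per-level information content
```

Comparing DFS and averaging kernels between algorithms is a settings-sensitive but compact way to check that the retrievals are extracting comparable information from the same measurements.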
Open Questions
- Selection of real data to compare (when/where/which instrument)?
- Treatment of cloud?
- Mode of direct comparison?