Atmospheric Correction Inter-comparison eXercise

Atmospheric Correction Inter-comparison eXercise (ACIX)

WHAT?
An international collaborative initiative to inter-compare a set of atmospheric correction (AC) processors for Sentinel-2 and Landsat-8 imagery, in order to better understand the different uncertainty contributors and help improve the AC processors.

HOW?
- Definition of the inter-comparison protocol: all proposals were discussed at the 1st workshop, and the final inter-comparison procedure was agreed by all participants. The inter-comparison of the derived Bottom-of-Atmosphere (BOA) products is expected to contribute to understanding the different uncertainty contributors and to help improve the AC processors.
- Application of the AC processors: participants applied their AC schemes to a set of test sites, keeping the processing parameters constant.
- Analysis of the results: the ACIX coordinators processed the results submitted by all participants. All results were presented and discussed during the 2nd ACIX workshop.
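The coordinators' comparison of the submitted BOA products could, in its simplest form, look like the sketch below. This is an illustrative assumption, not published ACIX code: the function name `pairwise_band_stats` and the dict-of-arrays input layout are hypothetical.

```python
import numpy as np

def pairwise_band_stats(products):
    """For one spectral band and one test site, compare BOA reflectance
    between every pair of processors.

    products: dict mapping a processor name to a 2-D array of BOA
    reflectance (NaN where a pixel was flagged out, e.g. cloud).
    Returns {(name_a, name_b): (mean_difference, std_of_difference)}.
    """
    stats = {}
    names = sorted(products)
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            d = products[a] - products[b]
            d = d[np.isfinite(d)]  # keep only pixels valid in both products
            stats[(a, b)] = (float(d.mean()), float(d.std()))
    return stats
```

Restricting the statistics to pixels that are valid in both products mirrors the slides' point that quality flagging (cloud/high-aerosol) is a precondition for a meaningful comparison.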

WHO?
Twelve AC processors took part, with a mandatory run on Sentinel-2 (MSI) and an optional run on Landsat-8 (OLI):

1. ACOLITE [Belgium]
2. ATCOR [Germany]
3. Brockmann [Germany]
4. FORCE [Germany]
5. GA-PABT [Australia] (Canberra site only)
6. LaSRC [USA]
7. LAC [France]
8. MACCS [France]
9. OPERA [Belgium]
10. SCAPE-M [Germany]
11. SeaDAS [USA]
12. Sen2Cor [France & Germany]

Conclusions
- Many lessons learned for all processors that will help in improving them
- Learning about the configuration and performance of the other processors
- Quality flagging of pixels (cloud/high-aerosol) is important to guarantee a good AC
- ACIX results need to be further analysed and quality controlled
- Provision of per-pixel uncertainties is hampered by the quality of the cloud/high-aerosol masks

Recommendations
- Metrics in terms of global difference (combining mean and standard deviation)
- Results presented for each test site: per band, for all processors, as time series
- Consistency in the usage of the sun irradiance model
- A flag to indicate when processors switch to default values
- Harmonisation of the data processed by all participants, so that conclusions are drawn over the same sites and dates
- Development of a platform supporting online validation of the processors
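A "global difference" metric combining the mean and standard deviation can be read as an RMSD-style figure. The sketch below shows one plausible definition; the exact formula ACIX used is not given in the slides, so both the function name and the formula are assumptions.

```python
import numpy as np

def global_difference(boa, reference):
    """Combine accuracy (mean bias) and precision (std of the
    differences) into one number: sqrt(bias**2 + std**2), which is
    algebraically equal to the RMS of the differences."""
    diff = np.asarray(boa, dtype=float) - np.asarray(reference, dtype=float)
    bias = diff.mean()
    std = diff.std()
    return float(np.hypot(bias, std))
```

Collapsing bias and scatter into one number makes processors easy to rank, at the cost of hiding whether an error is systematic or random; the per-band, per-site presentation recommended above keeps that distinction visible.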

! All the plots and results of ACIX will be available soon on the CEOS Cal/Val portal: http://calvalportal.ceos.org/home

! Way Forward

Next steps for ACIX:
- Consolidate results/metrics (organizing committee), considering the conclusions and feedback
- Distribute the ACIX report to all participants and receive their feedback
- Videoconference to discuss and agree on the final report (target: September)
- Publication in a scientific journal
- All the plots and results of ACIX will be available on the CEOS Cal/Val portal: http://calvalportal.ceos.org/home

Plans for ACIX-2:
- Complete time series from L8/S2A/S2B covering a period of at least 1 year
- Refine the selection of sites
- Metrics: the same as in the 1st ACIX, plus comparison with RadCalNet and AERONET-OC, smoothness of time series, and consistency across tiles
- Assess the quality of cloud/shadow masks
- Go to higher spatial resolution (pixel scale)
- Single run (with adjacency correction, without BRDF/topography corrections)
- Harmonise the datasets between participants
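The "smoothness in time series" check could, for example, score the temporal noise of a per-pixel surface reflectance series. The sketch below is a hypothetical metric under that assumption; ACIX-2's actual definition may differ.

```python
import numpy as np

def temporal_roughness(series):
    """RMS of successive differences in a surface-reflectance time
    series; lower values mean a smoother series.  Surface reflectance
    usually changes slowly, so large jumps between consecutive
    acquisitions often indicate residual clouds or AC artefacts."""
    s = np.asarray(series, dtype=float)
    return float(np.sqrt(np.mean(np.diff(s) ** 2)))
```

Comparing this score across processors over the same pixels and dates (hence the harmonised datasets above) would make the time-series metric directly interpretable.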

Perspective of WGCV Chair
- Teams learned many lessons about where to improve their schemes, while at the same time increasing their confidence in them
- It will be difficult for some of the tailored schemes to go broader
- Flagging of results has improved
- Many lessons learned during the first exercise will help any follow-on activities
- Cloud mask characterization is needed
- A DEM at global scale with SRTM quality is sufficient for processing
- Pixel-by-pixel uncertainty reporting for interoperability is not feasible right now: the complexity of the problem prevents any confidence in a developed result, and more resources are needed to develop a pixel-by-pixel uncertainty and to validate it
- Uncertainties with view angle? Results at solar zenith angles >75° are suspect; view angle is not an issue out to MODIS scan angles
- BRDF? Diffuse light is typically not included in the codes. Correction to an equivalent nadir reflectance is not recommended, because knowledge of the surface type limits accuracy and the DEM is a limitation for surface slope correction
- Ground truth reflectance? LandValNet

Perspective of WGCV Chair
- Uncertainties from angle:
  - Results at solar zenith angles >75° are suspect
  - View angle is not an issue out to MODIS scan angles
- BRDF:
  - Diffuse light effects are typically not included
  - Correction to an equivalent nadir reflectance is not recommended, because knowledge of the surface type limits accuracy and the DEM is a limitation for surface slope correction
- Ground truth reflectance could be obtained from RadCalNet