Slide 1: EOS Program VALIDATION
EOS Field Campaign Experience
David Starr, Former EOS Validation Scientist
(presented for Steve Platnick, EOS Senior Project Scientist)
GOES-R Field Campaign Workshop, April 2015
Slide 2: EOS - A Science-Driven Program
- Algorithm development and validation were the responsibility of PI-led teams
- The end product was quality science results
- Field campaign validation data acquisitions were initiated and organized by the PIs
- Calibration was also primarily a PI responsibility
- All supported by the EOS Program Office
Slide 3: EOS Project Science Office Support
- Flight-hour cost matching
- Support for airborne simulators: MAS, MASTER, AirMISR, MOPITT-A, S-HIS
- Support for correlative measurement infrastructure: AERONET, MPLNET, BSRN, SURFRAD, etc.; HITRAN
- Calibration Office
- Coordination with NASA R&A programs
- EOS Validation NRAs (not open to team members)
Slide 4: EOS Validation Field Campaigns
- Critical in pre-launch development (the "Butler mandate")
- Typical hiatus before launch
- Post-launch field campaigns were typically integrated with R&A programs, in some cases so highly integrated as to be essentially a contribution to them
Context:
- Post-launch validation focused heavily on long-term observing networks and on leveraging other programs: AERONET, MPLNET, BSRN, SURFRAD, DoE ARM, etc.
- EOS Validation NRAs: a mix of selected independent data acquisitions and analyses
Slide 5: EOS Validation Strategy
[Diagram: satellites, surface networks, and field campaigns, each contributing remote sensing and in situ observations; the networks provide temporal context and the campaigns provide spatial context for validation.]
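The strategy sketched above ultimately reduces to pairing satellite retrievals with coincident correlative observations from ground networks or campaign platforms. The Python sketch below is a minimal, hypothetical illustration of such a spatiotemporal matchup; the Observation class, the 25 km / 30 min coincidence window, and all names are illustrative assumptions, not part of any EOS system (in practice each product team chose its own coincidence criteria).

from dataclasses import dataclass
from math import radians, sin, cos, asin, sqrt

@dataclass
class Observation:
    time_s: float   # observation time, seconds since some epoch
    lat: float      # latitude, degrees north
    lon: float      # longitude, degrees east
    value: float    # retrieved or measured geophysical quantity

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometers."""
    dlat = radians(lat2 - lat1)
    dlon = radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))

def matchups(satellite, ground, max_km=25.0, max_s=1800.0):
    """Pair each satellite retrieval with every ground observation that
    falls inside the space/time coincidence window (here 25 km and
    +/- 30 min; both are tunable per product and science requirement)."""
    pairs = []
    for s in satellite:
        for g in ground:
            if abs(s.time_s - g.time_s) <= max_s and \
               haversine_km(s.lat, s.lon, g.lat, g.lon) <= max_km:
                pairs.append((s, g))
    return pairs

# Illustrative usage: one satellite-like retrieval vs. one network-site record,
# 900 s apart at the same location -> one matched pair.
sat = [Observation(time_s=1000.0, lat=38.99, lon=-76.84, value=0.21)]
gnd = [Observation(time_s=1900.0, lat=38.99, lon=-76.84, value=0.19)]
print(matchups(sat, gnd))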
Slide 6: Lessons Learned
- End-to-end, birth-to-death integration of the algorithm, calibration (sensor), validation, and production processing teams is critical; access and interaction among them must be enabled
- Investigator commitment to, and insight into, the end-to-end system is crucial (it is easy to waste your resources!)
- Algorithms and the resulting data products must evolve over time, especially L2 data products
- A key result of some "validation" field campaigns is learning something that ultimately leads to algorithm improvement, or even to new products (reprocessing is necessary, and this alone is motivation for it)
- Some problems are more tractable than others; allow approaches appropriate to the discipline and science culture (listen to the science community!)
Slide 7: More Lessons Learned
- AERONET was critical for aerosol products; cruises were also highly valuable
- Chemistry similarly depends critically on existing networks (NOAA CMDL) and on funded supplemental aircraft soundings
- Clouds are extremely difficult: small-scale structure and variability in time and space, with consequent view-angle dependence; DoE ARM sampling helps address this
- AMSR-E had a unique degree of success: the Wakasa Bay data acquisition has proven hugely valuable over the past decade, and continues to be
- Atmospheric sounding: achieving a high degree of coincidence has a big impact (special DoE ARM soundings)
- Snow and ice cover: AMSR-E had an excellent field campaign in the Arctic, and similarly for sea ice in the Antarctic; the Rockies were problematic; soil moisture remains an open question
- Vegetation: unclear (e.g., SAFARI-2000); the emphasis was on vicarious calibration
- Ocean ecosystems: relied on well-organized community activities, which enabled incorporation and QA of independent correlative measurements; also MOBY. Take advantage of such activities; do not attempt to reinvent them.