1 CALIPSO: Validation activities and requirements Dave Winker NASA LaRC GALION, WMO Geneva, 20-23 September 2010.


1 CALIPSO: Validation activities and requirements Dave Winker, NASA LaRC GALION, WMO Geneva, 20-23 September 2010

2
- 705 km, sun-synchronous orbit
- CALIOP: backscatter Nd:YAG lidar (532, 532-perp, 1064 nm)
- Launch: 28 April 2006

3 Lidar Data Products
Level 1 (geolocated and calibrated)
- DP 1.1 – profiles of attenuated lidar backscatter (532, 532-perp, 1064 nm)
- DP 1.2 – IR radiances (8.65, 10.6, 12.05 µm)
- DP 1.3 – visible radiances (650 nm) (WFC)
Level 2
- DP 2.1A – Cloud/Aerosol layer product: layer base and top heights, layer-integrated properties
- DP 2.1B – Aerosol profile product: backscatter, extinction, depolarization profiles
- DP 2.1C – Cloud profile product: backscatter, extinction, depolarization, ice/water content profiles
- DP 2.1D – Vertical feature mask: cloud/aerosol locations
(cloud & aerosol Level 3 products in development)

4 Scope of CALIPSO validation
- Validate instrument performance: calibration, SNR, linearity, transient recovery, etc.
- Establish/verify the basis for algorithm assumptions: Sa, Sc, spectral independence of cirrus backscatter, etc.
- Quantify the random and bias errors in Level 2 products
  – identify sources of errors where possible: instrument performance, inadequate retrieval model, retrieval assumptions, etc.
  – validate the parameter uncertainties provided in Version 3
- Level 3 products
  – validation of time-space-averaged properties presents unique challenges
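Quantifying random and bias errors, as in the slide above, comes down to comparing space/time-matched retrieved and reference profiles. A minimal sketch, using hypothetical numbers (none of these values come from CALIOP data):

```python
import numpy as np

# Minimal sketch (hypothetical data, not the CALIPSO validation code):
# the mean difference captures the systematic (bias) error, while the RMSE
# folds in the random component as well.
def bias_and_rmse(retrieved, reference):
    """Mean bias and RMSE between space/time-matched profiles."""
    diff = np.asarray(retrieved, dtype=float) - np.asarray(reference, dtype=float)
    return diff.mean(), np.sqrt((diff ** 2).mean())

reference = np.array([1.0, 2.0, 3.0, 4.0])
retrieved = reference + 0.1  # a purely systematic +0.1 offset
bias, rmse = bias_and_rmse(retrieved, reference)
print(bias, rmse)  # for a pure offset, bias and RMSE are both ~0.1
```

For a purely systematic error the bias and RMSE coincide; a random component inflates the RMSE above the bias, which is what separating the two error sources relies on.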

5 Approaches to acquiring validation data
Long-term surface sites
- cost is minimal
- instruments well calibrated and characterized
- spatial matching requirements make some sites more useful than others
- can compare surface statistics with regional CALIOP statistics, but the nadir-only view means it takes a long time to build up statistics
Field campaigns
- can provide the comprehensive measurement suites required to fully understand retrieval performance
- can provide spatially and temporally matched data in any location
- historically, the number of independent samples obtained is limited
- campaigns can be large and complex or small and focused; large campaigns are difficult to control, since validation is one of many objectives
- impractical, or only rare opportunities, to take campaigns to many desired locations
Other satellites
- spatial matching problems with surface sites increase the attractiveness of satellite data for validation
- the A-Train provides a large coincident data set from many instruments
- many CALIOP measurements are unique and not available from other satellites; passive satellites are more useful as 'sanity checks' than for true validation

6 Tracks per 5°x5° grid cell over 16 days
[Figure: map of the number of CALIOP ground tracks per 5°x5° grid cell over a 16-day repeat cycle]

7 CALIOP Validation Needs (1/2)
All data used for validation must be accompanied by error bars.
532 nm calibration
- need to assess latitudinal dependence
- currently based on HSRL comparisons: 10N – 70N, no SH data
- is an accurate assessment (< 5%) from ground-based lidar possible?
1064 nm calibration
- accurate 1064 nm calibration is difficult for all lidars (?)
- currently relying on comparisons with 532 nm returns from clouds and the ocean surface to assess the 1064 nm calibration
Aerosols
- detection sensitivity fairly well established, but it is sometimes useful to quantify what is missed (e.g., the Arctic)
- extinction profiles and AOD
(continued)

8 CALIOP Validation Needs (2/2)
The things that AOD and extinction depend on:
- calibration
- cloud-aerosol discrimination
- correct classification of aerosol type
- lidar ratio
  – LaRC HSRL provides 532 nm lidar ratios over the US/Canada; must be supplemented by ground-based measurements for additional spatial/temporal coverage
  – need better information on 1064 nm lidar ratios
- multiple-scattering corrections
- etc.
Uncertainty parameters are included in Version 3, primarily for extinction and AOD.
Monthly gridded data (Level 3)
- ground-based network measurements are required to validate Level 3 products, especially profiles
- aircraft data are too limited in space/time
- other satellites do not provide profiles
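The dependence of AOD on the assumed lidar ratio can be pictured with a short, self-contained sketch. The scaling below is a deliberate simplification (the actual CALIOP retrieval also accounts for attenuation, multiple scattering, etc.), and all numbers are illustrative assumptions:

```python
import numpy as np

# Illustrative sketch only, not CALIOP algorithm code: scale particulate
# backscatter by an assumed lidar ratio (Sa) to get extinction, then
# trapezoid-integrate extinction over altitude to get AOD.
def aod_from_backscatter(altitude_km, backscatter_km_sr, lidar_ratio_sr):
    """AOD from backscatter (km^-1 sr^-1), lidar ratio (sr), altitude (km)."""
    extinction = lidar_ratio_sr * np.asarray(backscatter_km_sr, dtype=float)
    z = np.asarray(altitude_km, dtype=float)
    return float(np.sum(0.5 * (extinction[1:] + extinction[:-1]) * np.diff(z)))

# Toy profile: constant backscatter of 1e-3 km^-1 sr^-1 from 0 to 2 km.
z = np.linspace(0.0, 2.0, 201)
beta = np.full_like(z, 1.0e-3)
aod_smoke = aod_from_backscatter(z, beta, lidar_ratio_sr=70.0)   # smoke-like Sa
aod_marine = aod_from_backscatter(z, beta, lidar_ratio_sr=20.0)  # marine-like Sa
print(aod_smoke, aod_marine)
```

Because extinction scales linearly with Sa here, misclassifying the aerosol type (e.g., treating smoke as marine) rescales the retrieved AOD by the ratio of the two lidar ratios, which is why aerosol typing sits on the list above.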

9 Validation from LaRC HSRL
A key resource for CALIOP validation:
- aerosol backscatter and depolarization at 532 nm and 1064 nm
- aerosol extinction via the HSRL technique at 532 nm
- flies on the NASA LaRC King Air: >100 CALIPSO underflights

10 Validation of Level 1 profiles with LaRC HSRL
[Figure: matched HSRL and CALIPSO attenuated backscatter profiles at 532 nm and 1064 nm, with calibration constants C_532 and C_1064 and mean differences of -2% and 1%]

11 HSRL comparisons used to assess CALIOP 532 nm calibration
- mean daytime bias = 4.4%
- mean nighttime bias = 4.91%
- comparison uncertainty ~4-5%

12 HSRL measurements of Sa
What is the typical range of Sa for continental aerosol?
[Figure: HSRL Sa measurements from Maryland (CATZ) and Oklahoma (CHAPS)]

13 Sa comparisons from the most recent campaign (August 2010)
From co-located HSRL-CALIOP measurements, with the HSRL observations partitioned according to CALIOP aerosol typing.
[Figure: Sa distributions by CALIOP aerosol type: marine, "polluted dust", dust, smoke]

14 What are we missing? (Eureka HSRL, 82° N)
[Figure: Eureka HSRL backscatter (km^-1 sr^-1), Sept. 9]

15 Tropical cirrus: Nauru ARM lidar
Thorsen, Fu, & Comstock (ASR STM 2010)

16 w.r.t. EarthCARE: CALIPSO timeline
[Figure: mission timeline from pre-launch activities (L-8 months) through launch (L=0), LEOP activities (~L+7 days), platform and payload checkout (~L+45 days, L+135 days), assessment, validation, and science operations, with a preliminary data release followed by Version 1.0 data releases for Level 1 & 2a (L+18 months) and Level 2b (L+36 months)]

17 Long-term aerosol climate data
- Bridging from CALIPSO to EarthCARE and beyond is not trivial, especially if there is no on-orbit overlap
  – CALIOP: 532, 532-perp, 1064 nm
  – ATLID: 355, 355-perp, Sa
- Spectral aerosol backscatter and Sa need to be characterized (globally)
- Instrument designs will also result in differences in cloud-aerosol discrimination and aerosol type identification
- Long-term, widespread ground-based lidar networks are required
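The spectral-bridging problem above can be pictured with a power-law (Ångström) scaling of extinction between the two wavelengths; the exponent used here is an illustrative assumption, not a measured global value, and its variation with aerosol type is precisely why global characterization is needed:

```python
# Hedged sketch: extinction measured at 532 nm (CALIOP) scaled to 355 nm
# (ATLID) with an *assumed* Angstrom exponent. Real exponents vary with
# aerosol type and location.
def scale_extinction(ext_l1, l1_nm, l2_nm, angstrom_exp):
    """Power-law spectral scaling: ext(l2) = ext(l1) * (l2 / l1) ** (-a)."""
    return ext_l1 * (l2_nm / l1_nm) ** (-angstrom_exp)

ext_532 = 0.05  # km^-1, illustrative value
ext_355 = scale_extinction(ext_532, 532.0, 355.0, angstrom_exp=1.5)
print(ext_355)  # larger than ext_532: extinction grows toward the UV for a > 0
```

An uncertainty of a few tenths in the assumed exponent translates into a several-percent shift in the scaled extinction, so an uncharacterized spectral dependence would appear as an artificial step in a merged CALIOP-ATLID record.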

18 Validation lessons learned
- Calibration via molecular normalization requires validation
- Validation of CALIOP is greatly complicated by sampling issues related to the "zero swath"
  – space/time-matched observations are necessary to eliminate questions due to matching errors
  – different degrees of inhomogeneity for aerosols and clouds result in different validation strategies
- Thorough validation in one region ≠ global validation; validation in many different regions is required
- Dedicated aircraft campaigns (LaRC HSRL, CC-VEX) are much more flexible, and can be more productive, than large field campaigns
- In situ measurements are difficult to use because of limited sampling, but they can provide critical information not available by other means
- Validation is never finished: the need for validation continues after funded field campaigns are complete