Science Innovation Fund: Quantifying the Variability of Hyperspectral Shortwave Radiation for Climate Model Validation
Yolanda Roberts 1, Constantine Lukashin 1, Patrick Taylor 1
Collaborators: Peter Pilewskie 2, Daniel Feldman 3, William Collins 3, Zhonghai Jin 1, Xu Liu 1, Hui Li 1
1 NASA Langley, 2 CU-Boulder/LASP, 3 UC-Berkeley/LBNL
SIF 2013 Project Goals
How well do climate models reproduce the observed variability in Earth's climate system, and why?
Demonstrate the use of the information in highly accurate, hyperspectral shortwave reflectance measurements for climate model validation:
– Why use direct measurements of reflectance?
– Why hyperspectral sampling?
– Does shortwave hyperspectral model validation tell the modelers something new about model performance?
Importance of continuous spectral sampling for climate benchmarking
[Figure: SCIAMACHY's continuous spectral coverage compared with the discrete band sets of POLDER (9 bands), AVHRR (3), MODIS (19), APS (8), and VIIRS (11).]
Linking physical processes to observed variability using spectral information
Spectrally resolved reflectance exhibits annual and seasonal variability
Quantitative Comparison of Subspaces (Roberts et al. 2013, ACP)
1. PCA: decompose the SCIA reflectances and the OSSE reflectances into SCIA eigenvectors and OSSE eigenvectors.
2. Calculate the intersection of the two eigenvector subspaces (via SVD), giving SCIA transformed eigenvectors and OSSE transformed eigenvectors.
3. Spectrally decompose the intersection: the relationship between each pair of transformed eigenvectors, with range = [0, subspace dimension].
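For illustration, a minimal numpy sketch of this PCA-plus-SVD intersection measure, assuming reflectance arrays of shape (samples, wavelengths); the function and variable names are illustrative and not taken from the Roberts et al. (2013) code:

import numpy as np

def leading_eigenvectors(refl, k):
    # PCA step: leading eigenvectors of the mean-removed reflectances
    centered = refl - refl.mean(axis=0)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return vt[:k].T                      # (n_wavelengths, k), orthonormal columns

def subspace_intersection(scia_refl, osse_refl, n_pcs=6):
    u_scia = leading_eigenvectors(scia_refl, n_pcs)
    u_osse = leading_eigenvectors(osse_refl, n_pcs)

    # SVD of the cross-projection: singular values are the cosines of the
    # principal angles between the two eigenvector subspaces
    q, cosines, rt = np.linalg.svd(u_scia.T @ u_osse)

    # Transformed eigenvectors, aligned pair by pair
    scia_transformed = u_scia @ q
    osse_transformed = u_osse @ rt.T

    # Intersection measure: sum of squared cosines, in [0, subspace dimension]
    return scia_transformed, osse_transformed, float(np.sum(cosines**2))

An intersection value near the subspace dimension indicates that the two data sets span nearly the same spectral variability; a value near zero indicates little shared variability.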
Quantitative Comparison of Subspaces
Spectral Variability of Hyperspectral Shortwave Radiation: What have we learned?
Importance of continuous spectral sampling for climate benchmarking
– Contains the spectral information needed to link physical processes to observed variability
– Spectrally resolved reflectance exhibits regional, annual, and seasonal variability
Quantitative comparison using spectral information in shortwave hyperspectral reflectance
– At the beginning of the 21st century, OSSE and SCIAMACHY reflectance have similar variability.
SCIAMACHY validation product
SCIAMACHY CLARREO-like validation product
– Spectral Resolution: 8 nm FWHM
– Spectral Sampling Resolution: 4 nm
– Spatial Sampling: 5.625 degrees (T85 * 4)
– Temporal Sampling: monthly averages
– Output Format: netCDF
– Variables Included: clear-sky and all-sky reflectance and radiance, surface scene type IDs from the IGBP database, cloud optical properties, etc.
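As a usage sketch only, a monthly-mean field from such a netCDF product could be read with xarray; the file name, variable name, and dimension names below are assumptions for illustration, not the actual product layout:

import xarray as xr

# Hypothetical file and variable names; check the product documentation
ds = xr.open_dataset("scia_clarreo_like_monthly.nc")

# All-sky reflectance on the 5.625-degree grid, sampled every 4 nm
refl = ds["all_sky_reflectance"]          # assumed dims: (time, lat, lon, wavelength)

# Regional-mean spectrum for one month (Arctic Ocean box used later in the talk)
arctic = refl.sel(lat=slice(73, 85), lon=slice(-170, -135))
print(arctic.isel(time=0).mean(dim=("lat", "lon")))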
Compare to OSSE output
– Generating 2003-2010 OSSE output to correspond with ENVISAT orbital information (10 AM descending node)
– MODIS monthly average surface products instead of climatology
– SORCE Total Solar Irradiance
Comparing decadal trends
What secular trends are there in the observed decadal temporal variability, and what physical processes drive these secular trends?
– Regional changes: e.g. Arctic Ocean, Eastern US, sub-Saharan Africa, Greenland, Amazon
– What are the differences among broadband, multispectral, and hyperspectral data sets in detecting and attributing those changes?
– How do these trends compare between observations and model simulations?
[Figure: region Lon -170.000 to -135.000, Lat 73.0000 to 85.0000]
Quantifying Difference in Information
[Figure: region Lon -170.000 to -135.000, Lat 73.0000 to 85.0000]
CERES EBAF (Energy Balanced and Filled) Level 4 data product: TOA SW flux change per decade
Locations where the trend is significant at 1σ
Decadal spectral reflectance trends: trends significant at 1σ
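The slides do not spell out the significance test; a common reading of "significant at 1σ" is that the fitted slope exceeds its own standard error. A per-wavelength sketch under that assumption (array shapes and names are illustrative):

import numpy as np
from scipy import stats

def decadal_trend(time_years, reflectance):
    # time_years  : (n_times,) decimal years, e.g. 2003.0 ... 2010.9
    # reflectance : (n_times, n_wavelengths) regional-mean spectral reflectance
    n_wl = reflectance.shape[1]
    trend_per_decade = np.empty(n_wl)
    significant_1sigma = np.empty(n_wl, dtype=bool)
    for i in range(n_wl):
        fit = stats.linregress(time_years, reflectance[:, i])
        trend_per_decade[i] = fit.slope * 10.0      # reflectance change per decade
        significant_1sigma[i] = abs(fit.slope) > fit.stderr
    return trend_per_decade, significant_1sigma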
Validating Climate Model Output
– Compare SCIA and OSSE spectral decadal trends
– Compare spectral variability using PC spectral shapes
– Quantify data set differences and similarities
– Utilize the distance metric from the intersection comparison method
Attribution: If models and observations differ, why?
– Radiative response using shortwave spectral kernels
– PCRTM spectral fitting
– Intersection Database Method
  – Use the intersection to efficiently match the spectral shape of observations to simulated spectra
  – Quickly matching the spectral shapes links model physical inputs to the drivers of observed data variance
Intersection Database Method
Measurement space: SCIA reflectances, database reflectances, database physical inputs. PCA space: SCIA PCA scores. Intersection space: SCIA shared intersection scores, database shared intersection scores.
1. For each PC, find the SCIA spectra corresponding to scores more than 3 standard deviations from the mean.
2. Using the spectra found in (1), calculate the Euclidean distance between the corresponding shared-intersection SCIA scores and all database intersection scores.
3. Find the minimum Euclidean distance for each spectrum; this identifies the database spectrum with the closest spectral shape to the SCIA spectrum of interest.
4. Examine the database inputs used to simulate those reflectances to understand which model inputs drive the measured variance.
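A minimal numpy sketch of steps 1-3, assuming the scores have already been projected into the shared intersection space; the array and function names are illustrative and not from the project code:

import numpy as np

def match_extreme_spectra(scia_scores, scia_int_scores, db_int_scores, n_sigma=3.0):
    # scia_scores     : (n_scia, n_pcs) SCIA PCA scores
    # scia_int_scores : (n_scia, k)     SCIA scores in the shared intersection space
    # db_int_scores   : (n_db, k)       database scores in the same intersection space
    matches = {}
    for pc in range(scia_scores.shape[1]):
        col = scia_scores[:, pc]
        # Step 1: SCIA spectra whose score on this PC is > n_sigma from the mean
        extreme = np.where(np.abs(col - col.mean()) > n_sigma * col.std())[0]
        # Step 2: Euclidean distances between their shared-intersection scores
        # and every database intersection score
        dists = np.linalg.norm(
            scia_int_scores[extreme, None, :] - db_int_scores[None, :, :], axis=-1)
        # Step 3: the minimum distance picks the database spectrum whose
        # spectral shape is closest to each extreme SCIA spectrum
        matches[pc] = list(zip(extreme.tolist(), dists.argmin(axis=1).tolist()))
    # Step 4 (outside this sketch): look up the physical inputs of the matched
    # database spectra to see which model inputs drive the measured variance.
    return matches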
What's Next? Beyond the 2013 SIF
– What does delivery look like for modeling groups? We will have tested our methods using the CCSM3 model; how do other models compare?
– With no CLARREO SW instrument yet, we can convince modelers of the importance of using available data (MODIS/SCIAMACHY radiance/reflectance) for model validation
– Continued attribution efforts
– Publish initial results
– Explore further funding options to expand upon project results