Evaluation of cloudiness simulated by climate models using CALIPSO and PARASOL. H. Chepfer, Laboratoire de Météorologie Dynamique / Institut Pierre Simon Laplace


Evaluation of cloudiness simulated by climate models using CALIPSO and PARASOL. H. Chepfer, Laboratoire de Météorologie Dynamique / Institut Pierre Simon Laplace, France. Contributors: S. Bony, G. Cesana, J.-L. Dufresne, D. Konsta (LMD/IPSL); D. Winker (NASA/LaRC); D. Tanré (LOA); C. Nam (MPI); Y. Zhang, S. Klein (LLNL); J. Cole (U. Toronto); A. Bodas-Salcedo (UKMO).

Clouds are the main source of uncertainty in climate change projections. Reducing this uncertainty requires a thorough evaluation of the description of clouds in climate models.

Evaluation of clouds in climate models is generally based on monthly mean TOA fluxes from ERBE, ScaRaB, CERES, and ISCCP (e.g. Zhang et al. 2005, Webb et al. 2001, Bony et al. 2004). Problem: a good reproduction of monthly mean TOA fluxes can result from compensating errors that yield the same TOA fluxes, and these errors impact the cloud feedback predicted by climate models. Errors compensate between (a) the cloud vertical distribution, (b) the cloud optical thickness and the total cloud cover (vertically integrated values within a lat x lon grid box), and (c) instantaneous versus monthly time averaging. Unraveling these error compensations requires: (1) independent, simultaneous observations of cloud cover, vertical structure, and cloud optical depth; (2) instantaneous as well as monthly mean observations; (3) methods for consistent model/obs comparisons.
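As a toy illustration of such a compensation (not from the slides: the saturating cloud-albedo formula and all numbers below are assumptions chosen only to make the point), two models with different cloud cover and optical depth can reflect nearly the same amount of shortwave radiation at TOA:

```python
def toy_albedo(cloud_cover, tau, g=0.85, clear_albedo=0.06):
    """Toy grid-box albedo: clear-sky part plus cloud cover times a simple
    two-stream-like cloud albedo that saturates with optical depth."""
    cloud_albedo = (1 - g) * tau / (1 + (1 - g) * tau)
    return (1 - cloud_cover) * clear_albedo + cloud_cover * cloud_albedo

# Large cover of thin clouds vs. smaller cover of thick clouds: ~same TOA albedo
print(toy_albedo(cloud_cover=0.70, tau=5.0))   # ~0.32
print(toy_albedo(cloud_cover=0.45, tau=12.0))  # ~0.32
```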

A-Train comparison framework. Observation simulators (CFMIP-OSP CALIPSO/PARASOL: subgridding, overlap, detection, statistics, etc.; Chepfer, Bony et al. 2008) are applied to the model to produce simulated CALIPSO-like and PARASOL-like datasets. On the observation side, dedicated datasets (CALIPSO-GOCCP, see poster by G. Cesana, and PARASOL reflectances) are built with a specific data processing starting from Level 1, with a spatio-temporal resolution and detection threshold consistent with the simulator (Chepfer, Bony et al., submitted). Comparing raw model output with standard satellite products would not be consistent; this method of comparison is consistent and ensures that model/obs differences are due to model deficiencies.

Models participating: LMDZ4, IPSLnew, CAM3, CCCMA, ECHAM-5, ECMWF, NICAM, UKMO. Model experiment: year 2007, forced SST, CALIPSO/PARASOL simulator run, outputs as monthly means and daily, diagnostics (d1 to d4). A preliminary inter-comparison to evaluate cloudiness in climate models using four diagnostics: (d1) cloud cover; (d2) cloud vertical distribution; (d3) cloud optical thickness; (d1)+(d3) cloud cover / optical thickness relationship; (d4) cloud "type".

(d1) Cloud cover, observations. CALIPSO-GOCCP: GCM-Oriented CALIPSO Cloud Product (built from CALIOP Level 1), an observational dataset fully consistent with the COSP/CALIPSO outputs. CALIPSO-GOCCP is compared with other cloud climatologies in Chepfer et al., 2009, JGR, submitted. Cloud detection at a horizontal resolution of 330 m. See poster by G. Cesana.

(d1) Cloud cover model evaluation: maps of total cloud cover from CALIPSO-GOCCP observations and from CCCMA, CAM3.5, ECHAM5, LMDZ4, and LMDZnew, each run through the simulator.

(d2) Cloud vertical distribution: zonal mean cloud fraction (0 to 0.3) from CALIPSO-GOCCP observations and from LMDZ4, CCCMA, CAM3.5, and ECHAM5 with the simulator. The models overestimate high clouds and underestimate tropical low clouds, congestus, and midlatitude mid-level clouds.

(d2) High clouds model evaluation: maps of high-cloud cover from CALIPSO-GOCCP observations and from CCCMA, CAM, ECHAM5, LMDZ4, and LMDZnew with the simulator.

(d2) Mid clouds model evaluation: maps of mid-level cloud cover from CALIPSO-GOCCP observations and from CCCMA, CAM, ECHAM5, LMDZ4, and LMDZnew with the simulator.

(d2) Low clouds model evaluation: maps of low-cloud cover from CALIPSO-GOCCP observations and from CCCMA, CAM3.5, ECHAM5, LMDZ4, and LMDZnew with the simulator.

(d3) Cloud optical depth: method for the model/obs comparison. The standard approach is not consistent: on the model side, tau is computed on the 2° x 2° grid from iwc and lwc = f(day, lat, lon) with assumed phase functions PFice', PFliq', giving tau-cloud-mod = f(day, lat, lon); on the observation side, satellite tau retrievals at 10 km to 1 km start from Refl_obs = f(day, lat, lon) and add cloud detection, sunglint correction, and hypotheses on PFice, PFliq, giving tau-cloud-obs = f(day, lat, lon). The alternative, consistent approach is to use the reflectance as a proxy for tau: the reflectance is diagnosed as a model output by sub-gridding iwc and lwc (Klein and Jakob, 1999), locating the A-train orbit (lat, lon, day => thetaS), and performing a direct radiative transfer computation with PFice_mod and PFliq_mod. The PARASOL reflectance (865 nm) is taken as a subset of the observations in a fixed "no-glint" viewing direction (thetaV = 30°, phiV = 320°). Maps of Refl_mod(thetaS, tau) and Refl_obs(thetaS, tau) are then built. This ensures that model/obs differences are due to the model tau.
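A minimal sketch of how the proxy comparison could be organised (the array names, 2° grid, and gridding routine below are assumptions, not the actual CFMIP-OSP code): the PARASOL-like model reflectance and the observed 865 nm reflectance, both sampled along the orbit, are averaged onto the same lat/lon grid before being differenced.

```python
import numpy as np

def grid_monthly_mean(lat, lon, refl, dlat=2.0, dlon=2.0):
    """Average along-track (lat, lon, reflectance) samples onto a regular grid."""
    nlat, nlon = int(180 / dlat), int(360 / dlon)
    total = np.zeros((nlat, nlon))
    count = np.zeros((nlat, nlon))
    i = np.clip(((lat + 90.0) / dlat).astype(int), 0, nlat - 1)
    j = np.clip(((lon + 180.0) / dlon).astype(int), 0, nlon - 1)
    np.add.at(total, (i, j), refl)
    np.add.at(count, (i, j), 1)
    return np.where(count > 0, total / np.maximum(count, 1), np.nan)

# refl_mod: simulator output along the A-train orbit; refl_obs: PARASOL
# reflectance in the fixed "no-glint" direction, both as f(day, lat, lon).
# bias = grid_monthly_mean(lat, lon, refl_mod) - grid_monthly_mean(lat, lon, refl_obs)
```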

(d3) Cloud optical thickness model evaluation: maps of PARASOL reflectance from the observations (PARASOL-REFL) and from ECHAM5, CCCMA, LMDZ4, and LMDZ new physics with the simulator.

(d1+d3) Cloud cover / cloud optical thickness relationship, monthly means: reflectance versus cloud fraction for the observations and for CCCMA, ECHAM-5, LMDZ4, and LMDZ new physics with the simulator. The models roughly reproduce the monthly mean relationship between reflectance and cloud fraction.

(d1+d3) Cloud cover / cloud optical thickness relationship, instantaneous: reflectance versus cloud fraction from instantaneous data for the observations and for CCCMA, LMDZ4, and LMDZ new physics with the simulator, compared with the monthly mean observed relationship. The models have difficulty reproducing the instantaneous relationship between tau and cloud fraction, which is the one needed for cloud feedbacks.
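One way to construct the two curves (a sketch with assumed variable names, not the actual diagnostic code) is to bin the reflectance by cloud fraction, once from instantaneous collocated samples and once from monthly gridded means:

```python
import numpy as np

def binned_relation(cloud_fraction, reflectance, nbins=20):
    """Mean reflectance in cloud-fraction bins; apply to instantaneous
    collocated samples or to monthly mean grid-box values."""
    edges = np.linspace(0.0, 1.0, nbins + 1)
    idx = np.clip(np.digitize(cloud_fraction, edges) - 1, 0, nbins - 1)
    mean_refl = np.full(nbins, np.nan)
    for k in range(nbins):
        in_bin = idx == k
        if in_bin.any():
            mean_refl[k] = reflectance[in_bin].mean()
    return 0.5 * (edges[:-1] + edges[1:]), mean_refl

# Same call for both time scales; a model may match the monthly relationship
# (binned_relation(cf_monthly, refl_monthly)) while missing the instantaneous
# one (binned_relation(cf_instant, refl_instant)).
```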

Focus on the Tropics: (d1) cloud cover and (d3) optical thickness in dynamical regimes. Cloud cover and reflectance are composited as a function of the vertical velocity at 500 hPa (w500, hPa/day), which reveals error compensations between cloud cover and optical depth.
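A minimal sketch of this regime compositing (bin edges and variable names are assumptions; the study's actual binning may differ): each field is averaged in bins of w500 over the tropical oceans.

```python
import numpy as np

def composite_by_omega500(omega500, field, edges=np.arange(-100.0, 101.0, 10.0)):
    """Average a cloud field (cover or reflectance) in bins of the 500 hPa
    pressure velocity (hPa/day): negative = ascent, positive = subsidence."""
    idx = np.digitize(omega500, edges) - 1
    nbins = len(edges) - 1
    composite = np.full(nbins, np.nan)
    for k in range(nbins):
        in_bin = idx == k
        if in_bin.any():
            composite[k] = field[in_bin].mean()
    return 0.5 * (edges[:-1] + edges[1:]), composite

# bin_centers, cover_regime = composite_by_omega500(w500, cloud_cover)
# _, refl_regime = composite_by_omega500(w500, reflectance)
```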

(d4) Cloud types, focus on the Tropics: lidar "CFADs" (histograms of the lidar signal intensity, SR, as a function of pressure) over the tropical warm pool and the Hawaii trade-cumulus region, from CALIPSO-GOCCP observations, LMDZ4 + simulator, and LMDZ New Physics + simulator.
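These CFADs are 2-D occurrence histograms of scattering ratio versus pressure level; a minimal sketch (the SR bin edges below are assumptions) could be:

```python
import numpy as np

def lidar_cfad(sr_profiles, sr_edges=None):
    """Occurrence frequency of the lidar scattering ratio (SR) at each level.
    sr_profiles: array of shape (nprofiles, nlevels)."""
    if sr_edges is None:
        sr_edges = np.array([0.0, 1.2, 3.0, 5.0, 10.0, 20.0, 40.0, 80.0])
    nprofiles, nlevels = sr_profiles.shape
    cfad = np.zeros((nlevels, len(sr_edges) - 1))
    for k in range(nlevels):
        cfad[k], _ = np.histogram(sr_profiles[:, k], bins=sr_edges)
    return cfad / nprofiles  # one SR histogram per pressure level
```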

Conclusions. CALIPSO and PARASOL observations can help identify GCM error compensations: (1) between the vertically integrated cloud cover and the optical thickness, (2) between time scales (instantaneous vs. monthly mean), and (3) in the cloud vertical distribution. All the models overestimate the high cloud amount, underestimate the total cloud amount, and underestimate the tropical low-level oceanic cloud amount in subsidence regions. All models exhibit error compensations. None of the models reproduces the "cloud types" characterized by the lidar intensity, e.g. the two modes of low-level clouds and the congestus clouds. The physical interpretation of model/obs differences and of inter-model differences is only just starting.

The CALIPSO and PARASOL simulators are included in COSP, the "CFMIP Observation Simulator Package" (ISCCP, CloudSat, CALIPSO/PARASOL, and MISR simulators; UKMO, LLNL, LMD/IPSL, CSU, UW). Today, about 20 models use COSP. Observations: CALIPSO-GOCCP, PARASOL-REFL, CLOUDSAT-CFAD, CERES-EBAF, … (LMD/IPSL, UW, LOA, NASA/LaRC, …). This preliminary pilot inter-comparison will be extended to other climate models: the CFMIP-2 experiment (comparison with actual observations) and the WGCM/CMIP-5 experiment (Taylor et al. 2009), with inter-model comparisons via simulators (CO2 doubling, long term). References: Chepfer H., S. Bony, D. Winker, M. Chiriaco, J.-L. Dufresne, G. Sèze, 2008: Use of CALIPSO lidar observations to evaluate the cloudiness simulated by a climate model, Geophys. Res. Lett., 35, L15704, doi:10.1029/2008GL…; Chepfer H., S. Bony, D. Winker, G. Cesana, J.-L. Dufresne, P. Minnis, C. J. Stubenrauch, S. Zeng, 2009: The GCM-Oriented CALIPSO Cloud Product (CALIPSO-GOCCP), J. Geophys. Res., under revision.

A few other studies using the lidar simulator: dust: CHIMERE-dust (Africa, IPSL/LMD) vs. CALIPSO (Vuolo et al. 2009); clouds: WRF/MM5 (Mediterranean, IPSL/SA) vs. SIRTA (Chiriaco et al. 2006); clouds: MM5 vs. CALIPSO (Chepfer et al. 2007); clouds: Méso-NH (Pinty, LA) vs. CALIPSO (Chaboureau et al., submitted); pollution: CHIMERE (Europe, IPSL/LMD) vs. SIRTA (Hodzic et al. 2004); clouds: LMDZ vs. CALIPSO (Chepfer et al. 2008 and 2009). Perspectives: a generic simulator (ground-based, satellite, clouds, aerosols, regional); dedicated observations (GOCCP, CFMIP-OBS, …, G. Cesana); SIRTA, A-train + EarthCARE; towards a quantitative, systematic, statistical evaluation of clouds and aerosols in global and regional models.

End (backup slides follow).

(d3) Cloud optical thickness, observations. The PARASOL reflectance in a "well suited" direction is a proxy for the cloud optical depth (JFM; one PARASOL Level 1 image). Background: tau_cloud retrieved from satellite observations is uncertain because of hypotheses on the particle size, sunglint in the tropics, and the backscattering direction, which is highly sensitive to the shape of the particles. The specific viewing direction used here avoids the sunglint, nadir, and backscattering directions, and diagnosing the reflectance on the model side avoids uncertainties due to the hypotheses required in retrieval algorithms.


Focus on the Tropics, cloud vertical distribution: seasonal mean cloud fraction as a function of pressure and of w500 (hPa/day, log scale), from CALIPSO-GOCCP, LMDZ4 + simulator, and LMDZ New Physics + simulator.

PARASOL simulator. Model: LMDZ GCM outputs on 19 vertical levels and a 2.5° x 3.5° lat-lon grid: LWC(P, lat, lon), IWC(P, lat, lon), Re(P, lat, lon). SCOPS (Subgrid Cloud Overlap Profile Sampler) distributes the cloud condensate according to the cloud fraction and a cloud overlap assumption, producing subcolumns on the 19 vertical levels at lat/lon scales of a few km. Imposed by the model parametrizations: particle shape hypothesis (spheres for LMDZ), shape of the size distribution, parameterized optical properties P(theta, z), Cdiff(z), Cabs(z) for liquid and ice. The single-direction radiance is simulated with a radiative transfer computation parameterized from exact doubling-adding calculations: thetaS is parameterized as f(lat, lon, date) for the PARASOL orbit and the relation Reflectance_1dir = f(thetaS, tau) is applied, giving Refl_GCM (6 km). Observations: PARASOL Level 1 directional reflectances at 6 x 6 km², 14 viewing directions, 4 or 5 visible / near-IR wavelengths; the reflectance is extracted at a single wavelength (865 nm, mainly sensitive to clouds) and in a single direction (mainly sensitive to cloud tau, away from glitter, backscattering, and nadir), giving Refl_obs (6 km). The same cloud diagnostics are then computed for model and obs at the same resolution, and obs and model diagnostics are averaged spatially over the 2.5° x 3.5° lat-lon grid and temporally (month, season).
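A minimal sketch of the single-direction reflectance diagnostic described on this slide (function and array names are placeholders, and the look-up table is assumed to have been built offline from a doubling-adding code; this is not the actual simulator source): each subcolumn optical depth is converted to a reflectance via Reflectance_1dir = f(thetaS, tau) and the subcolumns are averaged.

```python
import numpy as np

def parasol_like_reflectance(tau_subcolumns, theta_s, lut_theta, lut_tau, lut_refl):
    """Grid-box mean 865 nm reflectance in the fixed PARASOL viewing direction.
    lut_refl[i_theta, i_tau] is a precomputed table of Reflectance_1dir(thetaS, tau)."""
    i = int(np.abs(lut_theta - theta_s).argmin())         # nearest solar zenith angle
    refl_sub = np.interp(tau_subcolumns, lut_tau, lut_refl[i, :])
    return refl_sub.mean()                                # average over the subcolumns

# tau_subcolumns: optical depths of the SCOPS subcolumns in one grid box;
# theta_s: solar zenith angle at the PARASOL overpass, f(lat, lon, date).
```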

Lidar simulator. Model: LMDZ GCM outputs on 19 vertical levels and a 2.5° x 3.5° lat-lon grid: LWC(P, lat, lon), IWC(P, lat, lon), Re(P, lat, lon). SCOPS (Subgrid Cloud Overlap Profile Sampler) distributes the cloud condensate according to the cloud fraction and a cloud overlap assumption, producing subcolumns on the 19 vertical levels at lat/lon scales of a few km. Imposed by the model parametrizations: particle shape hypothesis (spheres for LMDZ), shape of the size distribution, parameterized optical properties P(theta, z), Cdiff(z), Cabs(z) for liquid and ice. The lidar profile is simulated with the lidar equation, using a constant multiple scattering coefficient and a molecular signal computed from the model (P, T), giving SR_GCM (19 levels, 330 m). Observations: CALIOP Level 1 provides ATB(z, lat, lon) on 583 vertical levels at 330 m along the track, plus the molecular density (33 levels, lat, lon). ATBmol is computed by scaling the molecular density to the ATB in cloud/aerosol free regions (22-26 km) and averaging over 10 km (SNR); altitude is converted to pressure (GMAO), giving ATB and ATBmol on 583 pressure levels, which are then averaged over the 19 model vertical levels (strong increase of the SNR) to give SR_obs (19 levels, 330 m). The same cloud diagnostics are computed for model and obs at the same resolution: cloudy (SR > 3), clear (SR < 1.2), fully attenuated (SR < 0.01), unclassified (1.2 < SR < 3), plus spatial homogeneity index, cloud top height, phase index, etc. Obs and model diagnostics are averaged spatially over the 2.5° x 3.5° lat-lon grid and temporally (month, season).
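A minimal sketch of the SR classification applied identically to observed and simulated profiles (array names are assumptions and the ATBmol scaling / pressure conversion steps are left out; the thresholds are those given on the slide):

```python
import numpy as np

def classify_scattering_ratio(atb, atb_mol):
    """Scattering ratio SR = ATB / ATBmol and GOCCP-like flags:
    cloudy (SR > 3), clear (SR < 1.2), fully attenuated (SR < 0.01),
    unclassified (1.2 <= SR <= 3)."""
    sr = atb / atb_mol
    flags = np.full(sr.shape, "unclassified", dtype=object)
    flags[sr < 1.2] = "clear"
    flags[sr < 0.01] = "fully_attenuated"   # overrides "clear" where SR < 0.01
    flags[sr > 3.0] = "cloudy"
    return sr, flags

# atb: attenuated backscatter (observed, or simulated by the lidar simulator)
# averaged onto the 19 model levels; atb_mol: the corresponding molecular signal.
```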

Comparison CALIOP / LMDZ GCM, zonal means (Chepfer et al. 2008): zonal mean low, mid, and high cloud fractions from the GCM alone, the GCM + simulator, and CALIOP/GOCCP.

Why is it "complicated" to interface satellite data and GCMs for clouds? A cloud in the GCM world looks like a cube or a sum of cubes, fills a grid box (about 2.5°) or a subgrid column (0/1), follows an overlap assumption (random, maximum, or partial), is distinct from aerosols, and relies on strong microphysical approximations, but each GCM has its own (!!), etc. A cloud in the satellite world looks like whatever you want, can cover thousands of km² or less than 100 m, has infinitely many overlap configurations, is difficult to distinguish from aerosols, relies on strong microphysical approximations (but different from the GCM's), and is sometimes difficult to detect. Moreover: uncertainty on the measurements, signal-to-noise ratio, multiple scattering (or the RT equation), "tilt" vs. "no tilt".


"State of the art": evaluating clouds in GCMs from satellite observations. How are models evaluated? Cloud radiative forcing (W/m²) from observations (ERBE) compared with climate models (Zhang et al. 2005). Models are evaluated mainly from TOA fluxes (and ISCCP): basic, but fundamental, because this is the global energy budget of the ocean-atmosphere system. Consequences: all models reproduce the fluxes correctly, and other quantities can only be evaluated provided the fluxes are not degraded, otherwise the ocean-atmosphere-biosphere-cryosphere system explodes/implodes (in the model)!

Clouds and climate feedback (Dufresne and Bony, J. Clim.). Reducing this uncertainty requires a thorough evaluation of the description of clouds in climate models.