NCAR’s CFMIP/COSP contact

Analysis of CAM clouds with COSP (a lot has happened in the last year!)
Jen Kay (jenkay@ucar.edu), NCAR’s CFMIP/COSP contact
Collaborators: Steve Klein, Yuying Zhang, Jim Boyle (LLNL); Ben Hillman, Roger Marchand, Tom Ackerman (UW); Brian Medeiros, Andrew Gettelman, Brian Eaton, Ben Sanderson (NCAR); Robert Pincus (CU)

Why are satellite simulators (COSP) useful? When satellite simulators accurately mimic the observational retrieval process, they enable “apples-to-apples” comparisons between models and observations. [Slide diagram: climate model output passes through the satellite simulator, observed radiances pass through the satellite retrieval, and the two meet in a trusted comparison.]
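As a minimal illustration of the idea (this is not COSP itself; the detection threshold below follows the ISCCP/MISR limit of optical depth > 0.3 quoted later in the talk, and the data are synthetic), a simulator-style comparison applies the instrument's detection limit to the model's cloud field before computing cloud fraction:

```python
import numpy as np

def simulated_cloud_fraction(tau, tau_min=0.3):
    """Cloud fraction an ISCCP/MISR-like passive instrument would report:
    columns whose optical depth falls below the detection limit are
    counted as clear, mimicking the retrieval's blind spot."""
    tau = np.asarray(tau)
    return np.mean(tau > tau_min)

# Toy model field: many optically thin columns plus some thick ones.
rng = np.random.default_rng(0)
tau = rng.lognormal(mean=-1.0, sigma=1.5, size=10_000)

raw_fraction = np.mean(tau > 0.0)              # model's "true" cloud fraction
seen_fraction = simulated_cloud_fraction(tau)  # what the instrument would see
print(raw_fraction, seen_fraction)             # seen_fraction <= raw_fraction
```

Comparing `seen_fraction` (rather than `raw_fraction`) against the satellite product is what makes the comparison apples-to-apples: both sides are blind to the same thin clouds.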

Is it easy to evaluate CAM clouds with COSP? Yes!
../models/atm/cam/src/physics/cosp
../models/atm/cam/src/physics/cam/cospsimulator_intr.F90
COSP v1.3 (with local modifications) is validated for use with CAM4 and CAM5. The code is on the CAM trunk and in the CESM1 releases. For details, see http://www.cgd.ucar.edu/staff/jenkay/cosp/cosp.htm.
Using COSP requires only three simple steps:
1) Configure with COSP (configure -cosp ...).
2) Set cosp_amwg=.true. in the CAM namelist.
3) Run CAM for at least one year, then use the AMWG diagnostics package to look at the COSP outputs.
(Note: setting cosp_amwg=.true. approximately doubles the CAM run time; a baseline CAM5 run at 2-degree resolution takes ~4:40 on 2 nodes, per Andrew Gettelman.)

COSP in the AMWG Diagnostics Package (http://www.cgd.ucar…): the diagnostics set now includes COSP plots. Credit: Ben Hillman (UW)

Cloud Feedbacks Model Intercomparison Project: NCAR run progress (years complete / years planned)

Simulation           | CAM4    | CAM4(ext) | CAM5  | CAM5(ext)
AMIP                 | 30/30   | 4/4       | 20/30 | 0/4
AMIP(4XCO2)          | 21/30   |           |       |
AMIP(+4K)            | 18/30   |           |       |
AMIP(patt)           |         |           |       |
Control SST          | -       |           | 0/30  |
Control SST (4XCO2)  |         |           |       |
Control (coupled)    | 120/120 |           |       |
CO2ramp (coupled)    | 105/105 |           |       |

Courtesy: Ben Sanderson (NCAR)

Paper on COSP-enabled evaluation of CAM clouds for the CESM Special Issue

We analyze ten-year (2001-2010) 1-degree AMIP (prescribed SST/ice) CAM4 and CAM5 runs, using the 1-degree CAM4 and CAM5 release code (tag cam5_0_55).

Kay, J. E., Hillman, B., Klein, S., Zhang, Y., Medeiros, B., Gettelman, A., Pincus, R., Eaton, B., Boyle, J., Marchand, R., and T. Ackerman (2012): Exposing global cloud biases in the Community Atmosphere Model (CAM) using satellite observations and their corresponding instrument simulators, J. Climate, in press (available at: http://www.cgd.ucar.edu/staff/jenkay/)

COSP-enabled total cloud fraction comparisons

Figure 2. Global maps of observed and COSP-simulated annual mean total cloud fraction: a) CALIPSO GOCCP observations (SR > 5), b) ISCCP observations (τ > 0.3), c) MISR observations (τ > 0.3), d) CAM4 CALIPSO, e) CAM4 ISCCP, f) CAM4 MISR, g) CAM5 CALIPSO, h) CAM5 ISCCP, and i) CAM5 MISR. The value after each panel name reports the global annual mean, while the values in parentheses report the global annual mean model bias (CAM - observations) and model RMSE, respectively. Because they do not include thin or broken clouds, the global annual mean cloud fractions observed by MODIS (0.49) and CloudSat (0.50) are significantly smaller than those observed by CALIOP, MISR, or ISCCP, and are not included in this plot. The solid gray line in each panel shows the GCSS Pacific Cross Section.

Takeaway: observational uncertainty < model bias; CAM4 bias > CAM5 bias. Kay et al. (2012)
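The global mean bias and RMSE values quoted in the figure caption are area-weighted statistics. A minimal sketch of how such numbers can be computed (assuming regular latitude-longitude fields; this is illustrative, not the paper's actual analysis code):

```python
import numpy as np

def global_stats(model, obs, lat):
    """Area-weighted global-mean bias (model - obs) and RMSE on a
    regular latitude-longitude grid. `lat` holds cell-center latitudes
    in degrees; cos(lat) approximates the relative cell area."""
    w = np.cos(np.deg2rad(lat))[:, np.newaxis] * np.ones_like(model)
    diff = model - obs
    bias = np.average(diff, weights=w)
    rmse = np.sqrt(np.average(diff**2, weights=w))
    return bias, rmse

# Toy example: a model that overestimates cloud fraction by 0.05 everywhere.
lat = np.linspace(-89, 89, 90)
obs = np.full((90, 180), 0.60)
model = obs + 0.05
bias, rmse = global_stats(model, obs, lat)
print(bias, rmse)  # for a uniform offset, bias and RMSE both equal the offset
```

The cosine weighting matters: an unweighted mean would let the many small polar grid cells dominate exactly where (as the Arctic slides below note) model-observation differences are largest.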

COSP enables evaluation of cloud amount by height and optical depth. Credit: Swati Gehlot (DWD)

COSP-enabled comparisons robustly show that the CAM5 physics has reduced long-standing climate model cloud biases (too many optically thick clouds and too few clouds overall in CAM4 and many other models; see Zhang et al. 2005).

How do CAM4 and CAM5 produce similar annual mean cloud forcing biases (Figure 1b-f) with such large differences in their annual mean total cloud fractions (Figure 2d-i)? The answer lies in the relationship between cloud fraction and optical depth (τ). Figure 3a shows observed and simulator-diagnosed global annual mean τ distributions for MISR, MODIS, and ISCCP. This figure shows that CAM4 and CAM5 achieve a similar radiative balance because they have compensating differences in the amounts of optically thin and optically thick cloud. Consistent with earlier ISCCP-simulator-based evaluations, such as the 10-model intercomparison study by Z05, CAM4 has too much optically thick cloud and not enough optically thin cloud, resulting in a flatter τ distribution than is observed. In contrast, CAM5 has a steeper τ distribution that agrees quite well with observations. To the best of our knowledge, CAM5 is the first climate model to match the observed τ distribution with this level of fidelity, which is a significant achievement. The reproduction of observed cloud albedos, which are largely determined by τ, is an important metric of model performance.

Figure 3. Global column-integrated cloud optical depth (τ) distributions: a) MISR, MODIS, and ISCCP from satellite observations, CAM4, and CAM5; b) ISCCP from CAM4 and CAM5 at both 0.9x1.25 and 1.9x2.5 degree horizontal grid resolutions. Below τ=3.6, there are large inter-satellite differences in cloud fraction due to differences in the detection and treatment of cloud edges and sub-pixel clouds. Observational agreement below τ=1.3 is especially poor (see P11 Figure 6), so that range is not plotted in a). Kay et al. (2012)
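The ISCCP-style τ distributions in Figure 3 are built by binning each detectable cloudy column's optical depth into the standard ISCCP intervals; a hedged sketch (the bin edges are the standard ISCCP τ boundaries, which include the 0.3, 1.3, and 3.6 thresholds mentioned in the talk; the input data here are synthetic):

```python
import numpy as np

# Standard ISCCP optical-depth bin boundaries; the detection threshold
# (tau > 0.3) and the poorly-observed range (tau < 1.3) correspond to
# the first two edges.
TAU_EDGES = np.array([0.3, 1.3, 3.6, 9.4, 23.0, 60.0, 380.0])

def tau_distribution(tau):
    """Fraction of detectable cloudy columns in each ISCCP tau bin."""
    tau = np.asarray(tau)
    counts, _ = np.histogram(tau, bins=TAU_EDGES)
    return counts / max(counts.sum(), 1)

# Synthetic column optical depths: a lognormal gives the steep,
# thin-cloud-heavy shape described for CAM5 and the observations.
rng = np.random.default_rng(1)
frac = tau_distribution(rng.lognormal(mean=0.5, sigma=1.0, size=50_000))
print(frac)  # fractions over the six ISCCP bins
```

A "flatter" distribution in the CAM4 sense means relatively more mass in the rightmost (optically thick) bins and less in the leftmost ones than observed.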

Improved Arctic cloud seasonal cycle in CAM5 (despite known low aerosol issues…) Kay et al. (2012)

CAM5 has improved clouds and increased sensitivity to 2xCO2 forcing… both globally and in the Arctic Kay et al. (2012): The influence of local feedbacks and northward heat transport on the equilibrium Arctic climate response to increased greenhouse gas forcing in coupled climate models, J. Climate

Snow has a large impact on CAM5 COSP diagnostics

Important biases remain in both CAM versions (e.g., low cloud deficit in the transition from stratocumulus to deep convection).

Figure 5. Global maps of annual mean cloud fraction bias (CAM - CALIPSO GOCCP observations): a) CAM4 CALIPSO low cloud fraction bias, b) CAM5 CALIPSO low cloud fraction bias, c) CAM4 CALIPSO mid cloud fraction bias, d) CAM5 CALIPSO mid cloud fraction bias, e) CAM4 CALIPSO high cloud fraction bias, and f) CAM5 CALIPSO high cloud fraction bias. The value after each panel name reports the global annual mean bias and the value in parentheses reports the RMSE. The solid gray line in each panel shows the GCSS Pacific Cross Section. Figure 5, Kay et al. (2012)

Summary:
1) COSP and the CFMIP-requested diagnostics are validated and ready to use within CESM.
2) Analysis using COSP is beginning (and is documenting large improvements in clouds from CAM4 to CAM5).
COSP/CFMIP diagnostics help address key climate questions for the AMWG and the larger climate community, such as: How do we know if we have the clouds “right”?

EXTRA

How does COSP work? COSP contains satellite simulators for both passive (MODIS, MISR, ISCCP) and active (CloudSat radar and CALIPSO lidar) observations.
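Because the simulators operate on subgrid samples rather than gridbox means, COSP first splits each model column into subcolumns with binary cloud flags (its SCOPS component does this; the sketch below uses random overlap only, the simplest of the overlap assumptions, and all names here are illustrative):

```python
import numpy as np

def random_overlap_subcolumns(cloud_frac, n_sub=64, seed=0):
    """Toy subcolumn generator (random overlap only; COSP's subcolumn
    sampler also handles maximum-random overlap). Each of n_sub
    subcolumns gets a binary cloud flag per level, drawn so that the
    subcolumn ensemble reproduces the gridbox-mean cloud fraction
    profile."""
    rng = np.random.default_rng(seed)
    cloud_frac = np.asarray(cloud_frac)     # shape: (n_levels,)
    u = rng.random((n_sub, cloud_frac.size))
    return u < cloud_frac                   # (n_sub, n_levels) booleans

# Hypothetical cloud fraction profile by level (top to bottom).
profile = np.array([0.1, 0.4, 0.7])
sub = random_overlap_subcolumns(profile, n_sub=10_000)
print(sub.mean(axis=0))  # approximately recovers the input profile
```

Each simulator (ISCCP, MISR, CALIPSO, CloudSat, MODIS) then diagnoses what its instrument would retrieve from every subcolumn, and the statistics are aggregated back to the grid box for comparison with the satellite product.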

For COSP metrics, CAM5 > CAM4. Kay et al. (2012)