Parameterization Cloud Scheme Validation


1 Parameterization Cloud Scheme Validation
ECMWF Training Course, May 2001. Lecturer: Adrian (ICTP). Clouds 3

2 Cloud Validation: The issues
AIM: to simulate one aspect of nature, clouds, as faithfully as possible. APPROACH: validate the model-generated clouds against observations, and use the information about the apparent errors to improve the model physics and, in turn, the cloud simulation. The loop: cloud observations -> error -> parameterisation improvements -> cloud simulation. Sounds easy?

3 Cloud Validation: The problems
How much of the 'error' derives from the observations? The cloud observations carry an error e1 and the cloud simulation an error e2, so the apparent error that drives parameterisation improvements mixes the two.

4 Cloud Validation: The problems
Which physics is responsible for the error? The cloud simulation is shaped by radiation, convection, cloud physics, dynamics and turbulence, so an apparent error cannot automatically be blamed on the cloud scheme.

5 The path to improved cloud parameterisation…
The route from cloud validation to parameterisation improvement runs through several approaches: case studies, composite studies, NWP validation, and comparison to satellite products.

6 Model climate - Broadband radiative fluxes
Top-of-atmosphere (TOA) radiative fluxes can be compared with satellite observations. Example: TOA shortwave radiation (TSR), JJA 1987, CY18R6 minus ERBE. The stratocumulus regions are poor, and so is North Africa (an old model cycle!).

7 Model climate - Cloud radiative “forcing”
Problem: can we associate these 'errors' with clouds? Another approach is to examine the 'cloud radiative forcing' (CRF). Example: JJA 1987 shortwave CRF, CY18R6 minus ERBE. Cloud problems: stratocumulus yes, North Africa no! Note that CRF is sometimes defined as Fclr - F rather than F - Fclr, and model calculations of it also differ.
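The forcing itself is just a difference of TOA fluxes. A minimal sketch, using hypothetical flux arrays and the F - Fclr convention noted on the slide:

```python
import numpy as np

# Shortwave cloud radiative forcing, CRF = F - Fclr (all-sky minus
# clear-sky net TOA flux). The slide warns that the opposite sign
# convention (Fclr - F) is also in use. All values here are illustrative.
def sw_cloud_forcing(net_sw_allsky, net_sw_clearsky):
    """Return shortwave CRF on the same grid as the inputs (W m-2)."""
    return np.asarray(net_sw_allsky) - np.asarray(net_sw_clearsky)

# Toy example: clouds reflect sunlight, so the all-sky net SW at TOA is
# smaller than clear-sky, giving negative SW CRF at cloudy points.
allsky = np.array([[250.0, 180.0], [300.0, 220.0]])
clearsky = np.array([[310.0, 305.0], [315.0, 300.0]])
crf = sw_cloud_forcing(allsky, clearsky)
```

With the opposite convention the same field simply changes sign, which is why the slide's caveat matters when comparing model CRF to ERBE-derived values.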

8 Model climate - “Cloud fraction” or “Total cloud cover”
Other variables can also be compared to derived products. Example: total cloud cover (TCC), JJA 1987, CY18R6 minus ISCCP. References: ISCCP: Rossow and Schiffer (1991), Bull. Am. Met. Soc.; ERBE: Ramanathan et al. (1989), Science.

9 Climatology: The Problems
If more complicated cloud parameters are desired (e.g. vertical structure), the retrieval can be ambiguous: many different vertical arrangements of liquid cloud, ice cloud and vapour are consistent with the same set of satellite channels. Another approach is to work the other way around and simulate the irradiances from the model fields with a radiative transfer model, which requires assumptions about the sub-grid vertical structure.

10 Simulating Satellite Channels
Examples: Morcrette (1991), MWR; Chevallier et al. (2001), J. Climate. This gives more certainty in the diagnosis of the existence of a problem, but does not necessarily help identify its origin.

11 A more complicated analysis is possible:
Example: the diurnal cycle of convective variability over tropical land. Observations show a late-afternoon peak in convection; the model peaks in the morning (a common problem).

12 Has this Improved? 29r1, 06 UTC

13 29r1, 12 UTC

14 29r1, 18 UTC

15 NWP forecast evaluation
Differences in longer simulations may not be the direct result of the cloud scheme, because of its interaction with radiation, dynamics, etc. (e.g. the poor stratocumulus regions). Using short-term NWP forecasts or analyses restricts these feedbacks and allows one to concentrate on the cloud scheme itself. (Figure: cloud-cover bias against time, marking the introduction of the Tiedtke scheme.)

16 Example over Europe. (Colour classes: -8:-3, -3:-1, -1:1, 1:3, 3:8.) What are your conclusions concerning the cloud scheme?

17 Example over Europe: Who wants to be a Meteorologist?
Which of the following is a drawback of SYNOP observations? (a) They are only available over land. (b) They are only available during daytime. (c) They do not provide information concerning vertical structure. (d) They are made by people.

18 Case Studies
The previous examples were general statistics, computed globally or for specific regions. One can instead concentrate on a particular location in more detail, for which more data are collected: a CASE STUDY. Examples: GATE, CEPEX, TOGA-COARE, ARM...

19 Evaluation of vertical cloud structure
Mace et al. (1998), GRL, examined the frequency of occurrence of ice cloud at the ARM site on the US Great Plains; a reasonable match to the data was found.

20 Evaluation of vertical cloud structure
Hogan et al. (2000), JAM: an analysis using the Chilbolton radar and lidar; again a reasonable match.

21 Hogan et al. More details possible
Hogan et al.: more detail is possible.

22 Found that comparison improved when snow was taken into account
Hogan et al. found that the comparison improved when snow was taken into account.

23 Issues Raised: WHAT ARE WE COMPARING? HOW STRINGENT IS OUR TEST?
WHAT ARE WE COMPARING? Is the model statistic really equivalent to what the instrument measures? E.g. radar sees snow, but the model may not include this in the definition of cloud fraction; small ice amounts may be invisible to the instrument but included in the model statistic. HOW STRINGENT IS OUR TEST? Perhaps the variable is easy to reproduce: e.g. mid-latitude frontal clouds are strongly dynamically forced and cloud fraction is often zero or one, so cloud-fraction statistics may be easy to match in short-term forecasts.
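The "what are we comparing?" point can be made concrete with a toy occurrence statistic. The detection threshold and the mixing-ratio values below are purely illustrative, not real instrument characteristics:

```python
import numpy as np

# Toy illustration of the comparison pitfall: a radar only detects
# condensate above some threshold, and it sees falling snow, whereas the
# model's cloud definition may exclude snow. Threshold is hypothetical.
DETECTION_THRESHOLD = 1e-6   # kg/kg of condensate, illustrative only

def model_cloud_occurrence(cloud_ice, snow=None, include_snow=False):
    """Fraction of samples flagged cloudy by the model's own definition."""
    condensate = np.asarray(cloud_ice, dtype=float)
    if include_snow and snow is not None:
        condensate = condensate + np.asarray(snow, dtype=float)
    return np.mean(condensate > 0.0)

def instrument_cloud_occurrence(cloud_ice, snow):
    """Fraction of samples a threshold-limited instrument would flag:
    it sees snow, but misses condensate below its detection limit."""
    seen = np.asarray(cloud_ice) + np.asarray(snow)
    return np.mean(seen > DETECTION_THRESHOLD)

ice  = np.array([5e-5, 1e-8, 0.0, 2e-5])  # tiny 2nd value: invisible to radar
snow = np.array([0.0,  0.0, 3e-5, 0.0])   # snow only: seen by radar, not model
```

Running the three statistics on the same columns gives three different "cloud occurrences", which is exactly why Hogan et al. found agreement improved once snow was folded into the model statistic.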

24 Can also use to validate “components” of cloud scheme
EXAMPLE: cloud overlap assumptions (Hogan and Illingworth, 2000, QJRMS). Issue raised: HOW REPRESENTATIVE IS OUR CASE-STUDY LOCATION? E.g. wind shear and dynamics are very different between southern England and the tropics!
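For reference, the overlap idea being tested can be sketched with the widely used maximum-random assumption (vertically contiguous cloudy layers overlap maximally, layers separated by clear air overlap randomly). This is a generic sketch of that formula, not the Hogan and Illingworth analysis itself:

```python
# Maximum-random overlap: total cloud cover from a top-down profile of
# layer cloud fractions. Contiguous cloudy layers overlap maximally;
# layers separated by a clear layer combine randomly. A sketch only;
# operational schemes handle edge cases in their own ways.
def total_cloud_cover_maxran(cf):
    """Total cloud cover from layer fractions cf (ordered top-down)."""
    clear = 1.0      # clear-sky probability accumulated down the column
    c_above = 0.0    # cloud fraction of the layer immediately above
    for c in cf:
        if c_above >= 1.0:
            return 1.0  # an overcast layer above makes the column overcast
        clear *= (1.0 - max(c, c_above)) / (1.0 - c_above)
        c_above = c
    return 1.0 - clear

# Two contiguous 0.5 layers overlap maximally -> total cover 0.5,
# whereas the same layers separated by clear air combine randomly
# -> 1 - 0.5 * 0.5 = 0.75.
contiguous = [0.5, 0.5]
separated  = [0.5, 0.0, 0.5]
```

Radar case studies such as Hogan and Illingworth's test precisely how quickly real clouds decorrelate from this maximum limit with vertical separation.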

25 Composites We want to look at a certain kind of model system:
We want to look at a certain kind of model system, e.g. stratocumulus regions or extra-tropical cyclones. An individual case may not be conclusive: is it typical? On the other hand, general statistics may swamp this kind of system. A compositing technique can be used instead.

26 Composites - a cloud survey
From satellite data, cloud-top pressure and cloud optical thickness are derived for each pixel; the data are then divided into regimes according to the sea-level-pressure anomaly (negative and positive SLP), and the ISCCP simulator is applied to the model for a like-with-like comparison. (Figure: histograms of cloud-top pressure, 120-900 hPa, against optical depth for the data, the model, and model minus data.) Tselioudis et al. (2000), J. Climate. Conclusions: high clouds are too thin and low clouds too thick.

27 Composites – Extra-tropical cyclones
About 1000 cyclones are overlaid, each defined about a location of maximum optical thickness, and the predominant cloud types are plotted as anomalies from a 5-day average (high tops = red, mid tops = yellow, low tops = blue). Klein and Jakob (1999), MWR. Again: high clouds too thin, low clouds too thick.
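The compositing technique amounts to averaging many feature-centred anomaly windows, so the systematic structure survives while unrelated variability cancels. A minimal sketch with synthetic fields; the window size, centre list and the crude anomaly definition are all hypothetical stand-ins for the real procedure:

```python
import numpy as np

# Sketch of the compositing idea: cut a window centred on each identified
# feature (here, a fake "cyclone centre") out of an anomaly field, then
# average the windows. The anomaly here is simply field minus its own mean,
# a crude stand-in for the 5-day average used in the real study.
def composite(fields, centres, half_width):
    """Average (2h+1)x(2h+1) windows of each 2-D field about its centre."""
    h = half_width
    windows = []
    for field, (i, j) in zip(fields, centres):
        anomaly = field - field.mean()
        windows.append(anomaly[i - h:i + h + 1, j - h:j + h + 1])
    return np.mean(windows, axis=0)

rng = np.random.default_rng(0)
fields, centres = [], []
for _ in range(3):
    f = rng.normal(size=(40, 40))            # background "weather noise"
    ci, cj = rng.integers(10, 30, size=2)    # random feature location
    f[ci - 2:ci + 3, cj - 2:cj + 3] += 5.0   # common signal at the centre
    fields.append(f)
    centres.append((ci, cj))

comp = composite(fields, centres, half_width=5)
```

The composite peaks at its centre, where the common signal aligns across cases; with ~1000 cyclones instead of 3 the noise suppression is far stronger.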

28 A strategy for cloud parametrization evaluation
Jakob (PhD thesis). Where are the difficulties?

29 Recap: The problems All Observations Long term climatologies:
All observations: are we comparing 'like with like'? What assumptions are contained in the retrievals and variational approaches?
Long-term climatologies: which physics is responsible for the errors? Dynamical regimes can diverge.
NWP, reanalysis, column models: the full interaction between the physics and the dynamics is not allowed to develop.
Case studies: are they representative? Do changes translate into global skill?
Composites: as for case studies.
And one more problem specific to NWP...

30 NWP cloud scheme development
Timescale of the validation exercise: many of the above validation exercises are complex and involved, and the results often become available O(years) after the project starts, for a single version of the model. NWP operational models are updated roughly 2 to 4 times a year, so the validation results are often no longer relevant by the time they arrive. Requirement: a quick and easy test bench.

31 Example: LWP ERA-40 and recent cycles
Panels: model 23r4 (June 2001), model 26r1 (April 2003), SSM/I observations, and the difference.

32 Example: LWP ERA-40 and recent cycles
Panels: model 23r4 (June 2001), model 26r3 (October 2003), SSM/I observations, and the difference.

33 Example: LWP ERA-40 and recent cycles
Panels: model 23r4 (June 2001), model 28r1 (March 2004), SSM/I observations, and the difference.

34 Example: LWP ERA-40 and recent cycles
Panels: model 23r4 (June 2001), model 28r3 (September 2004), SSM/I observations, and the difference.

35 Example: LWP ERA-40 and recent cycles
Panels: model 23r4 (June 2001), model 29r1 (April 2005), SSM/I observations, and the difference. Do ERA-40 cloud studies still have relevance for the operational model?

36 So what is used at ECMWF?
T799 L91:
Standard 'scores' (RMS error and anomaly correlation of U, T, Z)
'Operational' validation of clouds against SYNOP observations
Simulated radiances against Meteosat-7
T159 L91 'climate' runs (3 ensemble members of 13 months) automatically produce comparisons to:
ERBE, NOAA-x and CERES TOA fluxes
QuikSCAT and SSM/I 10 m winds
ISCCP and MODIS cloud cover
SSM/I and TRMM liquid water path (soon MLS ice water content)
GPCP, TRMM, SSM/I and Xie-Arkin precipitation
The da Silva climatology of surface fluxes
ERA-40 analysis winds
All datasets are treated as 'truth'.
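The standard 'scores' mentioned above (RMS error and anomaly correlation) reduce to two short formulas; a minimal sketch with hypothetical 1-D fields, anomalies taken about a climatology:

```python
import numpy as np

# Minimal forms of two standard NWP verification scores: root-mean-square
# error, and the anomaly correlation coefficient (ACC) of forecast and
# verifying analysis anomalies about a climatology. Fields are hypothetical.
def rms_error(forecast, analysis):
    return float(np.sqrt(np.mean((forecast - analysis) ** 2)))

def anomaly_correlation(forecast, analysis, climatology):
    fa = forecast - climatology   # forecast anomaly
    aa = analysis - climatology   # analysis anomaly
    return float(np.sum(fa * aa) / np.sqrt(np.sum(fa**2) * np.sum(aa**2)))

clim = np.zeros(4)
analysis = np.array([1.0, -1.0, 2.0, -2.0])
perfect  = analysis.copy()        # perfect forecast: RMSE 0, ACC 1
biased   = analysis + 0.5         # constant bias: nonzero RMSE, ACC < 1
```

Note that such scores say almost nothing about clouds directly, which is why the dedicated cloud comparisons in the list above are needed alongside them.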

37 OLR
Model (T95 L91) versus CERES, with the difference. (Figure: OLR maps, colour scale -300 to -150; the difference panel marks where the model OLR is too high or too low.) Conclusions?

38 SW
Model (T95 L91) versus CERES, with the difference. (Figure: SW flux maps, colour scale 100 to 350; the difference panel marks where the model albedo is too high or too low.) Conclusions?

39 TCC
Model (T95 L91) versus ISCCP, with the difference. (Figure: total cloud cover maps, colour scale 5 to 80; the difference panel marks where the model TCC is too high or too low.) Conclusions?

40 First ice validation: microwave limb sounders
Panels at 316 hPa and 215 hPa. Colour: MLS; white bars: ECMWF ice water content sampled along the MLS tracks.

41 LWP
Model (T95 L91) total column liquid water (TCLW/LWP) versus SSM/I, with the difference. (Figure: colour scale 5 to 80; the difference panel marks where the model LWP is too high or too low.) Conclusions?

42 Map of MLS ice error

43 Daily Report, 11th April 2005. Let's see what you think:
"Going more into the details of the cyclone, it can be seen that the model was able to reproduce the very peculiar spiral structure in the cloud bands. However, large differences can be noticed further east, in the warm sector of the frontal system attached to the cyclone, where the model largely underpredicts the typical high-cloud shield. Look, for example, at the two maps below, where a clear deficiency of cloud cover is evident in the model-generated satellite images north of the Black Sea. In this case this was systematic over different forecasts." - Quote from the ECMWF daily report, 11th April 2005

44 Same Case, water vapour channels
Blue: moist; red: dry. The 30-hour forecast is too dry in the frontal region. This is not a forecast drift; does it mean the cloud scheme is at fault?

45 Future: Long-term ground-based validation, CLOUDNET
A network of stations is processed for a multi-year period using identical algorithms: first in Europe, now also the ARM sites. Some European centres provide operational forecasts so that direct comparisons can be made in quasi-realtime, and the direct involvement of the met services provides up-to-date information on model cycles.

46 Cloudnet Example
In addition to the standard quicklooks, longer-term statistics are available; this example shows ECMWF cloud cover during June 2005. The processing includes corrections for radar attenuation and snow. See the Cloudnet website for more details and examples!

47 Future: Improved remote sensing capabilities, CLOUDSAT
CloudSat is an experimental satellite that uses radar to study clouds and precipitation from space. It flies in orbital formation as part of the A-Train constellation of satellites (Aqua, CloudSat, CALIPSO, PARASOL and Aura). Launched 28th April 2006.

48 CloudSat transect and zonal-mean cloud cover

49 In summary: Many methods for examining clouds
Many methods exist for examining clouds, but all too often a chasm still remains between cloud validation and parameterisation improvement!

