Evaluating RCM Experiments
PRECIS Workshop, Tanzania Meteorological Agency, 29th June – 3rd July 2015
Objectives of the session
1. Understand the context for RCM evaluation
2. Identify the main components of a model evaluation
3. Discuss different evaluation techniques and aspects to consider
4. Provide some examples
What should we expect from a model?
"All models are wrong, but some are useful" (George Box, 1987)
What should we expect from a model? Models are not "truth machines".
What should we expect from a model?
"For numerical weather prediction, for example, skill is relatively well defined because forecasts can be verified on a daily basis. For a climate model, it is more difficult to define a unique overall figure of merit, metric or skill score for long-term projections. Each model tends to simulate some aspects of the climate system well and others not so well, and each model has its own set of strengths and weaknesses. We do not need a perfect model, just one that serves the purpose. An aeroplane, for example, can be constructed with the help of numerical models that are not able to properly simulate turbulent flow." (R. Knutti, 2008, Should we believe model predictions of future climate change?)
What is a model evaluation and why is it important?
What:
- An assessment of how well the model is able to simulate the "present day", observed climate
- A model evaluation is not a model verification
Why:
- It enables you to gain familiarity with the model characteristics
- It indicates which aspects of the model simulation are most credible
- ...and therefore indicates how to make the best, most credible use of the data to answer relevant questions
Undertaking a model evaluation
The model evaluation process
1. Identify the target and purpose of the evaluation
2. Obtain multiple sources of observed data to evaluate model performance
3. Assess the errors and biases in the GCMs that provide the LBCs for the RCM
4. Evaluate the RCM, acknowledging the multiple sources of uncertainty
Identify the target and purpose of the evaluation
- What aspects of the climate system are of most interest?
- What climate processes are key to understanding climate variability/change in the focus region?
- What variables (e.g. temperature, precipitation, humidity) are of most interest?
- What time and space scales are of interest?
- Are you interested in extreme or rare events, or multi-year averages?
- Does the model need to provide accurate data at a specific spatial scale?
Choice of observed data
Use as many relevant observed datasets as possible.
- Gridded observed datasets, e.g. CRU (land surface), TRMM (satellite rainfall), GPCP (merged rain gauge and satellite rainfall)
- Reanalysis data, e.g. ERA-Interim (atmosphere)
- Station data: use with caution! It can be useful to compare directly to model output, but be aware of differences in spatial scales; ultimately one would not expect the data to match.
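As a minimal sketch of how two such gridded products might be compared over a region of interest (the file paths, variable names and region below are illustrative placeholders, not references to specific workshop files):

```python
# Minimal sketch (Python/xarray): comparing two gridded precipitation
# products over a region. File paths, variable names and the region are
# illustrative placeholders -- adapt them to your local copies of the data.
import xarray as xr

cru = xr.open_dataset("cru_pre.nc")["pre"]          # e.g. CRU monthly precipitation
gpcp = xr.open_dataset("gpcp_precip.nc")["precip"]  # e.g. GPCP monthly precipitation

# Subset both products to the same region and period (check units match too)
region = dict(lat=slice(-12, 0), lon=slice(29, 41), time=slice("1981", "2010"))
cru, gpcp = cru.sel(**region), gpcp.sel(**region)

# Interpolate the finer product onto the coarser grid before differencing
gpcp_on_cru = gpcp.interp(lat=cru.lat, lon=cru.lon)

# The climatological difference field is a first estimate of how much the
# "observed" answer depends on the chosen dataset
obs_spread = (gpcp_on_cru - cru).mean("time")
print(float(obs_spread.mean()))
```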
Choice of observed data
Different datasets provide different answers. There is no single truth but an envelope of possible or probable pasts.
(Hewitson et al (2013) Interrogating empirical-statistical downscaling. Climatic Change, 122(4), 539-554)
Choice of observed data
DJF (winter) mean, maximum and minimum temperatures at each grid cell over the period 1963 to 2010 for West Africa; data taken from the CRU TS3.22 and UDEL datasets (green lines show the semi-arid region).
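A sketch of how such a DJF climatology might be computed (file and variable names are illustrative assumptions, not the actual workshop files):

```python
# Sketch (Python/xarray): DJF-mean temperature at each grid cell,
# 1963-2010. File and variable names are illustrative placeholders.
import xarray as xr

tas = xr.open_dataset("cru_tmp.nc")["tmp"].sel(time=slice("1963", "2010"))

# Keep only December, January and February, then average over time to
# leave a 2-D (lat, lon) field of DJF-mean temperature
djf_mean = tas.where(tas["time.season"] == "DJF", drop=True).mean("time")

# Repeating this for a second product (e.g. UDEL) and differencing the two
# fields shows where the observational datasets disagree
```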
Assess GCM data providing LBCs
(Figure from Knutti et al, 2013, GRL)
Evaluating RCM Output
Evaluating how well the RCM represents the current climate
Model system = GCM + RCM
Q1. Are there discrepancies in the model output?
- Between parts of the RCM and GCM model output
- Between parts of the model output and "reality"
Q2. If so, why?
- Systematic model bias (error in the model's physical formulation)
- Spatial sampling issues (differences in resolution of model and observations)
- Observational error (gridding issues, instrument-dependent errors)
Evaluating how well the RCM represents the current climate
[Diagram: RCM and GCM linked by "consistency"; both linked to observations by "realism"]
RCM errors have three sources:
- Physical errors in the GCM affecting the LBCs
- Physical errors in the RCM
- RCM/GCM consistency errors
Evaluating how well the RCM represents the current climate
There is potential for four separate validations (sketched in code below):
1. GCM vs observations
2. RCM driven by GCM vs GCM
3. RCM driven by GCM vs observations
4. RCM driven by observations vs observations
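A minimal, self-contained sketch of these four comparisons, using synthetic climatology fields as stand-ins for real model and observed data:

```python
# Sketch: the four validations as area-mean biases between 2-D climatology
# fields on a common grid. All fields here are synthetic stand-ins.
import numpy as np

rng = np.random.default_rng(0)
obs = rng.normal(25.0, 2.0, size=(40, 40))             # "observed" climatology
gcm = obs + rng.normal(1.0, 0.5, size=obs.shape)       # GCM with a warm bias
rcm_gcm = gcm + rng.normal(-0.3, 0.5, size=obs.shape)  # RCM driven by the GCM
rcm_obs = obs + rng.normal(-0.3, 0.5, size=obs.shape)  # RCM driven by reanalysis

comparisons = {
    "1. GCM vs obs (GCM realism)":             (gcm, obs),
    "2. RCM(GCM) vs GCM (consistency)":        (rcm_gcm, gcm),
    "3. RCM(GCM) vs obs (overall realism)":    (rcm_gcm, obs),
    "4. RCM(reanalysis) vs obs (RCM realism)": (rcm_obs, obs),
}
for name, (a, b) in comparisons.items():
    print(f"{name}: mean bias {np.mean(a - b):+.2f}")
```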
Aspects to consider in evaluation
- Assess as many meteorological variables as possible; at least surface air temperature, precipitation and upper-air winds
- Examine the physical realism exhibited within the model (e.g. in cool and wet conditions we may expect high soil moisture: is this so?)
- Use both spatial and temporal information (see the area-average sketch below):
  - Spatial: full fields; smaller areas; vertical profiles; area averages
  - Temporal: time series; seasonal, annual and decadal means; higher-order statistics (variability, extremes); different seasons, different regimes
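One practical point when forming area averages: on a regular latitude-longitude grid, cells shrink towards the poles, so an area average should weight cells by cos(latitude). A small self-contained illustration with synthetic data:

```python
# Sketch: area averaging on a regular lat-lon grid needs cos(latitude)
# weights, since grid cells cover less area at higher latitudes.
import numpy as np

lats = np.linspace(-30.0, 30.0, 61)                      # cell-centre latitudes
field = np.random.default_rng(1).normal(size=(61, 120))  # synthetic (lat, lon) field

weights = np.cos(np.deg2rad(lats))                 # relative cell areas
area_mean = np.average(field.mean(axis=1), weights=weights)

print(f"weighted {area_mean:+.3f} vs unweighted {field.mean():+.3f}")
```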
Aspects to consider in evaluation: compare like with like
- Data only have skill at spatial scales resolved by their grids
- Aggregate or interpolate datasets to the coarsest grid before comparing data
- In general, Average(Index) ≠ Index(Average): when comparing datasets at different resolutions, be careful to compare like with like (a demonstration follows below)
(Chen (2008) On the Verification and Comparison of Extreme Rainfall Indices from Climate Models, J. Climate; the figure contrasts "average data first" with "index first")
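A minimal demonstration of why the order of operations matters, using a maximum 1-day precipitation index (Rx1day) on synthetic daily data. Averaging the data before computing the index smooths out local extremes, so the result is never larger:

```python
# Demonstration: Average(Index) != Index(Average) for an extremes index.
# Index used here: maximum 1-day precipitation (Rx1day). Synthetic data.
import numpy as np

rng = np.random.default_rng(2)
precip = rng.gamma(shape=0.4, scale=8.0, size=(4, 365))  # 4 cells x 365 days, mm/day

index_then_average = precip.max(axis=1).mean()  # Rx1day per cell, then spatial mean
average_then_index = precip.mean(axis=0).max()  # cell-average series, then Rx1day

# Averaging first smooths local extremes, so the second value is never larger
print(f"index first: {index_then_average:.1f}  average first: {average_then_index:.1f}")
```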
Aspects to consider in evaluation
Model forecasts (or hindcasts) are not constrained by the observations (i.e. the weather) that actually happened. They are, however, constrained by forcings (i.e. CO2, lateral boundary data, surface boundary data). Therefore, one cannot, in general, compare individual model years with their corresponding observed years. Rather, we are looking for agreement in the aggregated distribution of weather states (i.e. climate) over time. However, when models are run using observed boundary data from reanalyses, model-year to actual-year comparisons can be worthwhile, since reanalysis data are "quasi-observed" data.
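A small synthetic illustration of this point: for a free-running simulation, year-by-year correlation with observations is essentially meaningless, but the two multi-year distributions can still be compared directly, for example quantile by quantile:

```python
# Sketch: compare distributions of seasonal rainfall totals, not
# individual years. Both series are synthetic stand-ins.
import numpy as np

rng = np.random.default_rng(3)
obs = rng.gamma(shape=4.0, scale=50.0, size=30)    # 30 observed seasonal totals
model = rng.gamma(shape=4.0, scale=55.0, size=30)  # 30 free-running model years

# Year-by-year correlation is uninformative for a free-running model...
print(f"year-by-year correlation: {np.corrcoef(obs, model)[0, 1]:+.2f}")

# ...but the two 30-year distributions can be compared quantile by quantile
for q in (0.1, 0.5, 0.9):
    print(f"q={q}: obs {np.quantile(obs, q):.0f}  model {np.quantile(model, q):.0f}")
```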
Limits of evaluating models against observations
Evaluation of climate models based on past climate observations has some important limitations:
- We can only evaluate those variables and phenomena for which observations exist
- In some places there is a lack of, or insufficient quality of, long-term observations
- Long-term climate variability complicates the comparison
These limitations can be reduced, but not entirely eliminated, through the use of multiple independent observations of the same variable as well as the use of model ensembles.
The Regional Climate Model Evaluation System (RCMES) was designed to address the evaluation needs of programmes such as CORDEX and NARCCAP. It was developed by the NASA/Caltech Jet Propulsion Laboratory (JPL) and the University of California, Los Angeles (UCLA). RCMES is composed of two main components:
1. The Regional Climate Model Evaluation Database (RCMED)
2. The Regional Climate Model Evaluation Toolkit (RCMET)
Details of RCMES are presented at https://rcmes.jpl.nasa.gov/
Examples
Example 1: CORDEX RCMs, Africa
Biases in the simulated annual-mean precipitation (mm/day) against the CRU data. From Kim et al (2013) Evaluation of the CORDEX-Africa multi-RCM hindcast: systematic model errors.
Example 2a: Seasonal mean precipitation, from PRECIS
Example 2b: Frequency of wet days, from PRECIS
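Wet-day frequency is straightforward to compute from daily output. The sketch below uses the commonly used threshold of at least 1 mm/day (an assumption; the PRECIS analysis may use a different threshold) on synthetic data:

```python
# Sketch: wet-day frequency per grid cell from daily precipitation.
# A wet day is taken here as >= 1 mm/day; synthetic data for illustration.
import numpy as np

rng = np.random.default_rng(4)
daily = rng.gamma(shape=0.3, scale=10.0, size=(90, 50, 50))  # (day, lat, lon)

wet_day_freq = (daily >= 1.0).mean(axis=0)  # fraction of wet days per cell
print(f"domain-mean wet-day frequency: {wet_day_freq.mean():.2f}")
```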
Example 3: Extreme rainfall event in a river catchment (using PRECIS)
Area-average precipitation in the Jhelum river basin (Pakistan) for September 1992, showing RCM simulations at 50 km and 25 km alongside observations.
Example 4: Individual station vs. area averages
26 stations in a 25 km × 25 km area (black bars) and their area average (red bars). The area average (cf. model grid-box output) is considerably and inconsistently different from most individual stations.
What now? Use of the RCM output beyond the evaluation
- Having evaluated the RCM output, is it appropriate to use the simulated future climate output directly?
- For what scales, variables and types of questions is the model output able to provide "useful" information?
To summarise
- There are many uncertainties which need to be taken into account when assessing climate change (and its impact) over a region
- Some account may currently be taken of most (but not all) of these uncertainties
- Even those uncertainties that can be accounted for are currently not well described
- There is a lot more work for us all to do!
Summary
Model evaluation is ESSENTIAL:
- It enables familiarisation with the model and its projected output
- A simulation may be over an area where the model performance is untested
- An evaluation provides a baseline for assessing the credibility of future projections from RCMs, which has implications for how the output can and should be used
Thanks for listening. Questions?
Assess GCM data providing LBCs
The ability of RCMs to simulate the regional climate depends strongly on the realism of the large-scale circulation provided by the LBCs from the GCMs.
(IPCC WG1, Chapter 9, Fig. 9.4)
RCMES components and workflow
- Raw data: various sources, formats, resolutions and coverage (TRMM, MODIS, AIRS, CERES, soil moisture, etc.), plus other data centres (ESG, DAAC, ExArch Network)
- RCMED (Regional Climate Model Evaluation Database): a large, scalable database storing data from a variety of sources in a common format on the native grid, with an efficient architecture; accessed by URL, with extractors for the various data formats
- RCMET (Regional Climate Model Evaluation Toolkit): a library of codes for extracting data from RCMED and from model output (binary or netCDF), and for calculating evaluation metrics
- Workflow: extract the observed and model data, regrid both onto the same time/space grid, calculate the evaluation metrics, and visualise the results; the regridded data can also be used for the user's own analyses and visualisation
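The workflow above can be summarised as a pipeline. The sketch below mirrors its stages with placeholder functions and synthetic data; these names are illustrative, not the actual RCMET API:

```python
# Hypothetical outline of the RCMES pipeline: extract -> regrid -> metrics
# -> visualise. All function names are placeholders, NOT the real RCMET
# API; each stage is stubbed with synthetic data so the sketch runs.
import numpy as np

def extract_obs(variable):    # stands in for an RCMED query
    return np.random.default_rng(5).normal(25, 2, size=(120, 40, 40))

def extract_model(variable):  # stands in for the model-data extractor
    return np.random.default_rng(6).normal(26, 2, size=(120, 40, 40))

def regrid(obs, model):       # here both are already on a common grid
    return obs, model

def metrics(obs, model):      # e.g. mean bias and RMSE
    diff = model - obs
    return {"bias": float(diff.mean()), "rmse": float(np.sqrt((diff ** 2).mean()))}

obs, model = regrid(extract_obs("tas"), extract_model("tas"))
print(metrics(obs, model))    # a visualiser would plot these instead
```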
Example 4: South Asian monsoon break-active phase precipitation
[Figure panels: break and active phases, GCM vs RCM]
Example 4: Observed precipitation over the Alps
Average rainfall for the period 1971-1990 from the (left) CRU 3.0 dataset (resolution 0.5° × 0.5°) and the (right) Frei and Schär Alpine analysis (resolution 0.3° × 0.22°).