© Crown copyright Met Office
Using a perturbed physics ensemble to make probabilistic climate projections for the UK
Isaac Newton workshop, Exeter
David Sexton, Met Office Hadley Centre, September 20th 2010

UKCP09: a 5-dimensional problem
- Three different emission scenarios
- Seven different timeframes
- 25km grid, 16 admin regions, 23 river basins and 9 marine regions
- Uncertainty, including information from models other than HadCM3
- Variables and months
This is what users requested.

Why we cannot be certain…
- Internal variability (initial condition uncertainty)
- Modelling uncertainty
  - Parametric uncertainty (land/atmosphere and ocean perturbed physics ensembles)
  - Structural uncertainty (multi-model ensembles)
  - Systematic errors common to all current climate models
- Forcing uncertainty
  - Different emission scenarios
  - Carbon cycle (perturbed physics ensembles)
  - Aerosol forcing (perturbed physics ensembles)

Production of UKCP09 predictions
[Flow diagram: equilibrium PPE, observations and other models → equilibrium PDF; time-scaled with the Simple Climate Model, tuned against 4 time-dependent Earth System PPEs (atmos, ocean, carbon, aerosol) → time-dependent PDF; downscaled with the 25km regional climate model → 25km PDF → UKCP09]

Stage 1: Uncertainty in equilibrium response
[Production flow diagram, with the equilibrium PPE → equilibrium PDF step highlighted]

Perturbed physics ensemble
There are plenty of variants of the climate model (i.e. different values for model input parameters) that are as good as, if not better than, the standard tuned version
But their responses can differ from the standard version's
Cast the net wide: explore parameter space with a view to finding pockets of good quality, and see what that implies for uncertainty

Perturbed physics ensemble
280 equilibrium runs, 31 parameters
Parameters varied within ranges elicited from experts
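As an illustration of how a design like this might be drawn, here is a minimal Latin hypercube sketch in Python. The actual UKCP09 experimental design and the expert-elicited ranges are not reproduced here; the ranges below are placeholders.

```python
import numpy as np

def latin_hypercube(n_runs, ranges, rng):
    """Sample n_runs points from a Latin hypercube over the given
    (low, high) parameter ranges: one stratum per run per parameter."""
    d = len(ranges)
    # For each parameter, a random permutation of strata plus a jitter in [0, 1)
    strata = rng.permuted(np.tile(np.arange(n_runs), (d, 1)), axis=1).T
    u = (strata + rng.random((n_runs, d))) / n_runs
    lows = np.array([lo for lo, hi in ranges])
    highs = np.array([hi for lo, hi in ranges])
    return lows + u * (highs - lows)

# Illustrative: 280 runs over 31 parameters (placeholder ranges, not the elicited ones)
rng = np.random.default_rng(0)
ranges = [(0.0, 1.0)] * 31
design = latin_hypercube(280, ranges, rng)
```

Each of the 31 columns then covers its elicited range evenly, while the 280 rows jointly explore the full parameter space.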

Probabilistic prediction of the equilibrium response to doubled CO2

Stage 2: Time scaling (Glen Harris and Penny Boorman)
[Production flow diagram, with the Simple Climate Model time-scaling step highlighted]

Ensembles for other Earth System components
Use ocean, sulphur cycle and carbon cycle PPEs and multimodel ensembles to tune different configurations of the Simple Climate Model
17 members of the atmosphere perturbed physics ensemble repeated with full coupling between the atmosphere and a dynamic ocean

Making time-dependent PDFs
1. Sample a point in atmosphere parameter space
2. Emulate the equilibrium response in climate sensitivity and the prediction variables, and calculate weights
3. Sample ocean, aerosol and carbon cycle configurations of the Simple Climate Model
4. Time-scale the prediction variables
5. Adjust the weight according to how well the model variant reproduces large-scale temperature trends over the 20th century
6. And repeat the sampling…
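The six steps above can be sketched as a Monte Carlo loop. Every function below is a hypothetical stand-in: the real emulator, Simple Climate Model configurations, time-scaling and trend weighting are far more elaborate than these toys.

```python
import numpy as np

rng = np.random.default_rng(1)

def emulate_equilibrium(x):
    # Stand-in emulator: returns an emulated response and an initial weight
    return {"climate_sensitivity": 2.0 + 2.0 * x.mean()}, np.exp(-np.sum((x - 0.5) ** 2))

def sample_scm_configuration(rng):
    # Stand-in for sampling an ocean/aerosol/carbon-cycle SCM variant
    return {"ocean_heat_uptake": rng.uniform(0.5, 2.0)}

def time_scale(response, scm, year):
    # Crude stand-in: equilibrium response approached on a fixed timescale
    return response["climate_sensitivity"] * (1.0 - np.exp(-(year - 2000) / 50.0))

def trend_weight(response, scm):
    # Stand-in for the fit to large-scale 20th-century temperature trends
    return np.exp(-abs(response["climate_sensitivity"] - 3.0))

samples, weights = [], []
for _ in range(10000):
    x = rng.random(31)                           # 1. sample atmosphere parameters
    response, w = emulate_equilibrium(x)         # 2. emulate response, initial weight
    scm = sample_scm_configuration(rng)          # 3. sample SCM configuration
    dT_2080 = time_scale(response, scm, 2080)    # 4. time-scale the prediction variable
    w *= trend_weight(response, scm)             # 5. adjust weight by historical trends
    samples.append(dT_2080)                      # 6. and repeat the sampling…
    weights.append(w)

weights = np.array(weights) / np.sum(weights)    # weighted sample defines the PDF
```

The weighted samples are then summarised as a time-dependent PDF for each prediction variable.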

Plume for GCM grid box over Wales

Stage 3: Downscaling (Kate Brown)
[Production flow diagram, with the 25km regional climate model step highlighted]

Dynamical downscaling
For 11 of the 17 fully coupled ocean-atmosphere runs, use 6-hourly boundary conditions to drive the 25km regional climate model

Adding information at the 25km scale
High-resolution regional climate model projections are used to account for the local effects of coastlines, mountains and other regional influences. They add skilful detail to large-scale projections from global climate models, but also inherit errors from them.

Stage 1: Uncertainty in equilibrium response
[Production flow diagram, with the equilibrium PPE → equilibrium PDF step highlighted]

Bayesian prediction – Goldstein and Rougier 2004
Aim is to construct the joint probability distribution p(x, m_h, m_f, y_h, y_f, o, d) of all uncertain objects in the problem:
- Input parameters (x)
- Historical and future model output (m_h, m_f)
- True climate (y_h, y_f)
- Observations (o)
- Model imperfections (d)
Bayes linear assumption, so all objects are represented in terms of means and covariances

Best-input assumption (Goldstein and Rougier 2004)
The model is not perfect, so there are processes in the real system but not in our model that could alter the model response by an uncertain amount.
We assume that one choice of values for our model's parameters, x*, is better than all others:
true climate = model output at the best choice of parameter values x* + discrepancy d
(d = 0 for a perfect model)

Weighting different model variants
[Figure: emulated distributions for 10 different samples of combinations of parameter values]
Each combination has a prior probability of being x*, so build an emulator
Use observations to weight towards higher-quality parts of parameter space
No verification or hindcasting is possible, so we are limited to this use of the observations

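The "build an emulator, then weight by observations" idea can be sketched as follows. The ensemble, the linear emulator and the single observation below are all toy stand-ins; UKCP09 used far more sophisticated emulators and six multivariate metrics.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy stand-in for the ensemble: 280 runs, 31 parameters, one historical output
X = rng.random((280, 31))
y_model = 2.0 * X[:, 0] - 1.0 * X[:, 1] + 0.1 * rng.standard_normal(280)

# Cheap linear emulator fitted to the ensemble output
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, y_model, rcond=None)

def emulate(x):
    # Predicted historical output at any point in parameter space
    return coef[0] + x @ coef[1:]

# Weight a large parameter sample by a Gaussian likelihood against one observation
obs, obs_sd = 0.8, 0.3                      # hypothetical observation and uncertainty
X_big = rng.random((100000, 31))
w = np.exp(-0.5 * ((emulate(X_big) - obs) / obs_sd) ** 2)
w /= w.sum()                                # normalised weights over parameter space
```

High-weight points mark the higher-quality parts of parameter space; predictions are then built from this weighted sample.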

Large-scale patterns of climate variations
The first of six eigenvectors of observed climate used in weighting. A small subset of climate variables is shown.

Constraining parameters

Second way to constrain predictions with observations
Some uncertainty about the future is related to uncertainty about the past, and is removed when values for the real world are specified

Specifying the discrepancy
The method does not capture systematic errors that are common to all state-of-the-art climate models

The largest discrepancy impact for UK temperature changes: Scotland in March
Example of a large shift in the PDF due to the mean discrepancy, indicating a bias in HadCM3 relative to other models

Discrepancy Term: Snow Albedo Feedback in Scotland in March
- Black crosses: perturbed physics ensembles, slab models
- Red asterisks: multi-model slab ensemble
- Black vertical lines: observations (different data sets)

But what if…
- Black crosses: perturbed physics ensembles, slab models
- Red asterisks: multi-model slab ensemble
- Black vertical lines: observations (different data sets)
Should this be captured by the adjustment term, or should the discrepancy be a function of x?

Making PDFs for the real world
- Start with a prior, which comes from model output
- Weight by large-scale metrics, plus an adjustment
  - But what about the local scale, like control March Scotland temperature? ENSEMBLES shows similar behaviour for May Sweden temperature, so maybe we need a new metric to capture this behaviour
- Discrepancy – a direct link between model and real world
- Downscaling – statistical or dynamical
[Schematic: prior × emulated likelihood, with discrepancy adjustment → real-world PDF]

PDFs for the real world (ii)
Quantile matching
- Piani et al. (WATCH project, submitted): a transfer function is applied to model data in the baseline period so that the CDF of transfer(model data) = observed CDF
- Li et al. (2010): correct each future percentile by removing the model bias at that percentile in the baseline period. But what if you cross a threshold?
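The Li et al. (2010)-style percentile correction can be sketched as below, on synthetic data; the real method is applied to observed and modelled climate records, and this is only an illustration of the idea.

```python
import numpy as np

def quantile_bias_correct(model_future, model_baseline, obs_baseline):
    """For each future value, find its percentile in the future model CDF and
    subtract the model-minus-observed bias at that percentile in the baseline."""
    q = np.linspace(0.0, 1.0, 101)
    bias = np.quantile(model_baseline, q) - np.quantile(obs_baseline, q)
    # Empirical percentile of each future value within the future distribution
    ranks = np.searchsorted(np.sort(model_future), model_future,
                            side="right") / len(model_future)
    return model_future - np.interp(ranks, q, bias)

# Synthetic example: model is ~2 degrees too warm in the baseline
rng = np.random.default_rng(3)
obs = rng.normal(10.0, 2.0, 1000)           # observed baseline temperatures
mod_base = rng.normal(12.0, 2.5, 1000)      # biased model baseline
mod_fut = rng.normal(14.0, 2.5, 1000)       # model future
corrected = quantile_bias_correct(mod_fut, mod_base, obs)
```

Because the bias is removed percentile by percentile, the correction can differ between the middle and the tails of the distribution, which is exactly where the threshold problem on the next slides arises.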

Crossing a threshold (Clark et al GRL 2010)

PDFs for the real world (iii)
- Model soil is too dry, so there is a longer tail in daily temperatures than observed. This tail does not change much under climate change because the soil is still dry
- Observed soil is not too dry, but if climate change causes the soil to dry below a threshold, there will be a big increase in the upper tail of daily temperatures
- Li et al. would remove a large baseline bias in the upper tail from the future model CDF (whose tail is similar to the baseline model CDF)
- Another perspective: under climate change, model and real world will both be dry, as they are in the same "soil regime", and future bias < baseline bias
- The same applies to March Scotland temperature, though there it is the model that crosses the threshold

PDFs for the real world (iv)
Build physics into the bias correction
- Buser et al. (2009) use interannual variability
- Seasonal forecasting
  - Clark and Deque – use analogues to calibrate the bias correction in seasonal forecasts
  - Ward and Navarra – SVD of the joint vector of forecasts/observations for several forecasts, to pick out which leading-order model patterns correspond to which leading-order observed patterns

Summary
- IPCC Working Group II scientists use "multiple lines of evidence" to help users make adaptation decisions
- UKCP09 is a transparent synthesis of climate model data from the Met Office and elsewhere, plus observations
- Statistics provides a nice way to frame the problem, generate an algorithm, make sure we are not missing any terms, and gives us a language to discuss the problem
- A real challenge, though, is to develop the statistics to better represent complex behaviour that we understand physically, e.g. crossing a "threshold"

Any questions?

Weighting
Dots indicate:
- 280 values from the perturbed physics ensemble
- 12 values from the multimodel ensemble
- Observed value
Using 6 metrics reduces the risk of rewarding models for the wrong reasons, e.g. fortuitous compensation of errors

Second eigenvector of observed climate. A small subset of climate variables is shown.

Third eigenvector of observed climate. A small subset of climate variables is shown.

Comparing models with observations
- Use the Bayesian framework of Goldstein and Rougier (2004): "posterior PDF = prior PDF × likelihood"
- Use six "large-scale" metrics to define the likelihood
- Skill of a model is the likelihood of the model data given some observations
- V = obs uncertainty + emulator error + discrepancy
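A minimal sketch of that likelihood, with the total variance V assembled from the three terms named above. The six-dimensional vectors and variance magnitudes here are hypothetical; in UKCP09 the metrics are projections onto eigenvectors of observed climate.

```python
import numpy as np

def log_likelihood(obs, emulated_mean, V_obs, V_emulator, V_discrepancy):
    """Gaussian log-likelihood of the observations given emulated model output,
    with total variance V = obs uncertainty + emulator error + discrepancy."""
    V = V_obs + V_emulator + V_discrepancy
    resid = obs - emulated_mean
    sign, logdet = np.linalg.slogdet(2.0 * np.pi * V)
    return -0.5 * (logdet + resid @ np.linalg.solve(V, resid))

# Hypothetical 6-metric example: adding discrepancy inflates V
obs = np.zeros(6)
emulated = np.full(6, 0.5)                  # an imperfect model variant
V_obs = 0.1 * np.eye(6)
V_em = 0.05 * np.eye(6)
ll_no_disc = log_likelihood(obs, emulated, V_obs, V_em, 0.0 * np.eye(6))
ll_disc = log_likelihood(obs, emulated, V_obs, V_em, 0.3 * np.eye(6))
```

With the discrepancy term included, the likelihood gap between well-fitting and poorly fitting variants shrinks, which is how the method avoids over-confidently weighting out parts of parameter space.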

Testing robustness Projections inevitably depend on expert assumptions and choices However, sensitivities to some key choices can be tested Changes for Wales, 2080s relative to

Reducing different sources of uncertainty?
Uncertainties in winter precipitation changes for the 2080s relative to , at a 25km box in SE England
New information, methods and experimental design can reduce uncertainty, so projections will change in future and decision makers need to consider this


Adjusting future temperatures Consider surface energy balance
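The slide's starting point, the surface energy balance, in its generic textbook form (not necessarily the exact formulation used in the new adjustment method):

```latex
R_n = H + \lambda E + G
```

where $R_n$ is net radiation, $H$ the sensible heat flux, $\lambda E$ the latent heat flux and $G$ the ground heat flux.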

Comparison of methods
Legend: raw QUMP data (+), UKCP09 method, new energy-balance-based method

Interannual results vs. 30-year mean results
Predictions are for 30-year means, so should not be compared to annual climate anomalies.
Summer % rainfall change: a) interannual over SE England from 17 runs; b) time-dependent percentiles of the 30-year mean at DEFRA

Effect of historical discrepancy on weighting
[Figure: weight distributions with discrepancy included vs excluded, estimated from a sample size of 50000]

Discrepancy – a schematic of what it does
Avoids contradictions in subsequent analyses when some observations have been allowed to constrain the problem too strongly.

UKCP09 aerosol forcing uncertainty
From the IPCC Fourth Assessment Report (Fig. 2.20, AR4): total aerosol forcing in 2005
ΔQ (W m⁻²): aerosol + solar + volcanic + ozone
Sample of UKCP09 A1B-GHG forcing (note the different scales)
Aerosol forcing is found to be inversely proportional to climate sensitivity, and this, along with perturbations to the sulphur cycle, implies a distribution of aerosol forcing uncertainty in UKCP09

Model imperfections in Bayesian prediction (Goldstein and Rougier 2004)
- Define discrepancy as a measure of the extent to which model imperfections could affect the response. Assumes there exists a best choice of parameter values
- Discrepancy is a variance: it measures how informative the climate model is. A perfect model has zero discrepancy
- Discrepancy inflates the PDFs of the prediction variables
- Discrepancy makes it more difficult to discern a good-quality model from a poor-quality one, and so avoids over-confidence in weighting out poor parts of parameter space
- But how to specify it?

Bayesian prediction – Goldstein and Rougier
Aim is to construct the joint probability distribution p(x, m_h, m_f, y_h, y_f, o, d) of all uncertain objects in the problem:
- Input parameters (x)
- Historical and future model output (m_h, m_f)
- True climate (y_h, y_f)
- Observations (o)
- Model imperfections (d)
Probability here is a measure of how strongly a given value of climate change is supported by the evidence (model projections, observations, expert judgements informed by understanding)

Constraining predictions
Weighting is particularly effective if there exists a strong relationship between a historical climate variable and a parameter AND between that parameter and a future climate variable. So weighting can still have a different effect on different prediction variables.
[Figure, with the observed value marked]

We use 6 eigenvectors
Weights come from the likelihood function. Comparison of weight distributions, varying the dimensionality of the likelihood function, for Monte Carlo samples of 1 million points.