1
23-27 Oct. 2006, NOAA 31st Annual Climate Diagnostics and Prediction Workshop

Predictability & Prediction of Seasonal Climate over North America

Lisa Goddard, Simon Mason, Ben Kirtman, Kelly Redmond, Randy Koster, Wayne Higgins, Marty Hoerling, Alex Hall, Jerry Meehl, Tom Delworth, Nate Mantua, Gavin Schmidt (US CLIVAR PPAI Panel)
2
Time Series of Prediction Skill
[Schematic: skill over time for operational forecasts, research forecasts, and potential predictability]
(1) Understand the limit of predictability.
(2) Identify conditional predictability (e.g., on the state of ENSO or the Indian Ocean).
(3) Document the expected skill, to judge the potential utility of the information for decision support.
(4) Set a baseline for testing improvements to prediction tools and methodologies.
(5) Set a target for real-time predictions.
(Courtesy of Arun Kumar & Ants Leetmaa)
3
Real-time prediction skill: North America, 1-month lead, seasonal terrestrial climate

Provide a template for verification
- What are the best metrics? Best for whom?
- Pros & cons of current metrics
- Can we capture important aspects of variability (e.g., trends, drought periods)?

Estimate skill of real-time forecasts
- How predictable is North American climate?
- What is the benefit of multi-model ensembling?

Provide a baseline against which we can judge future advances
- How best to archive/document for future comparison?
- Are we missing something (e.g., statistical models)?
4
Forecast Data

Dynamical models (single):
- CCCma – Canadian Centre for Climate Modelling and Analysis
- KMA – Korea Meteorological Administration
- MGO – Main Geophysical Observatory, Russia
- NASA/GMAO – National Aeronautics and Space Administration, USA
- RPN – Canadian Meteorological Centre
- ECHAM4.5 – MPI (run at IRI)
- CCM3.6 – NCAR (run at IRI)
- ECMWF – European Centre for Medium-Range Weather Forecasts
- Meteo-France – Meteorological Service, France
- LODYC – Laboratoire d'Océanographie Dynamique et de Climatologie, France
- Met Office – UK Meteorological Office
- MPI – Max Planck Institute for Meteorology, Germany
- CERFACS – European Centre for Research and Advanced Training in Scientific Computing, France
- INGV – Istituto Nazionale di Geofisica e Vulcanologia, Italy
- NOAA-CFS – National Oceanic and Atmospheric Administration, USA

Multi-model of dynamical models (simple average; see the sketch below)

Statistical models (from CPC): CCA, OCN (others?)

Multi-model of dynamical + statistical models
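The "simple average" multi-model combination lends itself to a very short implementation. Below is a minimal numpy sketch (the function name, array shapes, and sample data are hypothetical, not from the presentation): each model is first reduced to its own ensemble mean, so that models with larger ensembles do not dominate the average.

```python
import numpy as np

def multimodel_mean(model_ensembles):
    """Equal-weight multi-model combination: average each model's own
    ensemble mean, so models with more members are not over-weighted.
    Each input array has shape (members, years, lat, lon) on a common grid."""
    ensemble_means = [ens.mean(axis=0) for ens in model_ensembles]
    return np.mean(ensemble_means, axis=0)

# Two fake models with different ensemble sizes on a shared 36x72 grid
rng = np.random.default_rng(0)
model_a = rng.standard_normal((10, 21, 36, 72))   # 10-member ensemble
model_b = rng.standard_normal((24, 21, 36, 72))   # 24-member ensemble
mm = multimodel_mean([model_a, model_b])          # shape (21, 36, 72)
```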
5
Forecast Data

Model        | NX  | NY | NM | Lead (months) | Start dates
CCCma-GCM2   | 96  | 48 | 10 | 0.5-3.5       | Mar 1969 - Dec 2003, every 3 months
CCCma-GCM3   | 128 | 64 | 10 | 0.5-3.5       | Mar 1969 - Dec 2003, every 3 months
KMA          | 144 | 73 | 6  | 2.5-8.5       | Jan 1979 - Dec 2002
MGO          | 144 | 73 | 6  | 0.5-3.5       | Nov 1978 - Nov 2000, every 3 months
NASA-GMAO    | 144 | 90 | 6  | 1.5-3.5       | Feb 1993 - Nov 2002, every 3 months
RPN          | 192 | 96 | 10 | 0.5-3.5       | Mar 1969 - Dec 2000, every 3 months
ECHAM4.5     | 128 | 64 | 24 | 0.5-6.5       | Jan 1958 - Dec 2002
CCM3.6       | 128 | 64 | 24 | 0.5-6.5       | Jan 1958 - Dec 2002
ECMWF        | 144 | 71 | 9  | 0.5-5.5       | Feb 1958 - Nov 2001, every 3 months
Meteo-France | 144 | 71 | 9  | 0.5-5.5       | Feb 1958 - Nov 2001, every 3 months
LODYC        | 144 | 71 | 9  | 0.5-5.5       | Feb 1974 - Nov 2001, every 3 months
Met Office   | 144 | 71 | 9  | 0.5-5.5       | Feb 1959 - Nov 2001, every 3 months
MPI          | 144 | 71 | 9  | 0.5-5.5       | Feb 1969 - Nov 2001, every 3 months
CERFACS      | 144 | 71 | 9  | 0.5-5.5       | Feb 1980 - Nov 2001, every 3 months
INGV         | 144 | 71 | 9  | 0.5-5.5       | Feb 1973 - Nov 2001, every 3 months
CFS          | 192 | 94 | 15 | 0.5-8.5       | Jan 1981 - Dec 2003

(NX, NY: grid dimensions; NM: number of ensemble members.)
6
Forecast Data: JJA & DJF (1981-2001)

[Same model table as the previous slide; JJA and DJF seasons from the common 1981-2001 period are used in the verification that follows.]
7
Verification Data & Metrics

OBSERVATIONAL DATA (2.5 x 2.5 deg):
- 2-m temperature: CRU-TSv2.0 (1901-2002)
- Precipitation: CMAP (1979-2004)

VERIFICATION MEASURES
Metrics consistent with the WMO SVS for LRF (Standardised Verification System for Long-Range Forecasts).
Deterministic information:
- MSE & its decomposition
- Correlation, mean bias, & variance ratio
Probabilistic information:
- Reliability diagrams, regionally accumulated
- ROC areas for individual grid boxes
8
Mean Squared Error
9
Pro:
* Gives some estimate of the uncertainty in a forecast (i.e., via the RMSE).
Con:
* The frequency of large errors cannot be inferred unless precise distributional assumptions are met.
Recommendation:
* A simple graph or table showing the frequency of errors of different magnitudes may be more appropriate (see the sketch below).
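To make the metric and the recommendation concrete, here is a minimal numpy sketch (the function name, bin edges, and sample values are invented for illustration). It computes the MSE, checks the standard decomposition into squared bias, variances, and covariance, and tabulates the frequency of absolute errors by magnitude, as the recommendation suggests.

```python
import numpy as np

def mse_summary(fcst, obs, bin_edges=(0.5, 1.0, 2.0)):
    """MSE with its standard decomposition, plus a frequency table of
    absolute errors by magnitude (the slide's suggested alternative)."""
    err = fcst - obs
    mse = np.mean(err ** 2)
    # Decomposition: MSE = bias^2 + var(f) + var(o) - 2*cov(f, o)
    bias = err.mean()
    var_f, var_o = fcst.var(), obs.var()
    cov = np.mean((fcst - fcst.mean()) * (obs - obs.mean()))
    assert np.isclose(mse, bias ** 2 + var_f + var_o - 2 * cov)
    edges = np.concatenate(([0.0], bin_edges, [np.inf]))
    counts, _ = np.histogram(np.abs(err), bins=edges)
    return mse, np.sqrt(mse), counts  # counts[i]: errors in [edges[i], edges[i+1])

fc = np.array([0.3, -1.2, 2.1, 0.0, -0.4])
ob = np.array([0.1, -0.2, 0.9, 0.5, -2.0])
mse, rmse, counts = mse_summary(fc, ob)
print(f"MSE={mse:.2f}  RMSE={rmse:.2f}  |error| bin counts={counts}")
```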
10
Correlation: Temperature, DJF 1981-2001
11
Correlation: Temperature, JJA 1981-2001
12
Correlation: Precipitation, DJF 1981-2001
13
Correlation: Precipitation, JJA 1981-2001
14
Correlation

Pros:
* Commonly used; familiar
* Gives a simple overview of where models are likely to have skill

Con:
* Merely a measure of association, not of forecast accuracy

Recommendation:
* Avoid deterministic metrics
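For reference, a minimal numpy sketch of the per-grid-point correlation maps above (array names and shapes are assumptions): the Pearson correlation over the hindcast years at each grid box, between ensemble-mean forecasts and observations.

```python
import numpy as np

def correlation_map(fcst, obs):
    """Pearson correlation over the time axis at each grid point.
    fcst: ensemble-mean hindcasts, obs: observations, both (years, lat, lon)."""
    fa = fcst - fcst.mean(axis=0)
    oa = obs - obs.mean(axis=0)
    num = (fa * oa).sum(axis=0)
    den = np.sqrt((fa ** 2).sum(axis=0) * (oa ** 2).sum(axis=0))
    with np.errstate(divide="ignore", invalid="ignore"):
        return num / den  # NaN where a series is constant
```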
16
Example: Ensemble forecasts of above-median March-May rainfall over north-eastern Brazil
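A probability forecast of this kind is commonly derived as the fraction of ensemble members exceeding the climatological median. The sketch below illustrates this with invented numbers (the member values and median are not from the presentation):

```python
import numpy as np

def prob_above_median(ensemble, clim_median):
    """Forecast probability of an above-median season, estimated as the
    fraction of ensemble members exceeding the climatological median."""
    return (np.asarray(ensemble) > clim_median).mean(axis=0)

members = np.array([180., 220., 305., 150., 410., 260.])  # hypothetical MAM totals (mm)
print(prob_above_median(members, clim_median=250.))       # -> 0.5
```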
18
ROC Areas: DJF Temperature, below-normal category [maps; colour scale 0.1-0.9]
19
ROC Areas: DJF Temperature, above-normal category [maps; colour scale 0.1-0.9]
20
ROC Areas: JJA Temperature, above-normal category [maps; colour scale 0.1-0.9]
21
ROC Areas: JJA Temperature, below-normal category [maps; colour scale 0.1-0.9]
22
ROC Areas: DJF Precipitation, above-normal category [maps; colour scale 0.1-0.9]
23
ROC Areas: DJF Precipitation, below-normal category [maps; colour scale 0.1-0.9]
24
ROC Areas: JJA Precipitation, above-normal category [maps; colour scale 0.1-0.9]
25
ROC Areas: JJA Precipitation, below-normal category [maps; colour scale 0.1-0.9]
26
ROC Areas

Pros:
* Can treat probabilistic forecasts
* Can be provided point-wise
* Can distinguish 'asymmetric' skill (e.g., skill for below-normal but not above-normal events)

Con:
* Fails to address reliability
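A minimal sketch of the point-wise ROC-area computation (the function and sample values are hypothetical): the ROC area equals the probability that a randomly chosen event year received a higher forecast probability than a randomly chosen non-event year, which the Mann-Whitney U statistic gives directly.

```python
import numpy as np
from scipy.stats import rankdata

def roc_area(prob, event):
    """ROC area for one grid box and one category.
    prob: forecast probabilities; event: 1 if the category verified, else 0.
    Ties in the probabilities count one-half via average ranks."""
    prob = np.asarray(prob, float)
    event = np.asarray(event, bool)
    n1, n0 = event.sum(), (~event).sum()
    ranks = rankdata(prob)
    u = ranks[event].sum() - n1 * (n1 + 1) / 2
    return u / (n1 * n0)

p = [0.1, 0.4, 0.35, 0.8, 0.7]
y = [0, 0, 1, 1, 1]
print(roc_area(p, y))  # 0.833...
```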
27
RELIABILITY
29
Reliability

Pros:
* Treats probabilistic forecasts
* Relatively easy to interpret
* Provides the most relevant information on the usability of forecast information over time

Con:
* Difficult to provide for individual grid points, especially for short time samples, so diagrams are usually accumulated over regions
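A minimal sketch of the regionally accumulated reliability curve (the names and the 10-bin choice are assumptions): pool the (forecast probability, outcome) pairs over all grid points and years in a region, then compare the mean forecast probability in each bin with the observed relative frequency. Points on the diagonal indicate reliable forecasts.

```python
import numpy as np

def reliability_curve(prob, event, n_bins=10):
    """Pool forecast probabilities and binary outcomes over a region,
    bin by forecast probability, and return per-bin mean probability,
    observed relative frequency, and sample counts."""
    prob, event = np.ravel(prob), np.ravel(event)
    bins = np.linspace(0.0, 1.0, n_bins + 1)
    idx = np.clip(np.digitize(prob, bins) - 1, 0, n_bins - 1)
    mean_p = np.full(n_bins, np.nan)
    obs_freq = np.full(n_bins, np.nan)
    counts = np.zeros(n_bins, int)
    for b in range(n_bins):
        sel = idx == b
        counts[b] = sel.sum()
        if counts[b]:
            mean_p[b] = prob[sel].mean()
            obs_freq[b] = event[sel].mean()
    return mean_p, obs_freq, counts  # plot obs_freq vs mean_p; diagonal = perfect
```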
30
Temperature Trends over North America: %-Area Covered by "Above-Normal"
31
Temperature Trends over North America: %-Area Covered by "Above-Normal"
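One plausible implementation of the "%-area covered" diagnostic is sketched below (the cosine-latitude weighting and optional land mask are assumptions, not stated on the slide): the area-weighted fraction of grid boxes falling in the above-normal tercile for a given season.

```python
import numpy as np

def pct_area_above_normal(above, lat, mask=None):
    """Percent of area in the above-normal tercile for one season.
    above: (lat, lon) boolean field; lat: latitudes in degrees;
    mask: optional (lat, lon) weights, e.g. a land mask.
    Cosine-latitude weighting keeps high-latitude cells from being over-counted."""
    w = np.cos(np.deg2rad(lat))[:, None] * np.ones(above.shape)
    if mask is not None:
        w = w * mask
    return 100.0 * (w * above).sum() / w.sum()
```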
32
Observed Precipitation over North America, 1998-2001
[Panels: anomalies relative to 1981-1997; percent difference relative to 1981-1997; frequency (number of years out of 4) of precipitation in the below-normal category, for JJA and DJF]
33
Frequency of Below-Normal Precipitation, JJA 1998-2001 (observations) [map; categories: 1 in 4, 2 in 4, 3 in 4, 4 in 4]
34
Frequency of Below-Normal Precipitation, DJF 1998-2001 (observations) [map; categories: 1 in 4, 2 in 4, 3 in 4, 4 in 4]
35
Summary What’s an appropriate template? - Skill metrics should be flexible (i.e. user defined “events”, categories, thresholds) - Probabilistic forecasts must be treated probabilistically!!! How are we doing? - Could be better. Encouraging performance estimates by some measures, but inadequate performance on important aspects of climate variability. - Missing elements necessary for seasonal prediction? Baseline??