1
EUMETCAL NWP-course 2007: The Concept of Ensemble Forecasting Renate Hagedorn European Centre for Medium-Range Weather Forecasts The General Concept of Ensemble Forecasting in Theory and Practice
2
EUMETCAL NWP-course 2007: The Concept of Ensemble Forecasting Goals Students to learn: Why & how are probabilistic forecasts produced & used? Teacher to learn: What are your greatest needs & expectations from an EPS? Achieve together: What is the best way forward to integrate uncertainty information as an integral component into public weather forecasts?
3
EUMETCAL NWP-course 2007: The Concept of Ensemble Forecasting Outline Why do we need Ensemble Prediction Systems? Chaos theory and its consequences for weather prediction How are probabilistic forecasts made in practice? How do we represent uncertainties? From ensemble members to PDFs and CDFs Good ensembles – bad ensembles? How to verify probabilistic forecasts
4
EUMETCAL NWP-course 2007: The Concept of Ensemble Forecasting The Philosophical Point of View… To know what you know, and to know what you do not know, that is real knowledge Confucius The Analects
5
EUMETCAL NWP-course 2007: The Concept of Ensemble Forecasting The Practical Point of View… “No forecast is complete without a forecast of forecast skill”, or: “Predicting predictability is as important as predicting rainfall”
6
EUMETCAL NWP-course 2007: The Concept of Ensemble Forecasting The Practical Point of View… “No forecast is complete without a forecast of forecast skill”, or: “Predicting predictability is as important as predicting rainfall” WHY? Weather forecasts have errors (are uncertain) The ultimate goal of weather forecasting is to improve user decisions, i.e. decision-making based on forecast information should be superior to decision-making without forecast information Decision-making can be improved when uncertainty information is available
7
EUMETCAL NWP-course 2007: The Concept of Ensemble Forecasting Weather Forecasting: How does it work? Numerical model to describe the processes in the earth system
8
EUMETCAL NWP-course 2007: The Concept of Ensemble Forecasting Weather Forecasting: How does it work? Observations to start the forecast
9
EUMETCAL NWP-course 2007: The Concept of Ensemble Forecasting Weather Forecasting: How does it work? Computer Observations Model
10
EUMETCAL NWP-course 2007: The Concept of Ensemble Forecasting Deterministic Forecasting Is this forecast “correct”? [Figure: a single temperature forecast trajectory vs. forecast time, starting from the initial condition; labels: Initial Uncertainty, Model Error]
11
EUMETCAL NWP-course 2007: The Concept of Ensemble Forecasting The Lorenz Attractor “… one flap of a sea-gull’s wing may forever change the future course of the weather” (Lorenz, 1963)
12
EUMETCAL NWP-course 2007: The Concept of Ensemble Forecasting The Lorenz Attractor… …is the visualization of the time-evolution of a three-dimensional non-linear dynamical system described by the ‘Lorenz-63’ equations
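To make the sensitivity to initial conditions concrete, here is a minimal Python sketch (not part of the original slides) that integrates the Lorenz-63 equations from two almost identical initial conditions; the parameter values σ = 10, ρ = 28, β = 8/3 are the standard Lorenz-63 choices, and the integrator, step size and initial states are chosen purely for illustration.

```python
import numpy as np

def lorenz63(state, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Time derivative of the Lorenz-63 system."""
    x, y, z = state
    return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

def integrate(state, dt=0.01, nsteps=2000):
    """Simple 4th-order Runge-Kutta integration, returning the full trajectory."""
    traj = np.empty((nsteps + 1, 3))
    traj[0] = state
    for n in range(nsteps):
        s = traj[n]
        k1 = lorenz63(s)
        k2 = lorenz63(s + 0.5 * dt * k1)
        k3 = lorenz63(s + 0.5 * dt * k2)
        k4 = lorenz63(s + dt * k3)
        traj[n + 1] = s + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)
    return traj

# Two almost identical initial conditions ("analysis" and a tiny perturbation)
control = integrate(np.array([1.0, 1.0, 1.0]))
perturbed = integrate(np.array([1.0, 1.0, 1.0 + 1e-6]))

# The separation grows roughly exponentially until it saturates at the attractor size
separation = np.linalg.norm(control - perturbed, axis=1)
print(separation[::500])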
13
EUMETCAL NWP-course 2007: The Concept of Ensemble Forecasting Scientific basis for Ensemble Predictions In a non-linear dynamical system, the growth of uncertainties in initial conditions is flow dependent [Figure: initial-condition (IC) ensembles on the Lorenz attractor, evolving towards the ‘cold’ or the ‘warm’ regime]
14
EUMETCAL NWP-course 2007: The Concept of Ensemble Forecasting Ensemble Forecasting Complete description of weather prediction in terms of a Probability Density Function (PDF) [Figure: ensemble of temperature forecasts vs. forecast time, spreading out from the initial condition]
15
EUMETCAL NWP-course 2007: The Concept of Ensemble Forecasting Flow dependence of forecast errors If the forecasts are coherent (small spread) the atmosphere is in a more predictable state than if the forecasts diverge (large spread) [Figure: ensemble plumes for 26th June 1995 and 26th June 1994]
16
EUMETCAL NWP-course 2007: The Concept of Ensemble Forecasting Goal of Ensemble Prediction Represent/predict the uncertainty of the prediction Move from a deterministic to a probabilistic forecast The ensemble spread should capture the “truth” (spread ~ RMS error) and indicate the range of uncertainty
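The “spread ~ RMS error” statement can be checked numerically; the arrays below are synthetic stand-ins for ensemble forecasts and verifying observations, not ECMWF data.

```python
import numpy as np

# Hypothetical data: 50 ensemble members at 1000 verification points,
# plus the corresponding observations (or analysis values).
rng = np.random.default_rng(0)
truth = rng.normal(size=1000)
members = truth + rng.normal(scale=1.2, size=(50, 1000))

ens_mean = members.mean(axis=0)
spread = members.std(axis=0, ddof=1).mean()        # average ensemble spread
rmse = np.sqrt(np.mean((ens_mean - truth) ** 2))   # RMS error of the ensemble mean

# For a well-calibrated ensemble the two numbers should be of similar magnitude
print(f"spread = {spread:.2f}, RMSE of ensemble mean = {rmse:.2f}")
```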
17
EUMETCAL NWP-course 2007: The Concept of Ensemble Forecasting Outline Why do we need Ensemble Prediction Systems? Chaos theory and its consequences for weather prediction How are probabilistic forecasts made in practice? How do we represent uncertainties? From ensemble members to PDFs and CDFs Good ensembles – bad ensembles? How to verify probabilistic forecasts
18
EUMETCAL NWP-course 2007: The Concept of Ensemble Forecasting Sources of uncertainties Initial conditions: limited accuracy of observations and data assimilation Run an ensemble of forecasts from slightly different initial conditions. Initial perturbations generated via the singular vector technique, breeding vectors, ETKF, etc.
19
EUMETCAL NWP-course 2007: The Concept of Ensemble Forecasting Initial Perturbations Problem: How to construct “good” initial perturbations, given that only a limited number of integrations can be carried out? Solution: Find initial perturbations with maximum amplification rate Singular vector approach: Perturbations with the fastest growth over a finite time interval (SV) can be identified by solving an eigenvalue problem involving the product of the tangent-linear forward and adjoint model propagators
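As a sketch of what “solving an eigenvalue problem” means here, the textbook singular-vector formulation in a simple inner product can be written as follows (the operational ECMWF setup additionally uses a total-energy norm and projection operators, which are omitted in this sketch):

```latex
% Perturbation growth over the optimisation interval [0, t],
% with M the tangent-linear propagator and M^T its adjoint:
x(t) = \mathbf{M}\,x(0), \qquad
\lambda^2 = \frac{\langle \mathbf{M}x,\;\mathbf{M}x\rangle}{\langle x,\;x\rangle}
          = \frac{\langle x,\;\mathbf{M}^{\mathrm{T}}\mathbf{M}\,x\rangle}{\langle x,\;x\rangle}
\quad\Longrightarrow\quad
\mathbf{M}^{\mathrm{T}}\mathbf{M}\,x = \lambda^2\,x .
% The leading eigenvectors (the singular vectors of M) are the
% fastest-growing perturbations over the chosen time interval.
```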
20
EUMETCAL NWP-course 2007: The Concept of Ensemble Forecasting Example of initial perturbations 21/03/2006 00UTC, Temperature (contour interval 0.2 K) at ~700 hPa, 50°N
21
EUMETCAL NWP-course 2007: The Concept of Ensemble Forecasting Sources of uncertainties Initial conditions: limited accuracy of observations and data assimilation → run an ensemble of forecasts from slightly different initial conditions; initial perturbations generated via the singular vector technique, breeding vectors, ETKF, etc. Model error: parameterisations (how to represent unresolved processes) → stochastic physics approach; physical parameter values (inaccurate knowledge of parameter space) → perturbed parameter approach; model structure (how to represent physical processes in models) → multi-model approach
22
EUMETCAL NWP-course 2007: The Concept of Ensemble Forecasting Stochastic Physics Background: assume that the parameterization is tuned to give the correct ensemble mean; account for statistical fluctuations using random numbers ECMWF implementation: assign random numbers in [0.5, 1.5] to 10° lat/lon boxes, multiply the model parameterization tendencies by these random numbers, assign new random numbers every 6 hours [Figure: area under the ROC curve for the event precipitation > 40 mm/day, with (SP: yes) and without (SP: no) stochastic physics; top curves: winter performance, bottom curves: summer performance; Buizza et al., 1999]
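A minimal sketch of the tendency-perturbation idea described above, assuming a simple regular 1° lat/lon grid and synthetic tendency values; this only illustrates drawing one random factor in [0.5, 1.5] per 10° box, and is not the operational ECMWF implementation.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical global grid (1° x 1°) of parameterized temperature tendencies [K/s]
nlat, nlon = 180, 360
tend = rng.normal(scale=1e-5, size=(nlat, nlon))

def stochastic_perturbation(nlat, nlon, box=10, low=0.5, high=1.5):
    """One random multiplier per 10-degree lat/lon box, replicated onto the grid."""
    r_boxes = rng.uniform(low, high, size=(nlat // box, nlon // box))
    return np.kron(r_boxes, np.ones((box, box)))

# Multiply the tendencies by the box-wise random factors
# (in the scheme described above, new factors would be drawn every 6 hours)
perturbed_tend = tend * stochastic_perturbation(nlat, nlon)
print(perturbed_tend.shape, perturbed_tend.std() / tend.std())
```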
23
EUMETCAL NWP-course 2007: The Concept of Ensemble Forecasting New Stochastic Physics (SPBS) [Figure: RMS error and spread of the u-component at 850 hPa in the Tropics, with and without SPBS; courtesy Judith Berner] under-dispersion reduced for all forecast ranges, RMS error reduced
24
EUMETCAL NWP-course 2007: The Concept of Ensemble Forecasting Sources of uncertainties Initial conditions: limited accuracy of observations and data assimilation → run an ensemble of forecasts from slightly different initial conditions; initial perturbations generated via the singular vector technique, breeding vectors, ETKF, etc. Model error: parameterisations (how to represent unresolved processes) → stochastic physics approach; physical parameter values (inaccurate knowledge of parameter space) → perturbed parameter approach; model structure (how to represent physical processes in models) → multi-model approach Boundary conditions: SST, soil moisture, sea ice, etc. Unknown changes in boundary conditions are a source of uncertainty; however, known (or well modelled) external forcing can be a source of predictability for extended range forecasts
25
EUMETCAL NWP-course 2007: The Concept of Ensemble Forecasting Outline Why do we need Ensemble Prediction Systems? Chaos theory and its consequences for weather prediction How are probabilistic forecasts made in practice? How do we represent uncertainties? From ensemble members to PDFs and CDFs Good ensembles – bad ensembles? How to verify probabilistic forecasts
26
EUMETCAL NWP-course 2007: The Concept of Ensemble Forecasting Ensemble Prediction System 1 control run + 50 perturbed runs (TL399 L62) added dimension of ensemble members: f(x,y,z,t,e) How do we deal with the added dimension when interpreting, verifying and using EPS output?
27
EUMETCAL NWP-course 2007: The Concept of Ensemble Forecasting From ensembles to PDFs
28
EUMETCAL NWP-course 2007: The Concept of Ensemble Forecasting Count members per bin
29
EUMETCAL NWP-course 2007: The Concept of Ensemble Forecasting Discrete probability distribution
30
EUMETCAL NWP-course 2007: The Concept of Ensemble Forecasting Continuous probability density function
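The “count members per bin” step of the previous slides can be written down directly; the 51 member values below are synthetic and the 2-degree bins are an arbitrary illustrative choice.

```python
import numpy as np

# Hypothetical 51-member ensemble forecast of 2 m temperature [°C]
rng = np.random.default_rng(1)
members = rng.normal(loc=10.0, scale=2.0, size=51)

# "Count members per bin": a discrete probability distribution
bins = np.arange(0, 21, 2)                      # 2-degree bins from 0 to 20 °C
counts, edges = np.histogram(members, bins=bins)
probs = counts / counts.sum()
for lo, hi, p in zip(edges[:-1], edges[1:], probs):
    print(f"P({lo:4.1f} <= T < {hi:4.1f}) = {p:.2f}")

# Empirical CDF: fraction of members at or below a threshold
def cdf(threshold):
    return np.mean(members <= threshold)

print("P(T <= 10 °C) ≈", cdf(10.0))
```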
31
EUMETCAL NWP-course 2007: The Concept of Ensemble Forecasting Continuous PDF f(x) is the “probability density” function, or PDF; μ is the “mean”; σ is the “standard deviation” (for the Gaussian examples on the following slides: f(x) = 1/(σ√(2π)) · exp(−(x−μ)²/(2σ²)))
32
EUMETCAL NWP-course 2007: The Concept of Ensemble Forecasting Continuous PDFs Which forecast has a mean of 10 degrees Celsius?
33
EUMETCAL NWP-course 2007: The Concept of Ensemble Forecasting Continuous PDFs Which forecast has the highest σ (standard deviation)?
34
EUMETCAL NWP-course 2007: The Concept of Ensemble Forecasting Continuous PDFs μ = 0.0, σ = 0.5; μ = 10.0, σ = 1.0; μ = 15.0, σ = 2.0
35
EUMETCAL NWP-course 2007: The Concept of Ensemble Forecasting What information can we get from PDFs? PDFs describe the probability density of a continuous spectrum of possible outcomes Probability density describes the relative likelihood to be near a particular value We have to distinguish between: Continuous events: unlimited number of possible outcomes (temperature, wind speed, …) Discrete events: limited number of possible outcomes (rain/no-rain, temperature below/above freezing, …) Probabilities are only meaningful for discrete events P(9≤T≤11) can be determined from the PDF (or CDF) P(T=10°C) = 0
36
EUMETCAL NWP-course 2007: The Concept of Ensemble Forecasting Probabilities related to event
37
EUMETCAL NWP-course 2007: The Concept of Ensemble Forecasting Probabilities related to event
38
EUMETCAL NWP-course 2007: The Concept of Ensemble Forecasting Probability density function
39
EUMETCAL NWP-course 2007: The Concept of Ensemble Forecasting From PDFs to probabilities
40
EUMETCAL NWP-course 2007: The Concept of Ensemble Forecasting From PDFs to CDFs
41
EUMETCAL NWP-course 2007: The Concept of Ensemble Forecasting Probabilities from a CDF What is the probability that the temperature will be 10°C?
42
EUMETCAL NWP-course 2007: The Concept of Ensemble Forecasting Probabilities from a CDF What is the probability that the temperature will be 10°C? P(X=10) = 0
43
EUMETCAL NWP-course 2007: The Concept of Ensemble Forecasting Probabilities from a CDF What is the probability that the temperature will be ≤10°C?
44
EUMETCAL NWP-course 2007: The Concept of Ensemble Forecasting Probabilities from a CDF What is the probability that the temperature will be ≤10°C? P(X≤10) = 0.5
45
EUMETCAL NWP-course 2007: The Concept of Ensemble Forecasting Probabilities from a CDF What is the probability that the temperature will be >11°C?
46
EUMETCAL NWP-course 2007: The Concept of Ensemble Forecasting Probabilities from a CDF What is the probability that the temperature will be >11°C? P(X>11) = P(X≤∞) – P(X≤11) = 1.0 – 0.85 = 0.15
47
EUMETCAL NWP-course 2007: The Concept of Ensemble Forecasting Probabilities from a CDF What is the probability that the temperature will be between 9-11°C?
48
EUMETCAL NWP-course 2007: The Concept of Ensemble Forecasting Probabilities from a CDF What is the probability that the temperature will be between 9-11°C? P(9≤X≤11) = P(X≤11) – P(X<9) = 0.85 – 0.15 = 0.70
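The same arithmetic can be reproduced with any CDF; in the sketch below a normal distribution with mean 10 °C is used, with σ ≈ 0.97 chosen purely so that the numbers come out close to those read off the slides' CDF.

```python
from scipy.stats import norm

# Illustrative CDF: a normal distribution with mean 10 °C; sigma = 0.97 is chosen
# so that the probabilities roughly match the values read off the slide's CDF.
T = norm(loc=10.0, scale=0.97)

print("P(T = 10)        =", 0.0)                          # a point value has zero probability
print("P(T <= 10)       =", T.cdf(10.0))                  # ~0.50
print("P(T > 11)        =", 1.0 - T.cdf(11.0))            # ~0.15
print("P(9 <= T <= 11)  =", T.cdf(11.0) - T.cdf(9.0))     # ~0.70
```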
49
EUMETCAL NWP-course 2007: The Concept of Ensemble Forecasting From PDFs to CDFs
50
EUMETCAL NWP-course 2007: The Concept of Ensemble Forecasting Ensemble Prediction System 1 control run + 50 perturbed runs (TL399 L62) added dimension of ensemble members: f(x,y,z,t,e) How do we deal with the added dimension when interpreting, verifying and using EPS output? Transition from forecasting local events (22°C) to categorical events (>20°C), and from deterministic (yes/no) to probabilistic (x%)
51
EUMETCAL NWP-course 2007: The Concept of Ensemble Forecasting Outline Why do we need Ensemble Prediction Systems? Chaos theory and its consequences for weather prediction How are probabilistic forecasts made in practice? How do we represent uncertainties? From ensemble members to PDFs and CDFs Good ensembles – bad ensembles? How to verify probabilistic forecasts
52
EUMETCAL NWP-course 2007: The Concept of Ensemble Forecasting Objective of diagnostic/verification tools Assessing the goodness of a forecast system involves determining skill and value of forecasts
53
EUMETCAL NWP-course 2007: The Concept of Ensemble Forecasting Objective of diagnostic/verification tools Assessing the goodness of a forecast system involves determining skill and value of forecasts A forecast has skill if it predicts the observed conditions well according to some objective or subjective criteria. A forecast has value if it helps the user to make better decisions than without knowledge of the forecast.
54
EUMETCAL NWP-course 2007: The Concept of Ensemble Forecasting Objective of diagnostic/verification tools Assessing the goodness of a forecast system involves determining skill and value of forecasts A forecast has skill if it predicts the observed conditions well according to some objective or subjective criteria. A forecast has value if it helps the user to make better decisions than without knowledge of the forecast. Forecasts with poor skill can be valuable (e.g. location mismatch) Forecasts with high skill can be of little value (e.g. blue sky desert)
55
EUMETCAL NWP-course 2007: The Concept of Ensemble Forecasting Assessing the quality of a forecast The forecast indicated 10% probability for rain It did rain on the day Was it a good forecast? □ Yes □ No □ I don’t know Single probabilistic forecasts are never completely wrong or right (unless they give 0% or 100% probabilities) To evaluate a forecast system we need to look at a (large) number of forecast–observation pairs
56
EUMETCAL NWP-course 2007: The Concept of Ensemble Forecasting Assessing the quality of a forecast system Characteristics of a forecast system: Consistency*: Do the observations statistically belong to the distributions of the forecast ensembles? (consistent degree of ensemble dispersion) Reliability: Can I trust the probabilities to mean what they say? Sharpness: How much do the forecasts differ from the climatological mean probabilities of the event? Resolution: How much do the forecasts differ from the climatological mean probabilities of the event, and does the system get it right? Skill: Are the forecasts better than my reference system (chance, climatology, persistence, …)? * Note that terms like consistency, reliability etc. are not always well defined in verification theory and can be used with different meanings in other contexts
57
EUMETCAL NWP-course 2007: The Concept of Ensemble Forecasting Rank Histogram Rank Histograms assess whether the ensemble spread is consistent with the assumption that the observations are statistically just another member of the forecast distribution Check whether observations are equally distributed amongst the predicted ensemble Sort the ensemble members in increasing order and determine where the observation lies with respect to the ensemble members
58
EUMETCAL NWP-course 2007: The Concept of Ensemble Forecasting Rank Histogram Rank Histograms assess whether the ensemble spread is consistent with the assumption that the observations are statistically just another member of the forecast distribution Check whether observations are equally distributed amongst the predicted ensemble Sort the ensemble members in increasing order and determine where the observation lies with respect to the ensemble members [Example: Temperature → Rank 1 case]
59
EUMETCAL NWP-course 2007: The Concept of Ensemble Forecasting Rank Histogram Rank Histograms assess whether the ensemble spread is consistent with the assumption that the observations are statistically just another member of the forecast distribution Check whether observations are equally distributed amongst the predicted ensemble Sort the ensemble members in increasing order and determine where the observation lies with respect to the ensemble members [Examples: Temperature → Rank 1 case; Temperature → Rank 4 case]
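A minimal sketch of building a rank histogram from a verification sample, using synthetic forecasts and observations drawn from the same distribution (so the histogram should come out approximately flat):

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical verification sample: 51-member forecasts and observations at 5000 cases
ncases, nmem = 5000, 51
forecasts = rng.normal(size=(ncases, nmem))
obs = rng.normal(size=ncases)          # here the obs come from the same distribution

# Rank of the observation within the sorted ensemble (0 .. nmem), one rank per case
ranks = np.sum(forecasts < obs[:, None], axis=1)

# Rank histogram: for a consistent ensemble all nmem+1 ranks are equally populated
hist = np.bincount(ranks, minlength=nmem + 1) / ncases
print("expected frequency per rank:", 1.0 / (nmem + 1))
print("first/last rank frequencies:", hist[0], hist[-1])
```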
60
EUMETCAL NWP-course 2007: The Concept of Ensemble Forecasting Rank Histograms A uniform rank histogram is a necessary but not sufficient criterion for determining that the ensemble is reliable (see also: T. Hamill, 2001, MWR) OBS is indistinguishable from any other ensemble member OBS is too often below the ensemble members (biased forecast) OBS is too often outside the ensemble spread
61
EUMETCAL NWP-course 2007: The Concept of Ensemble Forecasting Reliability A forecast system is reliable if: statistically the predicted probabilities agree with the observed frequencies, i.e. taking all cases in which the event is predicted to occur with a probability of x%, that event should occur exactly in x% of these cases; not more and not less.
62
EUMETCAL NWP-course 2007: The Concept of Ensemble Forecasting Reliability A forecast system is reliable if: statistically the predicted probabilities agree with the observed frequencies, i.e. taking all cases in which the event is predicted to occur with a probability of x%, that event should occur exactly in x% of these cases; not more and not less. A reliability diagram displays whether a forecast system is reliable (unbiased) or produces over-confident / under-confident probability forecasts
63
EUMETCAL NWP-course 2007: The Concept of Ensemble Forecasting Reliability A forecast system is reliable if: statistically the predicted probabilities agree with the observed frequencies, i.e. taking all cases in which the event is predicted to occur with a probability of x%, that event should occur exactly in x% of these cases; not more and not less. A reliability diagram displays whether a forecast system is reliable (unbiased) or produces over-confident / under-confident probability forecasts A reliability diagram also gives information on the resolution (and sharpness) of a forecast system [Figure: forecast PDF vs. climatological PDF]
64
EUMETCAL NWP-course 2007: The Concept of Ensemble Forecasting Reliability Diagram Take a sample of probabilistic forecasts: e.g. 30 days x 2200 GP = 66000 forecasts How often was the event (T > 25) forecast with X probability?
FC Prob. | # FC | “perfect FC” OBS-Freq. | “real” OBS-Freq.
100%     | 8000 | 8000 (100%)            | 7200 (90%)
 90%     | 5000 | 4500 (90%)             | 4000 (80%)
 80%     | 4500 | 3600 (80%)             | 3000 (66%)
 ….      |      |                        |
 10%     | 5500 |  550 (10%)             |  800 (15%)
  0%     | 7000 |    0 (0%)              |  700 (10%)
65
EUMETCAL NWP-course 2007: The Concept of Ensemble Forecasting Reliability Diagram Take a sample of probabilistic forecasts: e.g. 30 days x 2200 GP = 66000 forecasts How often was the event (T > 25) forecast with X probability?
FC Prob. | # FC | “perfect FC” OBS-Freq. | “real” OBS-Freq.
100%     | 8000 | 8000 (100%)            | 7200 (90%)
 90%     | 5000 | 4500 (90%)             | 4000 (80%)
 80%     | 4500 | 3600 (80%)             | 3000 (66%)
 ….      |      |                        |
 10%     | 5500 |  550 (10%)             |  800 (15%)
  0%     | 7000 |    0 (0%)              |  700 (10%)
[Reliability diagram: OBS-Frequency (0–100%) plotted against FC-Probability (0–100%)]
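The same bookkeeping can be expressed in a few lines; the probability forecasts and outcomes below are synthetic and constructed to be deliberately over-confident, so the observed frequencies come out less extreme than the issued probabilities:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical sample: forecast probabilities (fractions of members predicting the event)
# and binary outcomes (1 = event occurred), with a deliberately over-confident forecast
n = 66000
p_fc = rng.uniform(size=n)
p_true = 0.5 + 0.7 * (p_fc - 0.5)              # true event probability is less extreme
obs = (rng.uniform(size=n) < p_true).astype(int)

# Bin the forecasts by issued probability and compute the observed frequency per bin
bins = np.linspace(0.0, 1.0, 11)               # 10%-wide probability bins
idx = np.digitize(p_fc, bins) - 1
for i in range(10):
    sel = idx == i
    print(f"FC prob {bins[i]:.1f}-{bins[i+1]:.1f}: "
          f"N = {sel.sum():5d}, observed frequency = {obs[sel].mean():.2f}")
```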
66
EUMETCAL NWP-course 2007: The Concept of Ensemble Forecasting Reliability Diagram over-confident model vs. perfect model
67
EUMETCAL NWP-course 2007: The Concept of Ensemble Forecasting Reliability Diagram under-confident model vs. perfect model
68
EUMETCAL NWP-course 2007: The Concept of Ensemble Forecasting Reliability diagram Reliability score (the smaller, the better): imperfect model vs. perfect model
69
EUMETCAL NWP-course 2007: The Concept of Ensemble Forecasting Components of the Brier Score N = total number of cases, I = number of probability bins, n_i = number of cases in probability bin i, f_i = forecast probability in probability bin i, o_i = frequency of the event being observed when forecast with f_i Reliability: forecast probability vs. observed relative frequencies, Reliability = (1/N) Σ_{i=1}^{I} n_i (f_i − o_i)²
70
EUMETCAL NWP-course 2007: The Concept of Ensemble Forecasting Sharpness Diagrams show the distribution of issued forecast probabilities [Figure: relative frequency vs. FC probability for Sample A and Sample B]
71
EUMETCAL NWP-course 2007: The Concept of Ensemble Forecasting Sharpness Diagrams show the distribution of issued forecast probabilities [Figure: relative frequency vs. FC probability for Sample A and Sample B] Which sample contains the sharper probability forecasts?
72
EUMETCAL NWP-course 2007: The Concept of Ensemble Forecasting Reliability diagram [Figure: reliability diagrams with poor resolution and with good resolution; the horizontal line marks the climatological frequency c] Reliability score (the smaller, the better) Resolution score (the bigger, the better) Size of the red bullets represents the number of forecasts in each probability category (sharpness)
73
EUMETCAL NWP-course 2007: The Concept of Ensemble Forecasting Components of the Brier Score N = total number of cases, I = number of probability bins, n_i = number of cases in probability bin i, f_i = forecast probability in probability bin i, o_i = frequency of the event being observed when forecast with f_i, c = frequency of the event being observed in the whole sample Reliability: forecast probability vs. observed relative frequencies, Reliability = (1/N) Σ_{i=1}^{I} n_i (f_i − o_i)² Resolution: ability to issue reliable forecasts close to 0% or 100%, Resolution = (1/N) Σ_{i=1}^{I} n_i (o_i − c)² Uncertainty: variance of the observed frequency in the sample, Uncertainty = c (1 − c) Brier Score = Reliability − Resolution + Uncertainty
74
EUMETCAL NWP-course 2007: The Concept of Ensemble Forecasting Brier Score The Brier score is a measure of the accuracy of probability forecasts, with p: forecast probability (fraction of members predicting the event) and o: observed outcome (1 if the event occurs; 0 if the event does not occur). Considering N forecast–observation pairs the BS is defined as: BS = (1/N) Σ_{k=1}^{N} (p_k − o_k)² BS varies from 0 (perfect deterministic forecasts) to 1 (perfectly wrong!) BS corresponds to the mean squared error of deterministic forecasts
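A small sketch computing the Brier score from synthetic forecast–observation pairs, together with the skill score against a climatological reference introduced on the next slide; the sample here is constructed to be perfectly reliable, so the BSS comes out positive.

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical sample of probability forecasts and binary outcomes
n = 10000
p = rng.uniform(size=n)                               # forecast probabilities
obs = (rng.uniform(size=n) < p).astype(int)           # outcomes consistent with p (reliable)

# Brier score: mean squared difference between probability and outcome
bs = np.mean((p - obs) ** 2)

# Brier skill score against a climatological reference forecast
c = obs.mean()                                        # climatological event frequency
bs_clim = np.mean((c - obs) ** 2)
bss = 1.0 - bs / bs_clim                              # > 0: better than climatology

print(f"BS = {bs:.3f}, BS_clim = {bs_clim:.3f}, BSS = {bss:.3f}")
```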
75
EUMETCAL NWP-course 2007: The Concept of Ensemble Forecasting Brier Skill Score Skill scores are used to compare the performance of forecasts with that of a reference forecast such as climatology or persistence Constructed so that a perfect FC takes the value 1 and the reference FC takes the value 0: Skill score = (score of current FC − score of reference FC) / (score of perfect FC − score of reference FC) positive (negative) BSS → better (worse) than the reference
76
EUMETCAL NWP-course 2007: The Concept of Ensemble Forecasting Assessing the quality of a forecast system Characteristics of a forecast system: Consistency: Do the observations statistically belong to the distributions of the forecast ensembles? (consistent degree of ensemble dispersion) Reliability: Can I trust the probabilities to mean what they say? Sharpness: How much do the forecasts differ from the climatological mean probabilities of the event? Resolution: How much do the forecasts differ from the climatological mean probabilities of the event, and does the system get it right? Skill: Are the forecasts better than my reference system (chance, climatology, persistence, …)? Verification tools: Rank Histogram, Reliability Diagram, Brier Skill Score
77
EUMETCAL NWP-course 2007: The Concept of Ensemble Forecasting Brier Score -> Ranked Probability Score The Brier Score is used for two-category (yes/no) situations (e.g. T > 15°C); the RPS takes into account the ordered nature of the variable (“extreme errors”) [Figure: forecast PDF f(y) and CDF F(y) over temperature categories]
78
EUMETCAL NWP-course 2007: The Concept of Ensemble Forecasting Ranked Probability Score [Figure: forecast PDF f(y) and CDF F(y) by category]
79
EUMETCAL NWP-course 2007: The Concept of Ensemble Forecasting Ranked Probability Score [Figure: forecast PDF f(y) and CDF F(y) by category]
80
EUMETCAL NWP-course 2007: The Concept of Ensemble Forecasting Ranked Probability Score [Figure: four example forecast PDFs compared with climatology] RPS = 0.01: sharp & accurate; RPS = 0.15: sharp, but biased; RPS = 0.05: not very sharp, slightly biased; RPS = 0.08: accurate, but not sharp
81
EUMETCAL NWP-course 2007: The Concept of Ensemble Forecasting Ranked Probability Score Measures the quadratic distance between forecast and verification probabilities, accumulated over the ordered probability categories k It is the average Brier score across the range of the variable The Ranked Probability Skill Score (RPSS) is a measure of skill relative to a reference forecast Emphasizes accuracy by penalizing large errors more than “near misses” Rewards a sharp forecast if it is accurate
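A minimal RPS sketch for a single forecast over K ordered categories, using the common convention of normalising by K − 1 (conventions differ) and invented category probabilities:

```python
import numpy as np

def rps(p_fc, obs_cat):
    """Ranked Probability Score for one forecast.

    p_fc    : forecast probabilities for the K ordered categories (sums to 1)
    obs_cat : index of the observed category
    Uses the common convention of normalising by K - 1.
    """
    p_fc = np.asarray(p_fc, dtype=float)
    k = p_fc.size
    cdf_fc = np.cumsum(p_fc)
    cdf_obs = np.zeros(k)
    cdf_obs[obs_cat:] = 1.0                  # observation as a step-function CDF
    return np.sum((cdf_fc - cdf_obs) ** 2) / (k - 1)

# Sharp and accurate forecast vs. a flat (climatology-like) forecast, event in category 2
print(rps([0.05, 0.15, 0.60, 0.15, 0.05], obs_cat=2))   # small RPS
print(rps([0.20, 0.20, 0.20, 0.20, 0.20], obs_cat=2))   # larger RPS
```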
82
EUMETCAL NWP-course 2007: The Concept of Ensemble Forecasting Example of RPSS for ECMWF’s EPS
83
EUMETCAL NWP-course 2007: The Concept of Ensemble Forecasting Goals Students to learn: Why & how are probabilistic forecasts produced & used? Teacher to learn: What are your greatest needs & expectations from an EPS? Achieve together: What is the best way forward to integrate uncertainty information as an integral component into public weather forecasts?
84
EUMETCAL NWP-course 2007: The Concept of Ensemble Forecasting Assignment-I What are your greatest needs and/or expectations from the probabilistic products of an EPS? Are there any areas in your day-to-day work which benefit from probabilistic forecasts (now or in the future)? Which aspect of the output of an EPS is most valuable for you? Can you think of any information valuable for you which is currently not available as an EPS product?
85
EUMETCAL NWP-course 2007: The Concept of Ensemble Forecasting Assignment-II How would you present a probabilistic weather forecast to the general public? Prepare one or more examples of a weather forecast containing probabilistic information for: TV Radio Newspaper Internet Governmental agency (weather warning) Commercial company … More hints will follow next week, but you might already start thinking about your general concept this week
86
EUMETCAL NWP-course 2007: The Concept of Ensemble Forecasting References and further reading Palmer, T. and R. Hagedorn (editors), 2006: Predictability of Weather and Climate. Cambridge University Press, 702 pp. Jolliffe, I.T. and D.B. Stephenson, 2003: Forecast Verification: A Practitioner's Guide in Atmospheric Science. Wiley, 240 pp. Wilks, D.S., 2006: Statistical Methods in the Atmospheric Sciences. 2nd ed., Academic Press, 627 pp. ECMWF Newsletter for updates on EPS performance Hamill, T., 2001: Interpretation of rank histograms for verifying ensemble forecasts. Monthly Weather Review, 129, 550-560. Buizza, R., Bidlot, J.-R., Wedi, N., Fuentes, M., Hamrud, M., Holt, G., and Vitart, F., 2007: The new ECMWF VAREPS (Variable Resolution Ensemble Prediction System). Q. J. Roy. Meteorol. Soc., 133, 681-695. Leutbecher, M. and T.N. Palmer, 2007: Ensemble forecasting. J. Comp. Phys., in press.
87
EUMETCAL NWP-course 2007: The Concept of Ensemble Forecasting Web links: ECMWF products and training: http://www.ecmwf.int/products/forecasts/d/charts http://www.ecmwf.int/newsevents/training/meteorological_presentations/MET_PR.html NCEP ensemble training: http://www.emc.ncep.noaa.gov/gmb/ens/training.html http://www.hpc.ncep.noaa.gov/ensembletraining/ Interactive learning on probabilities: http://www.shodor.org/interactivate/activities/