Forecasting (JY Le Boudec)
Contents
1. What is forecasting?
2. Linear Regression
3. Avoiding Overfitting
4. Differencing
5. ARMA models
6. Sparse ARMA models
7. Case Studies
1. What is forecasting?
Assume you have been able to define the nature of the load; it remains to get an idea of its intensity. It is impossible to forecast without error. The good engineer should:
forecast what can be forecast
give uncertainty intervals
The rest is outside our control.
2. Linear Regression
Simple, for simple cases. Based on extrapolating the explanatory variables.
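The regression examples on the following slides are figures and are not reproduced in this transcript. As a minimal sketch of the idea (in Python with statsmodels, an assumption about tooling; the course itself works in Matlab), with made-up data, trend, and horizon: fit the model on the observed points, extrapolate the explanatory variable, and read off point forecasts with prediction intervals.

import numpy as np
import statsmodels.api as sm

# Hypothetical data: a noisy linear trend observed at times 1..100.
rng = np.random.default_rng(0)
t = np.arange(1, 101)
y = 2.0 + 0.5 * t + rng.normal(scale=3.0, size=t.size)

# Fit y = b0 + b1*t by ordinary least squares.
X = sm.add_constant(t)
fit = sm.OLS(y, X).fit()

# Extrapolate the explanatory variable to the forecast horizon.
t_future = np.arange(101, 111)
X_future = sm.add_constant(t_future)

# Point forecasts and 95% prediction intervals; the intervals account for
# both the estimation error and the noise on future observations.
pred = fit.get_prediction(X_future).summary_frame(alpha=0.05)
print(pred[["mean", "obs_ci_lower", "obs_ci_upper"]])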
Estimation and Forecasting
In practice we estimate the model parameters from y_1, …, y_t. When computing the forecast, we pretend the estimated parameters are the true ones, and thus make an estimation error. It is hoped that the estimation error is much smaller than the confidence interval of the forecast. In the case of linear regression, the theorem gives the global error exactly; in general, we won't have this luxury.
We saw this already: a case where estimation error versus prediction uncertainty can be quantified.
Prediction interval if the model is known; prediction interval accounting for estimation (t = 100 observed points).
3. The Overfitting Problem
The best model is not necessarily the one that fits best.
Prediction for the better model. This is the overfitting problem.
How to avoid overfitting
Method 1: use of test data
Method 2: information criterion
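As a hedged illustration of Method 2 (the exact criterion used in the course may differ), the sketch below fits polynomials of increasing degree to a made-up series and scores each with a simple Gaussian AIC; the degree with the smallest AIC is retained.

import numpy as np

rng = np.random.default_rng(1)
t = np.arange(50, dtype=float)
y = 10.0 + 0.3 * t + rng.normal(scale=2.0, size=t.size)   # hypothetical data
ts = (t - t.mean()) / t.std()          # rescale to keep high-degree polyfit well conditioned

def aic(rss, n, k):
    # Gaussian AIC with k regression coefficients plus the noise variance.
    return n * np.log(rss / n) + 2 * (k + 1)

scores = {}
for degree in range(0, 11):
    coeffs = np.polyfit(ts, y, degree)
    rss = np.sum((y - np.polyval(coeffs, ts)) ** 2)
    scores[degree] = aic(rss, len(y), degree + 1)

print("Best degree by AIC:", min(scores, key=scores.get))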
Best Model for Internet Data, polynomial of degree up to 2
d = 1
Best Model for Internet Data, polynomial of degree up to 10
4. Differencing the Data
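The worked differencing examples on the following slides are figures. As a minimal sketch of the operation itself (the period below is an arbitrary assumption, not taken from the slides): first-order differencing removes a linear trend, and differencing at the period removes a periodic component.

import numpy as np

rng = np.random.default_rng(2)
t = np.arange(96)
y = 0.5 * t + 3.0 * np.sin(2 * np.pi * t / 12) + rng.normal(size=t.size)   # hypothetical series

x = np.diff(y)                          # x[k] = y[k+1] - y[k], removes the linear trend

period = 12                             # assumed period
x_seasonal = y[period:] - y[:-period]   # differencing at the period removes the periodic part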
Point Predictions from Differenced Data
Background on Filters (Appendix B)
We need to understand how to use discrete filters. Example: write the Matlab command for …
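The filter that the slide asks about is on the image and is not reproduced here. As a generic, hedged illustration, Matlab's filter(b, a, x) has scipy.signal.lfilter(b, a, x) as a close Python equivalent; the filter below (plain first-order differencing) is only an example, not the one from the slide.

import numpy as np
from scipy.signal import lfilter

# A discrete filter is given by coefficients b (input side) and a (output side):
#   a[0]*y[n] = b[0]*x[n] + b[1]*x[n-1] + ... - a[1]*y[n-1] - ...
x = np.array([1.0, 2.0, 4.0, 7.0, 11.0])
b, a = [1.0, -1.0], [1.0]               # example filter: y[n] = x[n] - x[n-1]
y = lfilter(b, a, x)
print(y)                                # [1. 1. 2. 3. 4.]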
A simple filter. Q: compute X back from Y.
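The specific filter on this slide is not recoverable from the transcript; the sketch below invents one to show the mechanics of the question. If Y is obtained from X by y[n] = x[n] - 0.5*x[n-1], then X is recovered by running the inverse filter, which amounts to swapping the b and a coefficient vectors (this works because the example filter has a stable inverse).

import numpy as np
from scipy.signal import lfilter

x = np.array([3.0, 1.0, 4.0, 1.0, 5.0])

b, a = [1.0, -0.5], [1.0]               # forward filter (hypothetical): y[n] = x[n] - 0.5*x[n-1]
y = lfilter(b, a, x)

x_back = lfilter(a, b, y)               # inverse filter: x[n] = y[n] + 0.5*x_back[n-1]
print(np.allclose(x, x_back))           # True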
Impulse Response
A filter with stable inverse
How is this prediction done? This is all very intuitive.
Prediction assuming the differenced data is iid
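A minimal sketch of this prediction, under the stated assumption that the differenced series X_t = Y_t - Y_{t-1} is iid with mean mu and variance sigma^2 (and Gaussian, for the interval): the h-step point forecast is y_t + h*mu, the forecast error has variance h*sigma^2, so the 95% interval widens like sqrt(h). The series below is made up.

import numpy as np

rng = np.random.default_rng(3)
y = np.cumsum(0.4 + rng.normal(scale=1.0, size=200))   # hypothetical observed series

x = np.diff(y)                          # differenced data, assumed iid
mu, sigma = x.mean(), x.std(ddof=1)

h = np.arange(1, 11)                    # forecast horizons 1..10
point = y[-1] + h * mu                  # point forecasts
half = 1.96 * sigma * np.sqrt(h)        # 95% prediction interval half-widths
lower, upper = point - half, point + half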
Prediction Intervals
A prediction without prediction intervals is only a small part of the story. The financial crisis might have been avoided if investors had been aware of prediction intervals.
Compare the Two: linear regression with 3 parameters + variance, versus assuming the differenced data is iid.
5. Using ARMA Models
When the differenced data appears stationary but not iid.
Test of iid-ness
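The test displayed on this slide is a figure. In practice, one hedged way to test iid-ness is to look at the sample autocorrelations (for iid data, roughly 95% of them should fall within +/- 1.96/sqrt(n)) and at a portmanteau test such as Ljung-Box; the statsmodels calls below are an assumption about tooling, not what the course uses.

import numpy as np
from statsmodels.tsa.stattools import acf
from statsmodels.stats.diagnostic import acorr_ljungbox

rng = np.random.default_rng(4)
x = rng.normal(size=500)                # hypothetical differenced data

rho = acf(x, nlags=20)
bound = 1.96 / np.sqrt(len(x))
print("lags inside the iid bound:", np.sum(np.abs(rho[1:]) < bound), "of 20")

# Small p-values reject the white-noise (iid) hypothesis.
print(acorr_ljungbox(x, lags=[10, 20]))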
ARMA Process
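The definition on this slide is a figure. A standard ARMA(p, q) definition, written here with generic AR coefficients A_k and MA coefficients C_k (signs and coefficient names may differ slightly from the lecture notes), is:

(X_t - \mu) - \sum_{k=1}^{p} A_k (X_{t-k} - \mu) = \epsilon_t + \sum_{k=1}^{q} C_k \epsilon_{t-k}, \quad \epsilon_t \sim \text{iid } N(0, \sigma^2)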
ARMA Processes are Gaussian (but not iid)
ARIMA Process
Fitting an ARMA Process (the Box-Jenkins method)
Difference the data until it looks stationary
Examine the ACF to get a feeling for the order (p, q)
Fit an ARMA model using maximum likelihood
Fitting an ARIMA Model
Apply the scientific method:
1. Make the data stationary and normal (how?)
2. Bound the orders p, q
3. Fit an ARMA model to the centered data, i.e. Y_t − μ ~ ARMA
4. Compute the residuals and verify that they are white noise and normal
Fitting an ARMA model: the problem is, given the orders p, q and the (transformed) data (x_1, …, x_n), compute the parameters of an ARMA(p, q) model that maximize the likelihood.
Q: What are the parameters?
A: the mean μ, the coefficients of the A and C polynomials, and the noise variance σ².
This is a non-linear optimization problem
Maximizing the likelihood is a non-linear optimization problem, usually solved by iterative, heuristic algorithms; they may converge to a local maximum, or may not converge at all. Some simple, non-MLE heuristics exist for AR or MA models, e.g. fit the AR model whose theoretical ACF equals the sample ACF. Common practice is to bootstrap the optimization procedure by starting with a "best guess" AR or MA fit, using the heuristic above.
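In practice the optimization is delegated to a library. As a hedged sketch (statsmodels in Python, an assumption about tooling rather than the course's Matlab workflow), fitting an ARMA(p, q) by maximum likelihood on made-up stationary data looks like this; the library picks its own starting values and iterates.

import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(5)
x = rng.normal(size=300)                # stand-in for the transformed, stationary data

p, q = 2, 1                             # orders assumed given
res = ARIMA(x, order=(p, 0, q)).fit()   # ARMA(p, q) is ARIMA(p, 0, q); fit maximizes the likelihood

print(res.params)                       # mean (const), AR and MA coefficients, noise variance
print(res.aic)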
Fitting an ARMA model is the same as minimizing the one-step-ahead prediction error.
Best Model Order
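One hedged way to pick the order, in line with the information-criterion idea used earlier for regression, is to fit a small grid of candidate (p, q) pairs and keep the one with the smallest AIC; the grid and the data below are arbitrary.

import warnings
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(6)
x = rng.normal(size=300)                # stand-in for the differenced data

best = None
with warnings.catch_warnings():
    warnings.simplefilter("ignore")     # some candidate fits may warn about convergence
    for p in range(3):
        for q in range(3):
            aic = ARIMA(x, order=(p, 0, q)).fit().aic
            if best is None or aic < best[0]:
                best = (aic, p, q)

print("best (p, q) by AIC:", best[1], best[2])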
Check the Residuals
Example
Forecasting with ARMA
Assume Y_t is fitted to an ARMA process. The prediction problem is: given Y_1 = y_1, …, Y_t = y_t, find the conditional distribution of Y_{t+h}. We know it is normal, with a mean that depends on (y_1, …, y_t) and a variance that depends only on the fitted parameters of the ARMA process. There are many ways to compute this; it is readily done by Matlab.
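A hedged Python equivalent of that computation (the slide mentions Matlab; statsmodels exposes the same conditional mean and interval through its forecast machinery), on made-up data:

import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(7)
y = np.cumsum(rng.normal(size=300))     # hypothetical observations y_1, ..., y_t

res = ARIMA(y, order=(1, 1, 1)).fit()

h = 10
fc = res.get_forecast(steps=h)
point = fc.predicted_mean               # conditional means of Y_{t+1}, ..., Y_{t+h}
interval = fc.conf_int(alpha=0.05)      # 95% prediction intervals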
Forecasting Formulae for ARIMA
Y = original data; X = differenced data, fitted to an ARMA model.
1. Obtain point predictions for X using what we just saw
2. Apply Proposition 6.4.1 to obtain point predictions for Y
3. Apply the formula for the prediction interval
There are several other methods, but they may have numerical problems; see the comments in the lecture notes after Proposition 6.5.2.
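For step 2, a minimal sketch of the undifferencing idea (this is not claimed to be the exact statement of Proposition 6.4.1): when X_t = Y_t - Y_{t-1}, point forecasts for Y are obtained by cumulatively summing the point forecasts for X onto the last observed value of Y.

import numpy as np

y = np.array([10.0, 11.0, 13.0, 16.0, 20.0])   # hypothetical original data
x_hat = np.array([4.5, 5.0, 5.5])              # point forecasts for the differenced series

y_hat = y[-1] + np.cumsum(x_hat)               # Y_{t+h} = Y_t + X_{t+1} + ... + X_{t+h}
print(y_hat)                                   # [24.5 29.5 35. ]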
Improve the Confidence Interval if the Residuals Are Not Gaussian (but appear to be iid)
Assume the residuals are not Gaussian but are iid. How can we get confidence intervals? Bootstrap by sampling from the residuals.
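A minimal sketch of the residual bootstrap, shown here for the simple model where the differenced data is iid (for a fitted ARMA model the same idea applies by feeding resampled residuals through the fitted recursion): resample residuals with replacement, simulate many future trajectories, and read the prediction interval off the empirical quantiles.

import numpy as np

rng = np.random.default_rng(8)
y = np.cumsum(0.4 + rng.standard_t(df=3, size=200))   # hypothetical series, heavy-tailed increments

x = np.diff(y)
mu = x.mean()
residuals = x - mu                      # iid, but not Gaussian

h, n_boot = 10, 2000
paths = np.empty((n_boot, h))
for b in range(n_boot):
    increments = mu + rng.choice(residuals, size=h, replace=True)
    paths[b] = y[-1] + np.cumsum(increments)

point = paths.mean(axis=0)
lower, upper = np.percentile(paths, [2.5, 97.5], axis=0)   # 95% bootstrap prediction interval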
With bootstrap from residuals; with Gaussian assumption.
6. Sparse ARMA Models
Problem: avoid many parameters when the degree of the A and C polynomials is high, as in the previous example. Based on heuristics: multiplicative ARIMA, constrained ARIMA, Holt-Winters.
Holt-Winters Model 1: EWMA
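A minimal EWMA sketch (the smoothing weight below is arbitrary): the level is an exponentially weighted moving average of past observations, updated recursively, and it serves as the point forecast for every future horizon.

import numpy as np

def ewma_level(y, alpha=0.3):
    # level_t = alpha * y_t + (1 - alpha) * level_{t-1}; the final level is the forecast.
    level = y[0]
    for value in y[1:]:
        level = alpha * value + (1 - alpha) * level
    return level

y = np.array([12.0, 13.5, 12.8, 14.1, 13.9, 14.4])   # hypothetical data
print(ewma_level(y))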
EWMA is OK when there is no trend and no periodicity.
Sparse models give less accurate predictions, but have far fewer parameters and are simple to fit.
Constrained ARIMA (corrected or not)
7. Case Studies
h = 1
h = 2
log, h = 1
Conclusion
Forecasting is useful when savings matter, for example to save money on server space rental or to save energy. Capturing determinism is perhaps the most important and the easiest part. Prediction intervals are useful to avoid gross mistakes. Re-scaling the data may help. … now it is your turn.