1
Statistics 153 Review - Sept 30, 2008
Notes re the 153 Midterm on October 2, 2008
1. The class will be split into two groups by first letter of surname. Letters A to L will take the exam in 330 Evans. Letters M to Z will take it in 340 Evans.
2. The exam will cover material through Chapter 4.
3. There will be 2 questions; answer both.
4. The questions will be like the Assignments', but the exam will be closed book: no books or notes allowed.
5. No questions to the proctors about the exam content, please. If unsure, make an interpretation, state it, and answer that.
6. You will have exactly 60 minutes to work.
7. The exam itself will be handed out. There will be space on it to answer the questions.
8. The solutions to Assignment 3 will be posted in the glass case, center corridor, third floor Evans, Tuesday after class, but the papers won't have been graded yet.
I will do a review in class September 30. Suggest some topics.
2
Name:________________________ October 2, 2008
MIDTERM EXAMINATION, Statistics 153, D. R. Brillinger
Answer both questions in the space provided. Show your work. If you are not sure of the meaning of a question, set down an interpretation and provide a reasonable answer. You have exactly 60 minutes for the exam.
Question 1. Let {Z_t} be a purely random process.
3
What is a time series? A sequence of numbers, x_t, indexed by time t.
4
Stat 153 - 11 Sept 2008 D. R. Brillinger
Simple descriptive techniques
Trend: X_t = α + βt + ε_t
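A minimal Python sketch (not from the slides; the data are simulated for illustration) of fitting the trend model X_t = α + βt + ε_t by ordinary least squares with numpy:

```python
import numpy as np

# hypothetical data: a linear trend plus noise
rng = np.random.default_rng(0)
t = np.arange(100)
x = 2.0 + 0.5 * t + rng.normal(scale=3.0, size=t.size)

# least-squares fit of X_t = alpha + beta*t + eps_t
beta, alpha = np.polyfit(t, x, deg=1)   # slope, then intercept
detrended = x - (alpha + beta * t)      # residuals eps_t
print(alpha, beta)
```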
5
Filtering/filters
y_t = Σ_{r=-q}^{s} a_r x_{t+r}
y_t = Σ_k h_k x_{t-k}   (p. 189)
This form carries stationary into stationary. Filters may be applied in series; stationarity is preserved if the filter is time invariant.
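A short sketch of a linear filter as a weighted moving sum, using numpy's convolve; the uniform 5-point weights are an illustrative choice, not from the slides:

```python
import numpy as np

def linear_filter(x, weights):
    """y = sum_r a_r x_{t+r} over each full window of len(weights) points."""
    return np.convolve(x, weights[::-1], mode="valid")

# simple 5-point moving average: a_r = 1/5 for r = -2, ..., 2
x = np.arange(20, dtype=float)
y = linear_filter(x, np.full(5, 1 / 5))
print(y[:3])   # smoothed values: 2.0, 3.0, 4.0
```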
6
Differencing
y_t = x_t - x_{t-1} = ∇x_t   "removes" linear trend
Seasonal variation model: X_t = m_t + S_t + ε_t, with S_t ≈ S_{t-s}
∇_{12} x_t = x_t - x_{t-12}, t in months
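A sketch of first and seasonal differencing on a simulated monthly series (the trend and seasonal amplitudes are assumed for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.arange(120)                       # e.g. 10 years of monthly data
x = 0.3 * t + 5 * np.sin(2 * np.pi * t / 12) + rng.normal(size=t.size)

d1 = np.diff(x, n=1)                     # y_t = x_t - x_{t-1}, removes the linear trend
d12 = x[12:] - x[:-12]                   # seasonal difference x_t - x_{t-12}
print(d1.mean(), d12.mean())
```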
7
Stationary case. Autocorrelation estimate at lag k:
r_k = Σ_{t=1}^{N-k} (x_t - x̄)(x_{t+k} - x̄) / Σ_{t=1}^{N} (x_t - x̄)²
Autocovariance estimate at lag k:
c_k = Σ_{t=1}^{N-k} (x_t - x̄)(x_{t+k} - x̄) / N
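The estimates above translate directly into code; a minimal numpy version (function names are my own):

```python
import numpy as np

def sample_acvf(x, k):
    """Autocovariance estimate c_k = sum_{t=1}^{N-k} (x_t - xbar)(x_{t+k} - xbar) / N."""
    x = np.asarray(x, dtype=float)
    N = x.size
    xbar = x.mean()
    return np.sum((x[: N - k] - xbar) * (x[k:] - xbar)) / N

def sample_acf(x, k):
    """Autocorrelation estimate r_k = c_k / c_0."""
    return sample_acvf(x, k) / sample_acvf(x, 0)

x = np.random.default_rng(2).normal(size=200)
print([round(sample_acf(x, k), 3) for k in range(1, 4)])  # near 0 for white noise
```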
8
Stat 153 - 16 Sept 2008 D. R. Brillinger
Chapter 3: mean function, variance function, autocovariance function
9
Strictly stationary: all joint distributions unaffected by a simple time shift. Second-order stationary: mean constant and autocovariance a function of the lag only.
10
Properties of autocovariance function
Does not identify model uniquely
11
Useful models and their acf's. Purely random process: the building block.
12
Random walk not stationary
13
(*) [equation slide; the defining equation labeled (*) was not captured in the transcript]
14
Moving average, MA(q). From (*): stationary.
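A sketch simulating an MA(2) process X_t = Z_t + β₁Z_{t-1} + β₂Z_{t-2} with assumed coefficients, checking that the sample acf is near zero beyond lag q = 2:

```python
import numpy as np

rng = np.random.default_rng(3)
N, beta = 5000, np.array([1.0, 0.6, 0.3])              # assumed MA(2) coefficients, beta_0 = 1
z = rng.normal(size=N + 2)                              # purely random process Z_t
x = beta[0]*z[2:] + beta[1]*z[1:-1] + beta[2]*z[:-2]    # X_t = sum_j beta_j Z_{t-j}

def acf(x, k):
    xbar = x.mean()
    return np.sum((x[:x.size - k] - xbar) * (x[k:] - xbar)) / np.sum((x - xbar) ** 2)

print([round(acf(x, k), 2) for k in range(1, 6)])       # lags 3, 4, 5 should be near 0
```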
15
Backward shift operator
Linear process. Need a convergence condition to be stationary.
16
Autoregressive process, AR(p)
First-order, AR(1): Markov. From (*), a linear process. For convergence/stationarity: root of φ(z) = 0 in |z| > 1.
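A sketch of an AR(1) simulation with an assumed coefficient α = 0.7, checking the root condition and the lag-1 correlation:

```python
import numpy as np

alpha = 0.7                               # assumed AR(1) coefficient, |alpha| < 1
# phi(z) = 1 - alpha*z; its root z = 1/alpha lies in |z| > 1, so the process is stationary
print("root of phi(z):", 1 / alpha)

rng = np.random.default_rng(4)
N = 1000
x = np.zeros(N)
z = rng.normal(size=N)
for t in range(1, N):
    x[t] = alpha * x[t - 1] + z[t]        # X_t = alpha X_{t-1} + Z_t
print("sample lag-1 autocorrelation, roughly alpha:", np.corrcoef(x[:-1], x[1:])[0, 1])
```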
17
a.c.f.: from (*). The p.a.c.f. vanishes for k > p.
18
In the general case. Very useful for prediction.
19
ARMA(p,q). Roots of φ(z) = 0 in |z| > 1 for stationarity; roots of θ(z) = 0 in |z| > 1 for invertibility.
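A sketch that checks the stationarity and invertibility conditions numerically with numpy's root finder; the ARMA(2,1) coefficients are assumed for illustration:

```python
import numpy as np

# assumed coefficients: phi(z) = 1 - 0.5z - 0.3z^2, theta(z) = 1 + 0.4z
phi = [1.0, -0.5, -0.3]        # coefficients of phi(z) in increasing powers of z
theta = [1.0, 0.4]

def roots_outside_unit_circle(coeffs):
    """np.roots wants the highest power first, so reverse the increasing-power list."""
    r = np.roots(coeffs[::-1])
    return bool(np.all(np.abs(r) > 1))

print("stationary:", roots_outside_unit_circle(phi))     # True
print("invertible:", roots_outside_unit_circle(theta))   # True
```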
20
ARIMA(p,d,q).
21
Yule-Walker equations for AR(p).
Correlate each side of the defining equation with X_{t-k}. For AR(1): ρ(k) = α^k, k ≥ 0.
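A sketch solving the Yule-Walker equations for the AR coefficients from given autocorrelations (the helper name and the example values are my own):

```python
import numpy as np

def yule_walker(r):
    """Solve the Yule-Walker equations for AR(p) coefficients.
    r: autocorrelations [r_1, ..., r_p]; r_0 = 1 is implied."""
    p = len(r)
    rho = np.concatenate(([1.0], r))            # rho_0, ..., rho_p
    R = np.array([[rho[abs(i - j)] for j in range(p)] for i in range(p)])
    return np.linalg.solve(R, rho[1:p + 1])     # alpha_1, ..., alpha_p

# example: a true AR(1) with alpha = 0.6 has rho_k = 0.6^k
print(yule_walker(np.array([0.6, 0.36])))       # roughly [0.6, 0.0]
```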
22
Stat 153 - 23 Sept 2008 D. R. Brillinger
Chapter 4 - Fitting time series models in the time domain. Sample autocovariance coefficient. Under stationarity, ...
23
Estimated autocorrelation coefficient
asymptotically normal interpretation
24
Estimating the mean. Var(x̄) can be bigger or smaller than σ²/N.
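A Monte Carlo sketch (assumed AR(1) coefficients ±0.5) illustrating that the variance of the sample mean exceeds Var(X)/N under positive autocorrelation and falls below it under negative autocorrelation:

```python
import numpy as np

def compare(alpha, N=50, reps=10000, seed=5):
    """Return (Var(xbar), Var(X)/N) for an AR(1) series X_t = alpha X_{t-1} + Z_t."""
    rng = np.random.default_rng(seed)
    means, variances = np.empty(reps), np.empty(reps)
    for i in range(reps):
        z = rng.normal(size=N)
        x = np.empty(N)
        x[0] = z[0]
        for t in range(1, N):
            x[t] = alpha * x[t - 1] + z[t]
        means[i], variances[i] = x.mean(), x.var()
    return means.var(), variances.mean() / N

print(compare(+0.5))   # Var(xbar) bigger than Var(X)/N under positive autocorrelation
print(compare(-0.5))   # Var(xbar) smaller than Var(X)/N under negative autocorrelation
```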
25
Fitting an autoregressive model, AR(p)
Easy. Remember regression and least squares normal equations
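A sketch of the least squares fit: regress x_t on x_{t-1}, ..., x_{t-p}; the simulated AR(2) coefficients are assumed for illustration:

```python
import numpy as np

def fit_ar_ls(x, p):
    """Least squares fit of AR(p): regress x_t on (1, x_{t-1}, ..., x_{t-p})."""
    x = np.asarray(x, dtype=float)
    y = x[p:]
    X = np.column_stack([np.ones(y.size)] + [x[p - j:-j] for j in range(1, p + 1)])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef            # [intercept, alpha_1, ..., alpha_p]

# example on a simulated AR(2) series with assumed coefficients 0.5 and 0.3
rng = np.random.default_rng(6)
x = np.zeros(2000)
for t in range(2, x.size):
    x[t] = 0.5 * x[t - 1] + 0.3 * x[t - 2] + rng.normal()
print(fit_ar_ls(x, 2))     # roughly [0, 0.5, 0.3]
```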
26
AR(1) Cp.
27
Seasonal ARIMA. Seasonal parameter s.
SARIMA(p,d,q)×(P,D,Q)_s. Example.
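A sketch of the notation in code, assuming the statsmodels package is available; the model orders and the simulated monthly series are illustrative only:

```python
import numpy as np
from statsmodels.tsa.statespace.sarimax import SARIMAX

# illustrative monthly series with trend and seasonality (assumed, not course data)
rng = np.random.default_rng(7)
t = np.arange(144)
y = 0.1 * t + 3 * np.sin(2 * np.pi * t / 12) + rng.normal(size=t.size)

# SARIMA(1,1,1)x(1,1,1)_12 -- orders chosen only to illustrate the notation
model = SARIMAX(y, order=(1, 1, 1), seasonal_order=(1, 1, 1, 12))
result = model.fit(disp=False)
print(result.summary())
```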
28
Residual analysis. Paradigm: observation = fitted value + residual. The parametric models have contained Z_t.
29
Portmanteau lack-of-fit statistic
ARMA(p,q) appropriate?
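A sketch of the Box-Pierce form of the portmanteau statistic, Q = N Σ_{k=1}^{K} r_k² on the residuals, approximately χ² with K − p − q degrees of freedom when the fitted ARMA(p,q) is appropriate; K = 20 and the white-noise residuals are assumptions for illustration:

```python
import numpy as np

def box_pierce(residuals, K=20):
    """Portmanteau statistic Q = N * sum_{k=1}^{K} r_k^2 on the residual series."""
    z = np.asarray(residuals, dtype=float)
    N = z.size
    zbar = z.mean()
    denom = np.sum((z - zbar) ** 2)
    r = np.array([np.sum((z[:N - k] - zbar) * (z[k:] - zbar)) / denom
                  for k in range(1, K + 1)])
    return N * np.sum(r ** 2)

# white-noise residuals: Q should be near the chi^2_K mean
z = np.random.default_rng(8).normal(size=500)
print(box_pierce(z))     # roughly 20 for K = 20 on pure noise
```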