Stat 153 - 25 Sept 2008 - D. R. Brillinger. Chapter 5 - Forecasting. Given data x_1, ..., x_N, what about x_{N+h}, h > 0? No single forecasting method is universally applicable; all are forms of extrapolation.


A forecast is an extrapolation: a conditional statement about possible scenarios.

Conditional expected value, E(Y|X); X can be vector-valued. Here Y = X_{N+h} and X = (X_1, ..., X_N).
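In the jointly Gaussian case the conditional mean has a closed form, E(Y|X=x) = mu_Y + Sigma_YX Sigma_XX^{-1} (x - mu_X), with X vector-valued. A minimal numpy sketch (the slides work in R; all numbers below are made up for illustration):

```python
import numpy as np

# Illustrative (assumed) means and covariances for a Gaussian (X, Y),
# with X two-dimensional.
mu_X = np.array([0.0, 0.0])
mu_Y = 1.0
Sigma_XX = np.array([[2.0, 0.5],
                     [0.5, 1.0]])
Sigma_YX = np.array([1.0, 0.25])

def cond_mean(x):
    # Solve Sigma_XX w = Sigma_YX for the regression weights w,
    # then shift by the means: E(Y | X = x) = mu_Y + w @ (x - mu_X).
    w = np.linalg.solve(Sigma_XX, Sigma_YX)
    return mu_Y + w @ (x - mu_X)
```

For these numbers the weights come out to (0.5, 0), so only the first coordinate of X carries information about Y.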

Multiple regression: fit by least squares, lm() in R; then examine the residuals.
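The slides use R's lm(); the same least-squares fit and residuals can be sketched with numpy on synthetic data (chosen here so the true coefficients are known):

```python
import numpy as np

# Synthetic data: y = 1 + 2*x1 - x2 exactly, so the fit should recover
# the coefficients and leave residuals of essentially zero.
x1 = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
x2 = np.array([1.0, 0.0, 1.0, 0.0, 1.0])
y = 1.0 + 2.0 * x1 - x2

X = np.column_stack([np.ones_like(x1), x1, x2])  # design matrix
beta, *_ = np.linalg.lstsq(X, y, rcond=None)     # least-squares fit
residuals = y - X @ beta                         # examine these
```

With real data the residuals are the diagnostic: plot them against time and fitted values before trusting the fit.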

Linear prediction: choose c's to minimize E(X_{N+h} - c_0 X_N - ... - c_{N-1} X_1)^2. This linear predictor equals the conditional expected value when {X_t} is Gaussian/normal.
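The minimizing c's solve normal (Yule-Walker type) equations in the autocovariances: Gamma c = g with Gamma[i,j] = gamma(|i-j|) and g[i] = gamma(h+i). A numpy illustration under an assumed AR(1) autocovariance gamma(k) = phi^|k| (up to scale), where the solution should put all weight on the most recent value:

```python
import numpy as np

phi, N, h = 0.6, 4, 1
gamma = lambda k: phi ** abs(k)  # assumed AR(1) autocovariance shape

# Normal equations: Gamma[i, j] = Cov(X_{N-i}, X_{N-j}),
#                   g[i]        = Cov(X_{N+h}, X_{N-i}).
Gamma = np.array([[gamma(i - j) for j in range(N)] for i in range(N)])
g = np.array([gamma(h + i) for i in range(N)])
c = np.linalg.solve(Gamma, g)
# For an AR(1), only the newest value matters: c = (phi, 0, 0, 0).
```
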

Prediction/forecast error. Represent the series as a linear process, X_t = sum_j psi_j e_{t-j}; the h-step-ahead error is then sum_{j=0}^{h-1} psi_j e_{N+h-j}.
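That representation gives the h-step error variance sigma^2 (psi_0^2 + ... + psi_{h-1}^2). A small numpy check, assuming AR(1) weights psi_j = phi^j, against the AR(1) closed form sigma^2 (1 - phi^{2h}) / (1 - phi^2):

```python
import numpy as np

phi, sigma2, h = 0.6, 1.0, 3          # assumed AR(1) parameters
psi = phi ** np.arange(h)             # psi_j = phi**j for an AR(1)
err_var = sigma2 * np.sum(psi ** 2)   # sum of the first h squared weights

# Closed form for the AR(1) h-step forecast error variance.
closed = sigma2 * (1 - phi ** (2 * h)) / (1 - phi ** 2)
```
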

Box-Jenkins model: arima(p,d,q). Stages of the Box-Jenkins forecasting model: (1) Model identification: which p, d, q? (2) Estimation: arima(). (3) Diagnostic checking: residuals. (4) Consideration of alternative models, if necessary. Compare the scientific method.
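A toy numpy pass at stages (1)-(3), as a crude stand-in for R's arima() (real identification would also inspect the ACF/PACF, and diagnostics would examine the residuals): simulate an AR(1), fit AR(p) by least squares for several candidate orders, and compare an AIC-style score.

```python
import numpy as np

rng = np.random.default_rng(0)
n, phi = 400, 0.7
x = np.zeros(n)
for t in range(1, n):                   # simulate an AR(1) series
    x[t] = phi * x[t - 1] + rng.standard_normal()

def fit_ar(x, p):
    # Regress x[t] on its p most recent lags by least squares.
    y = x[p:]
    X = np.column_stack([x[p - k : len(x) - k] for k in range(1, p + 1)])
    a, *_ = np.linalg.lstsq(X, y, rcond=None)
    return a, y - X @ a

scores = {}
for p in (1, 2, 3):
    a, resid = fit_ar(x, p)
    # AIC-style score: n * log(residual variance) + 2 * (parameter count)
    scores[p] = len(resid) * np.log(resid @ resid / len(resid)) + 2 * p
```

The lag-1 coefficient for p = 1 should land near the true phi = 0.7; the score penalizes the extra parameters of the larger models.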

Pertinent R functions: arima(), tsdiag(), predict() (applied to the output of arima); see help("predict.Arima").
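predict() on an arima() fit produces point forecasts by recursing the fitted model forward. A numpy sketch of that recursion for a pure AR model (coefficients here are assumed for illustration, not estimated):

```python
import numpy as np

a = np.array([0.6, -0.2])   # illustrative AR(2) coefficients
history = [1.0, 0.5]        # most recent observation first: x_N, x_{N-1}

def forecast(a, history, h):
    # Each step applies the AR recursion to the newest available values,
    # feeding each forecast back in as the next step's "observation".
    vals = list(history)
    out = []
    for _ in range(h):
        nxt = a @ np.array(vals[: len(a)])
        out.append(nxt)
        vals.insert(0, nxt)
    return out
```

For these numbers, the one-step forecast is 0.6(1.0) - 0.2(0.5) = 0.5, and the two-step forecast reuses it: 0.6(0.5) - 0.2(1.0) = 0.1.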