Econometrics
Chengyuan Yin, School of Mathematics

10. Prediction in the Classical Regression Model

Forecasting
Objective: forecast.
Distinction: ex post vs. ex ante forecasting.
Ex post: the RHS data are observed.
Ex ante: the RHS data must themselves be forecasted.
Prediction vs. model validation: within-sample prediction vs. prediction in a “hold-out sample” (see the sketch after this slide).
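A minimal sketch of hold-out validation, in Python with simulated data; the 80/20 split and every name here (X, y, n_holdout) are illustrative assumptions, not from the slides.

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 100, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, k - 1))])
y = X @ np.array([1.0, 0.5, -2.0]) + rng.normal(size=n)

# Fit on the first 80 observations; predict the held-out last 20.
# This is ex post prediction: the held-out X's are observed, only y is "unknown".
n_holdout = 20
X_fit, y_fit = X[:-n_holdout], y[:-n_holdout]
X_out, y_out = X[-n_holdout:], y[-n_holdout:]

b, *_ = np.linalg.lstsq(X_fit, y_fit, rcond=None)
rmse = np.sqrt(np.mean((y_out - X_out @ b) ** 2))
print(f"hold-out RMSE: {rmse:.3f}")
```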

Prediction Intervals
Given x0, predict y0. Two cases:
Estimate E[y0|x0] = x0′β; predict y0 = x0′β + ε0.
Obvious predictor: b′x0 + an estimate of ε0. Forecast ε0 as 0, but allow for its variance.
When we predict y0 with b′x0, what is the forecast error? Est. y0 − y0 = b′x0 − x0′β − ε0, so the variance of the forecast error is x0′Var[b − β]x0 + σ².
How do we estimate this? Form a confidence interval. Two cases:
If x0 is a vector of constants, the variance is just x0′Var[b]x0. Form the confidence interval as usual.
If x0 had to be estimated, then we are using a random variable. What is the variance of the product? (Ouch!) One possibility: use bootstrapping.
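For the first case (x0 a vector of constants), the interval can be computed directly. A hedged sketch with simulated data; the variable names and the 95% level are our choices, not the slides’.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n = 50
X = np.column_stack([np.ones(n), rng.normal(size=n)])
y = X @ np.array([2.0, 1.5]) + rng.normal(size=n)

b, *_ = np.linalg.lstsq(X, y, rcond=None)
e = y - X @ b
s2 = e @ e / (n - X.shape[1])            # estimate of sigma^2
XtX_inv = np.linalg.inv(X.T @ X)

x0 = np.array([1.0, 0.8])                # the fixed point x0
y0_hat = x0 @ b                          # point forecast b'x0
# forecast-error variance: s2 * (1 + x0'(X'X)^{-1} x0)
se_f = np.sqrt(s2 * (1.0 + x0 @ XtX_inv @ x0))
t = stats.t.ppf(0.975, df=n - X.shape[1])
print(f"95% prediction interval: {y0_hat:.3f} +/- {t * se_f:.3f}")
```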

Forecast Variance
The variance of the forecast error is σ² + x0′Var[b]x0 = σ² + σ²[x0′(X′X)⁻¹x0].
If the model contains a constant term, this can be written in terms of squares and cross products of deviations from means; for simple regression, Var[e0] = σ²[1 + 1/n + (x0 − x̄)² / Σi(xi − x̄)²].
Interpretation: the forecast variance is smallest in the middle of our “experience” and increases as we move outside it.
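A small numerical check of the interpretation, using the simple-regression form of the formula on simulated data (all names illustrative): the variance is smallest at x̄ and grows with (x0 − x̄)².

```python
import numpy as np

rng = np.random.default_rng(2)
n = 40
x = rng.uniform(0, 10, size=n)
y = 3.0 + 0.7 * x + rng.normal(size=n)

X = np.column_stack([np.ones(n), x])
b, *_ = np.linalg.lstsq(X, y, rcond=None)
s2 = np.sum((y - X @ b) ** 2) / (n - 2)
xbar = x.mean()
Sxx = np.sum((x - xbar) ** 2)

# Forecast variance at the mean and at points increasingly far from it.
for x0 in (xbar, xbar + 2.0, xbar + 5.0):
    v = s2 * (1.0 + 1.0 / n + (x0 - xbar) ** 2 / Sxx)
    print(f"x0 = {x0:5.2f}  forecast variance = {v:.4f}")
```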

Butterfly Effect (Figure 5.1 in the 6th edition: the forecast interval fans out away from the center of the data).

Salkever’s Algebraic Trick
Salkever’s method of computing the forecasts and forecast variances: stack the forecast observations beneath the sample, set y = 0 in those rows, and add one dummy variable per forecast observation. The multiple regression of y* = (y′, 0′)′ on X* = [X, 0; X0, −I] produces the least squares coefficient vector followed by the predictions. Residuals are 0 for the predictions, so s²(X*′X*)⁻¹ gives the covariance matrix for the coefficient estimates and the variances for the forecasts. (Very clever, and useful for understanding; not actually used in modern software.)
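A sketch of the trick on simulated data, under the sign convention above (−I in the dummy block, so the dummy coefficients equal the predictions themselves); the setup and names are our assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)
n, k, n0 = 60, 3, 2                      # sample size, regressors, forecast periods
X = np.column_stack([np.ones(n), rng.normal(size=(n, k - 1))])
y = X @ np.array([1.0, -0.5, 2.0]) + rng.normal(size=n)
X0 = np.column_stack([np.ones(n0), rng.normal(size=(n0, k - 1))])  # forecast X's

# Augmented data: y* = [y; 0], X* = [[X, 0], [X0, -I]]
y_star = np.concatenate([y, np.zeros(n0)])
X_star = np.block([[X, np.zeros((n, n0))], [X0, -np.eye(n0)]])

g, *_ = np.linalg.lstsq(X_star, y_star, rcond=None)
b, preds = g[:k], g[k:]                  # coefficients first, then the forecasts
assert np.allclose(preds, X0 @ b)        # dummy coefficients = predictions

e = y_star - X_star @ g                  # residuals are exactly 0 in forecast rows
s2 = e @ e / (n - k)                     # the n0 dummies absorb the n0 extra rows
V = s2 * np.linalg.inv(X_star.T @ X_star)
print("forecast standard errors:", np.sqrt(np.diag(V)[k:]))
```

The printed standard errors reproduce s·√(1 + x0′(X′X)⁻¹x0) from the forecast-variance slide, one augmented regression doing all the work.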

Dummy Variable for One Observation
A dummy variable that isolates a single observation. What does this do?
Define d to be the dummy variable in question and Z the set of all other regressors, so X = [Z, d]. Run the multiple regression of y on X. We know that X′e = 0, where e is the column vector of residuals. That means d′e = 0, which says that ej = 0 for that particular observation: the dummy forces its own residual to zero. A fairly important result, worth knowing.
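A quick numerical illustration on simulated data (names are ours); it also checks the closely related fact, added here for context, that the remaining coefficients equal OLS with observation j deleted.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 30
Z = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])
y = Z @ np.array([1.0, 2.0, -1.0]) + rng.normal(size=n)

j = 7                                    # arbitrary observation to isolate
d = np.zeros(n)
d[j] = 1.0                               # dummy that is 1 only for observation j
X = np.column_stack([Z, d])

g, *_ = np.linalg.lstsq(X, y, rcond=None)
e = y - X @ g
print(f"residual for observation {j}: {e[j]:.2e}")   # numerically zero

# The other coefficients match OLS with observation j dropped.
mask = np.arange(n) != j
b_drop, *_ = np.linalg.lstsq(Z[mask], y[mask], rcond=None)
assert np.allclose(g[:-1], b_drop)
```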

Oaxaca Decomposition
Two groups, two regression models (two time periods, men vs. women, two countries, etc.):
y1 = X1β1 + ε1 and y2 = X2β2 + ε2
Consider the mean values:
y1* = E[y1|mean x1] = x1*′β1
y2* = E[y2|mean x2] = x2*′β2
Now explain why y1* is different from y2* (i.e., starting from y2, why is y1 different?). (We could reverse the roles of groups 1 and 2.)
y1* − y2* = x1*′β1 − x2*′β2 = x1*′(β1 − β2) + (x1* − x2*)′β2
(change in model) (change in conditions)
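A sketch of the computation for two simulated groups (data and names are illustrative); the identity at the end is exactly the slide’s split into a “change in model” term and a “change in conditions” term.

```python
import numpy as np

rng = np.random.default_rng(5)

def ols(X, y):
    b, *_ = np.linalg.lstsq(X, y, rcond=None)
    return b

n1, n2 = 200, 200
X1 = np.column_stack([np.ones(n1), rng.normal(1.0, 1.0, size=n1)])
X2 = np.column_stack([np.ones(n2), rng.normal(0.5, 1.0, size=n2)])
y1 = X1 @ np.array([1.0, 2.0]) + rng.normal(size=n1)
y2 = X2 @ np.array([0.5, 1.5]) + rng.normal(size=n2)

b1, b2 = ols(X1, y1), ols(X2, y2)
x1bar, x2bar = X1.mean(axis=0), X2.mean(axis=0)

gap = x1bar @ b1 - x2bar @ b2                 # y1* - y2*
model_part = x1bar @ (b1 - b2)                # change in model
conditions_part = (x1bar - x2bar) @ b2        # change in conditions
assert np.isclose(gap, model_part + conditions_part)
print(f"gap {gap:.3f} = model {model_part:.3f} + conditions {conditions_part:.3f}")
```

Since each regression includes a constant, x̄′b equals the group mean of y, so `gap` is the raw difference in mean outcomes being decomposed.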
