Class 4: Ordinary Least Squares. SKEMA Ph.D. programme 2010-2011. Lionel Nesta, Observatoire Français des Conjonctures Economiques



Introduction to Regression

- Ideally, the social scientist wants not only to know the intensity of a relationship, but also to quantify how much one variable changes when another variable changes by one unit.
- Regression analysis is a technique that examines the relation of a dependent variable to one or more independent (explanatory) variables.
- Simple regression: y = f(X)
- Multiple regression: y = f(X, Z)
- Let us start with simple regressions.

Scatter Plot of Fertilizer and Production

Objective of Regression

- It is time to ask: "What is a good fit?"
- "A good fit is one that makes the errors small."
- "The best fit is one that makes the errors smallest."
- Three candidate criteria:
  1. Minimize the sum of all errors
  2. Minimize the sum of the absolute values of the errors
  3. Minimize the sum of squared errors
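The three candidates can be compared on a two-point toy example (the data values are made up for illustration); it shows why the plain sum of errors fails as a criterion:

```python
# Toy illustration: two errors of opposite sign cancel out in the
# plain sum, even though the fit is clearly poor.
y_observed = [5.0, 1.0]
y_fitted = [3.0, 3.0]   # a flat line through the "middle" of the data

errors = [obs - fit for obs, fit in zip(y_observed, y_fitted)]

sum_errors = sum(errors)                    # cancels to 0.0, hiding the misfit
sum_abs = sum(abs(e) for e in errors)       # 4.0
sum_squared = sum(e * e for e in errors)    # 8.0

print(sum_errors, sum_abs, sum_squared)
```

The sum of errors is zero here despite both points missing the line by 2 units, which is exactly the "problem of sign" shown on the next slide.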

To minimize the sum of all errors

Problem of sign: positive and negative errors cancel out, so very different lines can produce the same (even zero) sum of errors. [Scatter plots of Y against X omitted.]

To minimize the sum of absolute values of errors

Problem of the middle point: different lines can yield the same sum of absolute errors (e.g. a single error of +3 versus errors of -1 and +2). [Scatter plots of Y against X omitted.]

To minimize the sum of squared errors

Squaring the errors solves both problems. [Scatter plot of Y against X omitted.]

To minimize the sum of squared errors

Minimizing Σε²:
- overcomes the sign problem;
- goes through the middle point;
- squaring emphasizes large errors;
- is easily manageable;
- has a unique minimum;
- has a unique (and best) solution.

Scatter Plot of Fertilizer and Production

Scatter Plot of R&D and Patents (log)

The Simple Regression Model

y_i = α + β·x_i + ε_i

- y_i : dependent variable (to be explained)
- x_i : independent variable (explanatory)
- α : first parameter of interest
- β : second parameter of interest
- ε_i : error term

The Simple Regression Model

To minimize the sum of squared errors

Choose α and β to minimize Σ ε_i² = Σ (y_i − α − β·x_i)²

Setting the derivatives with respect to α and β to zero yields the least-squares estimates:

β̂ = Σ(x_i − x̄)(y_i − ȳ) / Σ(x_i − x̄)²   and   α̂ = ȳ − β̂·x̄
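A minimal sketch of the closed-form least-squares solution in Python, on made-up data (the values are illustrative, not from the slides):

```python
# Closed-form OLS for simple regression:
# beta = sum((x - x_bar)(y - y_bar)) / sum((x - x_bar)^2),  alpha = y_bar - beta * x_bar
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 3.9, 6.2, 8.1, 9.8]   # invented data, roughly y = 2x

n = len(x)
x_bar = sum(x) / n
y_bar = sum(y) / n

s_xy = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y))
s_xx = sum((xi - x_bar) ** 2 for xi in x)

beta = s_xy / s_xx
alpha = y_bar - beta * x_bar
print(alpha, beta)   # roughly 0.14 and 1.96
```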

Application to SKEMA_BIO Data using Excel

Interpretation

- When the log of R&D (per asset) increases by one unit, the log of patents per asset increases by 1.748.
- Remember: a change in the log of x is a relative change of x itself.
- A 1% increase in R&D (per asset) entails a 1.748% increase in the number of patents (per asset).
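The elasticity reading of a log-log coefficient can be checked numerically. This sketch assumes only the 1.748 estimate from the slide; everything else is illustrative:

```python
beta = 1.748   # estimated log-log coefficient from the slide

# In a log-log model ln(y) = alpha + beta * ln(x), y scales as x**beta,
# so a 1% increase in x multiplies y by 1.01**beta.
ratio = 1.01 ** beta
pct_change_y = (ratio - 1) * 100
print(pct_change_y)   # close to 1.748 for a small change in x
```

The approximation "1% in x gives beta% in y" is exact only in the limit of infinitesimal changes; for a 1% step the discrepancy is negligible.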

OLS with STATA

Stata instruction: regress (reg)

reg y x1 x2 x3 … xk [if] [weight] [, options]

- noconstant : suppresses the constant term
- robust : estimates robust variances, even under heteroskedasticity
- [if] : selects observations
- [weight] : weighted least squares

Application to Data using STATA

reg lpat_assets lrdi
predict newvar, [type]

where type requests residuals or predictions.

Assessing the Goodness of Fit

- It is important to ask whether a specification provides a good prediction of the dependent variable, given values of the independent variable.
- Ideally, we want an indicator of the proportion of variance of the dependent variable that is accounted for, or explained, by the statistical model.
- This is based on the variance of the predictions (ŷ) and the variance of the residuals (ε), since by construction the two sum to the overall variance of the dependent variable (y).

Overall Variance

Decomposing the overall variance (1)

Decomposing the overall variance (2)

Coefficient of determination R²

- R² is a statistic which provides information on the goodness of fit of the model.
- It is the share of the overall variance of y explained by the model: R² = Var(ŷ) / Var(y) = SS_fit / SS_total.
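A short sketch of the variance decomposition behind R², again on invented data (not the SKEMA_BIO dataset):

```python
# R² as the share of variance explained: SS_fit / SS_total,
# where SS_total = SS_fit + SS_residual by construction of OLS.
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 3.9, 6.2, 8.1, 9.8]   # invented data

n = len(x)
x_bar, y_bar = sum(x) / n, sum(y) / n
beta = (sum((a - x_bar) * (b - y_bar) for a, b in zip(x, y))
        / sum((a - x_bar) ** 2 for a in x))
alpha = y_bar - beta * x_bar
y_hat = [alpha + beta * a for a in x]

ss_total = sum((b - y_bar) ** 2 for b in y)
ss_fit = sum((f - y_bar) ** 2 for f in y_hat)
ss_res = sum((b - f) ** 2 for b, f in zip(y, y_hat))

r2 = ss_fit / ss_total
print(r2)   # also equals 1 - ss_res / ss_total
```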

Fisher's F Statistic

- Fisher's statistic is relevant as a form of ANOVA on SS_fit, which tells us whether the regression model brings significant (in a statistical sense) information.

Model      SS        df         MSS                 F
Fitted     SS_fit    p          SS_fit / p          MSS_fit / MSS_res
Residual   SS_res    N−p−1      SS_res / (N−p−1)
Total      SS_tot    N−1

p: number of parameters; N: number of observations.
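The ANOVA table can be filled in for a toy regression; the data are invented for illustration:

```python
# Fisher's F for a toy simple regression: F = MSS_fit / MSS_residual,
# with p = 1 slope parameter and N - p - 1 residual degrees of freedom.
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 3.9, 6.2, 8.1, 9.8]   # invented data
n = len(x)
x_bar, y_bar = sum(x) / n, sum(y) / n
beta = (sum((a - x_bar) * (b - y_bar) for a, b in zip(x, y))
        / sum((a - x_bar) ** 2 for a in x))
alpha = y_bar - beta * x_bar
y_hat = [alpha + beta * a for a in x]

ss_fit = sum((f - y_bar) ** 2 for f in y_hat)
ss_res = sum((b - f) ** 2 for b, f in zip(y, y_hat))

p = 1                          # number of slope parameters
df_fit, df_res = p, n - p - 1
mss_fit = ss_fit / df_fit
mss_res = ss_res / df_res
F = mss_fit / mss_res
print(F)   # a large F indicates the regression is jointly significant
```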

STATA output

What the R² is not

R² does not indicate whether:
- the independent variables are a true cause of the changes in the dependent variable;
- the correct regression was used;
- the most appropriate set of independent variables has been chosen;
- there is co-linearity present in the data;
- the model could be improved by using transformed versions of the existing set of independent variables.

Inference on β

- We have estimated β̂.
- Therefore we must test whether the estimated parameter is significantly different from 0 and, by way of consequence, say something about the distribution (the mean and variance) of the true but unobserved β*.

The mean and variance of β

- It is possible to show that β̂ is a good approximation, i.e. an unbiased estimator, of the true parameter β*.
- The variance of β̂ is the ratio of the mean square of errors over the sum of squares of the explanatory variable: Var(β̂) = MS_residual / Σ(x_i − x̄)².

The confidence interval of β

- We must now define the confidence interval of β at 95%. To do so, we use the mean and variance of β̂ and define the t value as t = β̂ / se(β̂).
- Therefore, the 95% confidence interval of β is β̂ ± t_{α/2} · se(β̂).
- If the 95% CI does not include 0, then β is significantly different from 0.
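The t value and 95% CI can be computed by hand on invented data; the critical value 3.182 is the standard two-sided 5% point of the Student t distribution with 3 degrees of freedom, taken from a t table:

```python
import math

# Standard error, t value, and 95% CI for beta in a toy simple regression.
# se(beta) = sqrt( MS_residual / sum((x - x_bar)^2) ), t = beta / se(beta).
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 3.9, 6.2, 8.1, 9.8]   # invented data
n = len(x)
x_bar, y_bar = sum(x) / n, sum(y) / n
s_xx = sum((a - x_bar) ** 2 for a in x)
beta = sum((a - x_bar) * (b - y_bar) for a, b in zip(x, y)) / s_xx
alpha = y_bar - beta * x_bar

ss_res = sum((b - (alpha + beta * a)) ** 2 for a, b in zip(x, y))
ms_res = ss_res / (n - 2)              # two parameters (alpha, beta) estimated
se_beta = math.sqrt(ms_res / s_xx)
t_value = beta / se_beta

t_crit = 3.182                         # two-sided 5% critical value, 3 d.f.
ci_low, ci_high = beta - t_crit * se_beta, beta + t_crit * se_beta
print(t_value, (ci_low, ci_high))      # CI excludes 0: beta is significant
```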

Student t Test for β

- We are also in a position to infer on β:
  - H0: β* = 0
  - H1: β* ≠ 0
- Decision rule:
  - Accept H0 if |t| < t_{α/2}
  - Reject H0 if |t| ≥ t_{α/2}

STATA output