Linear Regression with One Regressor


Chapter 4: Linear Regression with One Regressor

Linear Regression with One Regressor (SW Chapter 4) Linear regression allows us to estimate, and make inferences about, population slope coefficients. Ultimately our aim is to estimate the causal effect on Y of a unit change in X – but for now, just think of the problem of fitting a straight line to data on two variables, Y and X.

The problems of statistical inference for linear regression are, at a general level, the same as for estimation of the mean or of the difference between two means. Statistical, or econometric, inference about the slope entails: Estimation: How should we draw a line through the data to estimate the population slope? (Answer: ordinary least squares.) What are the advantages and disadvantages of OLS? Hypothesis testing: How do we test whether the slope is zero? Confidence intervals: How do we construct a confidence interval for the slope?
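As a concrete illustration of the estimation step, here is a minimal numpy sketch that fits a line by OLS. The data are simulated to mimic the test-score/class-size setting discussed later; the numbers are made up, not the actual California dataset.

```python
import numpy as np

# Simulated (illustrative) data: class size X and test scores Y.
rng = np.random.default_rng(0)
n = 100
X = rng.uniform(15, 30, n)            # student-teacher ratio
u = rng.normal(0, 10, n)              # regression error
Y = 700 - 2.0 * X + u                 # population line: beta0 = 700, beta1 = -2

# OLS: slope = sample cov(X, Y) / sample var(X); intercept from the means.
beta1_hat = np.cov(X, Y, ddof=1)[0, 1] / np.var(X, ddof=1)
beta0_hat = Y.mean() - beta1_hat * X.mean()
print(beta1_hat, beta0_hat)           # close to the true values -2 and 700
```

With this sample size the estimates land near the true coefficients; the rest of the chapter is about quantifying how close.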

Linear Regression: Some Notation and Terminology (SW Section 4.1)

The Population Linear Regression Model – general notation

This terminology in a picture: Observations on Y and X; the population regression line; and the regression error (the “error term”):

The Ordinary Least Squares Estimator (SW Section 4.2)

Mechanics of OLS

The OLS estimator solves:
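Written out, the minimization problem on this slide and its standard closed-form solution are:

```latex
\min_{b_0,\,b_1} \sum_{i=1}^{n} \left(Y_i - b_0 - b_1 X_i\right)^2
\qquad\Longrightarrow\qquad
\hat\beta_1 = \frac{\sum_{i=1}^{n} (X_i - \bar X)(Y_i - \bar Y)}{\sum_{i=1}^{n} (X_i - \bar X)^2},
\quad
\hat\beta_0 = \bar Y - \hat\beta_1 \bar X
```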

Application to the California Test Score – Class Size data

Interpretation of the estimated slope and intercept

Predicted values & residuals:
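Given the OLS estimates, predicted values and residuals follow directly. A small numpy sketch on toy (made-up) data, checking a defining algebraic property of OLS with an intercept: residuals sum to zero and are uncorrelated with X.

```python
import numpy as np

# Toy data (illustrative, not the California dataset).
X = np.array([18.0, 20.0, 22.0, 25.0])
Y = np.array([660.0, 655.0, 650.0, 640.0])

beta1_hat = np.cov(X, Y, ddof=1)[0, 1] / np.var(X, ddof=1)
beta0_hat = Y.mean() - beta1_hat * X.mean()

Y_hat = beta0_hat + beta1_hat * X     # predicted values
u_hat = Y - Y_hat                     # residuals
print(u_hat.sum())                    # numerically zero by construction
```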

OLS regression: STATA output

Measures of Fit (Section 4.3)

The Standard Error of the Regression (SER)

Example of the R2 and the SER
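The two measures of fit can be computed by hand from the residuals; a numpy sketch on toy data (the values are made up for illustration). Note the SER divides by n − 2 because two coefficients were estimated.

```python
import numpy as np

# Toy data, nearly linear.
X = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
Y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

b1 = np.cov(X, Y, ddof=1)[0, 1] / np.var(X, ddof=1)
b0 = Y.mean() - b1 * X.mean()
u_hat = Y - (b0 + b1 * X)

n = len(Y)
SSR = (u_hat ** 2).sum()              # sum of squared residuals
TSS = ((Y - Y.mean()) ** 2).sum()     # total sum of squares
R2 = 1 - SSR / TSS                    # fraction of variance of Y explained
SER = np.sqrt(SSR / (n - 2))          # std. error of the regression
print(R2, SER)
```

For this nearly linear data, R² is close to 1 and the SER is small in the units of Y.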

The Least Squares Assumptions (SW Section 4.4)

The Least Squares Assumptions

Least squares assumption #1: E(u|X = x) = 0.

Least squares assumption #1, ctd.

Least squares assumption #2: (Xi,Yi), i = 1,…,n are i.i.d.

Least squares assumption #3: Large outliers are rare Technical statement: E(X⁴) < ∞ and E(Y⁴) < ∞

OLS can be sensitive to an outlier:
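The sensitivity is easy to demonstrate: on a perfect straight line, adding a single extreme observation can even flip the sign of the estimated slope. A numpy sketch with made-up numbers:

```python
import numpy as np

def ols_slope(X, Y):
    # OLS slope: sample cov(X, Y) / sample var(X)
    return np.cov(X, Y, ddof=1)[0, 1] / np.var(X, ddof=1)

# Five points lying exactly on the line Y = X: slope is exactly 1.
X = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
Y = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
slope_clean = ols_slope(X, Y)

# Add one large outlier (X = 10, Y = 0): the fitted slope turns negative.
X2 = np.append(X, 10.0)
Y2 = np.append(Y, 0.0)
slope_out = ols_slope(X2, Y2)
print(slope_clean, slope_out)
```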

The Sampling Distribution of the OLS Estimator (SW Section 4.5)

Probability Framework for Linear Regression

The Sampling Distribution of β̂₁

The mean and variance of the sampling distribution of

Now we can calculate E(β̂₁) and var(β̂₁):

Next calculate var(β̂₁):

What is the sampling distribution of β̂₁?

Large-n approximation to the distribution of β̂₁:

The larger the variance of X, the smaller the variance of β̂₁

Summary of the sampling distribution of β̂₁:
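The summary result (β̂₁ is unbiased and approximately normal in large samples, with variance shrinking in n and in var(X)) can be checked by Monte Carlo simulation. A numpy sketch with an assumed data-generating process:

```python
import numpy as np

# Monte Carlo: draw many samples, estimate the slope each time, and look at
# the mean and spread of the estimates across samples.
rng = np.random.default_rng(2)
beta1 = 2.0                           # true population slope
n, reps = 200, 2000

slopes = np.empty(reps)
for r in range(reps):
    X = rng.normal(0, 1, n)
    u = rng.normal(0, 1, n)
    Y = 1.0 + beta1 * X + u
    slopes[r] = np.cov(X, Y, ddof=1)[0, 1] / np.var(X, ddof=1)

# Mean of estimates is near the true slope; spread matches the
# large-sample formula sd(beta1_hat) ~ 1/sqrt(n * var(X)) ~ 0.071 here.
print(slopes.mean(), slopes.std())
```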

Regression with a Single Regressor: Hypothesis Tests and Confidence Intervals (SW Chapter 5)

But first… a big picture view (and review)

Object of interest: 1 in,

Hypothesis Testing and the Standard Error of β̂₁ (Section 5.1)

Formula for SE(β̂₁)

Summary: To test H₀: β₁ = β₁,₀ vs. H₁: β₁ ≠ β₁,₀,
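The test computes t = (β̂₁ − β₁,₀)/SE(β̂₁) and rejects at the 5% level when |t| > 1.96. A numpy sketch on simulated data, using the heteroskedasticity-robust SE formula (the variance estimator is the robust form, with an n/(n − 2) degrees-of-freedom correction):

```python
import numpy as np

def t_stat(X, Y, beta1_null=0.0):
    # t-statistic for H0: beta1 = beta1_null, heteroskedasticity-robust SE.
    n = len(Y)
    b1 = np.cov(X, Y, ddof=1)[0, 1] / np.var(X, ddof=1)
    b0 = Y.mean() - b1 * X.mean()
    u_hat = Y - b0 - b1 * X
    dx = X - X.mean()
    var_b1 = (dx**2 * u_hat**2).sum() / (dx**2).sum()**2 * n / (n - 2)
    return (b1 - beta1_null) / np.sqrt(var_b1)

rng = np.random.default_rng(3)
X = rng.normal(0, 1, 500)
Y = 1.0 + 0.5 * X + rng.normal(0, 1, 500)   # true slope 0.5

t = t_stat(X, Y)           # test H0: beta1 = 0
print(abs(t) > 1.96)       # H0 is (correctly) rejected here
```

Testing the true value instead (`t_stat(X, Y, 0.5)`) gives a small t-statistic, so that null is not rejected.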

Example: Test Scores and STR, California data

Confidence Intervals for 1 (Section 5.2)

A concise (and conventional) way to report regressions:

OLS regression: reading STATA output

Summary of Statistical Inference about 0 and 1:

Regression when X is Binary (Section 5.3)

Interpreting regressions with a binary regressor

Summary: regression when Xi is binary (0/1)
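The key interpretation here is exact: with a binary regressor D, the OLS slope equals the difference in group means and the intercept equals the mean of the D = 0 group. A numpy check on simulated data:

```python
import numpy as np

# Simulated data with a 0/1 regressor D; group means differ by 3.
rng = np.random.default_rng(4)
D = rng.integers(0, 2, 200).astype(float)
Y = 10.0 + 3.0 * D + rng.normal(0, 1, 200)

b1 = np.cov(D, Y, ddof=1)[0, 1] / np.var(D, ddof=1)
b0 = Y.mean() - b1 * D.mean()

# Algebraic identity: slope = mean(Y | D=1) - mean(Y | D=0),
# intercept = mean(Y | D=0).
diff_means = Y[D == 1].mean() - Y[D == 0].mean()
print(b1, diff_means)      # identical up to floating-point error
```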

Heteroskedasticity and Homoskedasticity, and Homoskedasticity-Only Standard Errors (Section 5.4)

Homoskedasticity in a picture:

Heteroskedasticity in a picture:

A real-data example from labor economics: average hourly earnings vs. years of education (data source: Current Population Survey):

The class size data:

So far we have (without saying so) assumed that u might be heteroskedastic.

What if the errors are in fact homoskedastic?

We now have two formulas for standard errors for

Practical implications…

Heteroskedasticity-robust standard errors in STATA
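The difference between the two SE formulas can also be seen without STATA by computing both by hand. A numpy sketch on simulated data where the error variance grows with X (an assumed data-generating process), so the homoskedasticity-only SE is misleading:

```python
import numpy as np

rng = np.random.default_rng(5)
n = 1000
X = rng.uniform(0, 2, n)
u = rng.normal(0, 1, n) * X**2        # heteroskedastic: sd(u|X) grows with X
Y = 1.0 + 2.0 * X + u

b1 = np.cov(X, Y, ddof=1)[0, 1] / np.var(X, ddof=1)
b0 = Y.mean() - b1 * X.mean()
u_hat = Y - b0 - b1 * X
dx = X - X.mean()

# Homoskedasticity-only SE: pools all residuals into one variance estimate.
se_homosk = np.sqrt((u_hat**2).sum() / (n - 2) / (dx**2).sum())
# Heteroskedasticity-robust (White) SE: weights squared residuals by dx^2.
se_robust = np.sqrt((dx**2 * u_hat**2).sum()) / (dx**2).sum()
print(se_homosk, se_robust)           # they differ under heteroskedasticity
```

Here the robust SE is noticeably larger, so homoskedasticity-only inference would overstate precision.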

The bottom line:

Some Additional Theoretical Foundations of OLS (Section 5.5)

The Extended Least Squares Assumptions

Efficiency of OLS, part I: The Gauss-Markov Theorem

The Gauss-Markov Theorem, ctd.

Efficiency of OLS, part II:

Some not-so-good things about OLS

Limitations of OLS, ctd.

Inference if u is Homoskedastic and Normal: the Student t Distribution (Section 5.6)

Practical implication:

Summary and Assessment (Section 5.7)