Violations of Assumptions In Least Squares Regression

Standard Assumptions in Regression

- Errors are normally distributed with mean 0
- Errors have constant variance
- Errors are independent
- X is measured without error
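Written out, these assumptions amount to the usual simple linear regression model (a standard formulation, stated here for reference rather than copied from the slides):

```latex
Y_i = \beta_0 + \beta_1 X_i + \varepsilon_i, \qquad
\varepsilon_i \overset{\text{iid}}{\sim} N(0, \sigma^2), \qquad i = 1, \dots, n,
```

with the X_i treated as fixed values measured without error. Each example below keeps the linear mean but breaks one of these assumptions and asks what happens to the usual inference for β1.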

Example X's and OLS Estimators

The subscript "t" is used to imply time ordering of the observations.
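Below is a minimal Monte Carlo sketch of the kind of simulation the slides report: repeatedly generate a sample, fit OLS, and record how often the nominal 95% t-interval for the slope covers the true β1 = 10. The function names, β0 = 30, the time-ordered X design, the baseline σ, and n = 22 (borrowed from the correlated-errors slide) are assumptions for illustration; only β1 = 10 is taken from the slides.

```python
# Monte Carlo harness: how often does the nominal 95% CI for the slope cover beta1?
import numpy as np
from scipy import stats

def ols_slope_ci(x, y, level=0.95):
    """Least-squares fit of y = b0 + b1*x; returns b1, s^2(b1), and a t-based CI."""
    n = len(x)
    xbar, ybar = x.mean(), y.mean()
    sxx = np.sum((x - xbar) ** 2)
    b1 = np.sum((x - xbar) * (y - ybar)) / sxx
    b0 = ybar - b1 * xbar
    mse = np.sum((y - b0 - b1 * x) ** 2) / (n - 2)
    s2_b1 = mse / sxx                                    # estimated variance of b1
    half = stats.t.ppf(1 - (1 - level) / 2, df=n - 2) * np.sqrt(s2_b1)
    return b1, s2_b1, b1 - half, b1 + half

def coverage(gen_xy, beta1_true=10.0, n_sim=10_000, seed=0):
    """Fraction of simulated samples whose CI covers beta1_true, plus summaries.
    (The slides use 100,000 simulations; 10,000 keeps this sketch quick.)"""
    rng = np.random.default_rng(seed)
    b1s, s2s, hits = [], [], 0
    for _ in range(n_sim):
        x, y = gen_xy(rng)
        b1, s2_b1, lo, hi = ols_slope_ci(x, y)
        hits += (lo <= beta1_true <= hi)
        b1s.append(b1)
        s2s.append(s2_b1)
    return hits / n_sim, np.mean(b1s), np.std(b1s), np.mean(s2s)

def normal_xy(rng, n=22, beta0=30.0, beta1=10.0, sigma=3.0):
    """Baseline generator that satisfies all assumptions (values assumed)."""
    x = np.arange(1.0, n + 1)              # time-ordered X's, as on the slide
    return x, beta0 + beta1 * x + rng.normal(0.0, sigma, size=n)

print(coverage(normal_xy))                 # coverage should be close to 0.95
```

The later scenarios reuse ols_slope_ci() and coverage() from this sketch and change only the data generator.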

Non-Normal Errors (Centered Gamma)

Errors: ε_t ~ Gamma(2, 3.6742)

Y_t = β0 + β1 X_t + (ε_t − 7.35) = β0* + β1 X_t + ε_t*, with E(ε_t*) = 0 and V(ε_t*) = 27.

Based on 100,000 simulations, the 95% CI for β1 contained 10 in 95.05% of the samples. Average b1 = , SD(b1) = , average s²(b1) = .
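A rough sketch of the centered-gamma experiment, reusing coverage() from the sketch above; the gamma parameters and β1 = 10 come from the slide, while n, β0, and the X design are assumed as before:

```python
import numpy as np

def gamma_xy(rng, n=22, beta0=30.0, beta1=10.0):
    """Gamma(2, 3.6742) errors centered by their mean 2*3.6742 ≈ 7.35,
    so that E(eps*) = 0 and V(eps*) = 2 * 3.6742**2 = 27, as on the slide."""
    x = np.arange(1.0, n + 1)
    eps = rng.gamma(shape=2.0, scale=3.6742, size=n) - 2.0 * 3.6742
    return x, beta0 + beta1 * x + eps

# coverage() is defined in the earlier harness sketch
print(coverage(gamma_xy))   # stays near 0.95, echoing the slide's 95.05%
```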

Non-Constant Error Variance (Heteroscedasticity)

- Mean: E(Y|X) = β0 + β1X
- Standard deviation: σ(Y|X) = X, so the error spread grows with X
- Distribution: Normal

Non-Constant Error Variance (Heteroscedasticity)

Based on 100,000 simulations, the 95% CI for β1 contained 10 in only 92.62% of the samples. Average b1 = , SD(b1) = ; the average s²(b1) was smaller than the empirical variance of b1, so the usual OLS standard error understates the true sampling variability of the slope.
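A sketch of the heteroscedastic case, again reusing coverage() from the earlier sketch. It assumes the error standard deviation equals X; the exact scaling on the slide is not fully recoverable, but any spread that grows with X illustrates the same point:

```python
import numpy as np

def hetero_xy(rng, n=22, beta0=30.0, beta1=10.0):
    """Normal errors whose SD grows with X, violating constant variance."""
    x = np.arange(1.0, n + 1)
    eps = rng.normal(loc=0.0, scale=x)     # error SD proportional to X
    return x, beta0 + beta1 * x + eps

print(coverage(hetero_xy))   # coverage falls below the nominal 0.95
```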

Correlated Errors

Example: σ² = 9, ρ = 0.5, n = 22

Correlated Errors

Based on 100,000 simulations, the 95% CI for β1 contained 10 in only 84.56% of the samples. Average b1 = , SD(b1) = ; again the average s²(b1) was smaller than the empirical variance of b1: with positively correlated errors the reported standard errors are too small.
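The correlated-errors case can be sketched with a stationary AR(1) error series using the slide's σ² = 9, ρ = 0.5, and n = 22 (β0 and the X design are still assumed), again reusing coverage():

```python
import numpy as np

def ar1_xy(rng, n=22, beta0=30.0, beta1=10.0, rho=0.5, sigma2=9.0):
    """AR(1) errors: eps_t = rho*eps_{t-1} + a_t, scaled so Var(eps_t) = sigma2."""
    x = np.arange(1.0, n + 1)
    eps = np.empty(n)
    eps[0] = rng.normal(0.0, np.sqrt(sigma2))            # stationary start
    for t in range(1, n):
        eps[t] = rho * eps[t - 1] + rng.normal(0.0, np.sqrt(sigma2 * (1 - rho**2)))
    return x, beta0 + beta1 * x + eps

print(coverage(ar1_xy))   # positive autocorrelation drags coverage well below 0.95
```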

Measurement Error in X

- Z = true value of the independent variable (unobserved)
- X = observed value of the independent variable
- X = Z + U (Z can be fixed or random)
- Z and U are assumed independent when Z is random
- V(U) does not depend on Z when Z is fixed

Measurement Error in X

Based on 100,000 simulations, the 95% CI for β1 contained 10 in only 76.72% of the samples. Average b1 = , SD(b1) = ; here the average s²(b1) was much larger than the squared SD of the b1 estimates. The slope estimates are also biased (attenuated) toward 0, which is why the intervals miss the true value so often.
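Finally, a sketch of the measurement-error case, reusing coverage(): Y is generated from the true regressor Z, but the regression sees only X = Z + U. The standard deviations sigma_eps and sigma_u below are assumed values for illustration:

```python
import numpy as np

def meas_err_xy(rng, n=22, beta0=30.0, beta1=10.0, sigma_eps=3.0, sigma_u=2.0):
    """Y is generated from the true (fixed) Z, but OLS only sees X = Z + U."""
    z = np.arange(1.0, n + 1)                    # true regressor, unobserved
    u = rng.normal(0.0, sigma_u, size=n)         # measurement error, independent of Z
    y = beta0 + beta1 * z + rng.normal(0.0, sigma_eps, size=n)
    return z + u, y                              # regress y on the observed X

print(coverage(meas_err_xy))   # b1 is attenuated toward 0, so the CI often misses 10
```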