Basic Econometrics
Chapter 3: Two-Variable Regression Model: The Problem of Estimation
Prof. Himayatullah, May 2004
3-1. The method of ordinary least squares (OLS)
Least-squares criterion: choose $\hat\beta_1$ and $\hat\beta_2$ so as to minimize
$\sum \hat u_i^2 = \sum (Y_i - \hat Y_i)^2 = \sum (Y_i - \hat\beta_1 - \hat\beta_2 X_i)^2$   (3.1.2)
Differentiating with respect to $\hat\beta_1$ and $\hat\beta_2$ and setting the derivatives to zero gives the normal equations; solving them for $\hat\beta_1$ and $\hat\beta_2$ yields the least-squares estimators [see (3.1.6) and (3.1.7)].
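As an illustration only, the following Python sketch applies the closed-form solutions of the normal equations to a made-up data set (the numbers are invented, not the textbook's example):

```python
import numpy as np

# Made-up illustrative data (not the textbook's numerical example)
X = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0, 9.0, 10.0])
Y = np.array([2.3, 2.9, 4.1, 4.7, 6.2, 6.8, 7.9, 9.1, 9.6, 11.0])

# Deviations from the sample means: x_i = X_i - X_bar, y_i = Y_i - Y_bar
x = X - X.mean()
y = Y - Y.mean()

# Closed-form solutions of the normal equations (least-squares estimators)
beta2_hat = np.sum(x * y) / np.sum(x ** 2)    # slope
beta1_hat = Y.mean() - beta2_hat * X.mean()   # intercept

print(f"beta1_hat = {beta1_hat:.4f}, beta2_hat = {beta2_hat:.4f}")
```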
3-1. The method of ordinary least squares (OLS) (continued)
The numerical properties of the OLS estimators are as follows:
The OLS estimators are expressed solely in terms of observable quantities; they are point estimators.
The sample regression line passes through the sample means of X and Y.
The mean value of the fitted $\hat Y_i$ is equal to the mean value of the actual $Y_i$.
The mean value of the residuals $\hat u_i$ is zero: $\sum \hat u_i / n = 0$.
The residuals $\hat u_i$ are uncorrelated with the fitted $\hat Y_i$ and with $X_i$: $\sum \hat u_i \hat Y_i = 0$ and $\sum \hat u_i X_i = 0$.
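These properties can be verified numerically; a short check on made-up data (variable names are illustrative):

```python
import numpy as np

# Made-up sample data for illustration
X = np.array([2.0, 4.0, 5.0, 7.0, 8.0, 10.0, 11.0, 13.0])
Y = np.array([3.1, 5.4, 5.9, 8.2, 8.8, 11.1, 11.4, 13.9])

x, y = X - X.mean(), Y - Y.mean()
b2 = np.sum(x * y) / np.sum(x ** 2)
b1 = Y.mean() - b2 * X.mean()

Y_hat = b1 + b2 * X          # fitted values
u_hat = Y - Y_hat            # residuals

print(np.isclose(Y_hat.mean(), Y.mean()))    # mean of fitted Y equals mean of actual Y
print(np.isclose(u_hat.mean(), 0.0))         # residuals have zero mean
print(np.isclose(np.sum(u_hat * Y_hat), 0))  # residuals uncorrelated with fitted values
print(np.isclose(np.sum(u_hat * X), 0))      # residuals uncorrelated with X
print(np.isclose(b1 + b2 * X.mean(), Y.mean()))  # line passes through (X_bar, Y_bar)
```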
3-2. The assumptions underlying the method of least squares
Assumption 1: Linear regression model, that is, the model is linear in the parameters.
Assumption 2: X values are fixed in repeated sampling.
Assumption 3: Zero mean value of the disturbance $u_i$: $E(u_i \mid X_i) = 0$.
Assumption 4: Homoscedasticity, or equal variance of $u_i$: $\operatorname{var}(u_i \mid X_i) = \sigma^2$ [vs. heteroscedasticity].
Assumption 5: No autocorrelation between the disturbances: $\operatorname{cov}(u_i, u_j \mid X_i, X_j) = 0$ for $i \neq j$ [vs. positive or negative serial correlation].
3-2. The assumptions underlying the method of least squares (continued)
Assumption 6: Zero covariance between $u_i$ and $X_i$: $\operatorname{cov}(u_i, X_i) = E(u_i X_i) = 0$.
Assumption 7: The number of observations n must be greater than the number of parameters to be estimated.
Assumption 8: Variability in the X values: they must not all be the same.
Assumption 9: The regression model is correctly specified.
Assumption 10: There is no perfect multicollinearity among the X variables.
A simulation sketch of a data-generating process satisfying these assumptions is given below.
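As a rough sketch only, this simulation generates data from a process designed to respect the classical assumptions; the parameter values and names are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(42)

# Fixed X values (Assumption 2) with variability (Assumption 8), n > 2 (Assumption 7)
X = np.linspace(10.0, 100.0, 25)
beta1_true, beta2_true, sigma = 5.0, 0.8, 3.0   # invented "true" parameters

# Disturbances: zero mean (Ass. 3), constant variance (Ass. 4), no autocorrelation (Ass. 5),
# drawn independently of X (Ass. 6)
u = rng.normal(loc=0.0, scale=sigma, size=X.size)

# Linear-in-parameters model (Ass. 1), correctly specified (Ass. 9)
Y = beta1_true + beta2_true * X + u
print(Y[:5])
```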
3-3. Precision or standard errors of least-squares estimates
In statistics, the precision of an estimate is measured by its standard error (se). With $x_i = X_i - \bar X$ denoting deviations from the sample mean:
$\operatorname{var}(\hat\beta_2) = \sigma^2 / \sum x_i^2$   (3.3.1)
$\operatorname{se}(\hat\beta_2) = \sqrt{\operatorname{var}(\hat\beta_2)}$   (3.3.2)
$\operatorname{var}(\hat\beta_1) = \sigma^2 \sum X_i^2 / (n \sum x_i^2)$   (3.3.3)
$\operatorname{se}(\hat\beta_1) = \sqrt{\operatorname{var}(\hat\beta_1)}$   (3.3.4)
$\hat\sigma^2 = \sum \hat u_i^2 / (n - 2)$   (3.3.5)
$\hat\sigma = \sqrt{\hat\sigma^2}$ is the standard error of the estimate.
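A minimal sketch of formulas (3.3.1)-(3.3.5) on made-up data; since $\sigma^2$ is unknown in practice, its estimate from (3.3.5) is plugged into (3.3.1) and (3.3.3):

```python
import numpy as np

# Made-up data for illustration
X = np.array([3.0, 5.0, 6.0, 8.0, 9.0, 11.0, 12.0, 14.0, 15.0, 17.0])
Y = np.array([4.2, 6.1, 7.0, 8.8, 9.4, 11.7, 12.1, 14.5, 14.9, 17.2])
n = X.size

x, y = X - X.mean(), Y - Y.mean()
b2 = np.sum(x * y) / np.sum(x ** 2)
b1 = Y.mean() - b2 * X.mean()
u_hat = Y - (b1 + b2 * X)

sigma2_hat = np.sum(u_hat ** 2) / (n - 2)                    # (3.3.5)
var_b2 = sigma2_hat / np.sum(x ** 2)                         # (3.3.1), sigma^2 replaced by its estimate
var_b1 = sigma2_hat * np.sum(X ** 2) / (n * np.sum(x ** 2))  # (3.3.3)
se_b2, se_b1 = np.sqrt(var_b2), np.sqrt(var_b1)              # (3.3.2), (3.3.4)

print(f"se(b1) = {se_b1:.4f}, se(b2) = {se_b2:.4f}, sigma_hat = {np.sqrt(sigma2_hat):.4f}")
```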
3-3. Precision or standard errors of least-squares estimates (continued)
Features of the variances:
+ $\operatorname{var}(\hat\beta_2)$ is proportional to $\sigma^2$ and inversely proportional to $\sum x_i^2$.
+ $\operatorname{var}(\hat\beta_1)$ is proportional to $\sigma^2$ and to $\sum X_i^2$ but inversely proportional to $\sum x_i^2$ and to the sample size n.
+ $\operatorname{cov}(\hat\beta_1, \hat\beta_2) = -\bar X \operatorname{var}(\hat\beta_2)$, which shows that $\hat\beta_1$ and $\hat\beta_2$ are correlated (not independent) whenever $\bar X \neq 0$.
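To illustrate the covariance formula, a small Monte Carlo sketch with invented parameters draws repeated samples with X held fixed and compares the empirical covariance of the two estimators with $-\bar X \operatorname{var}(\hat\beta_2)$:

```python
import numpy as np

rng = np.random.default_rng(0)

# Fixed regressor and invented true parameters (illustration only)
X = np.linspace(1.0, 20.0, 20)
beta1, beta2, sigma = 2.0, 0.5, 1.0
x = X - X.mean()

b1_draws, b2_draws = [], []
for _ in range(20000):                      # repeated sampling with X held fixed
    u = rng.normal(0.0, sigma, size=X.size)
    Y = beta1 + beta2 * X + u
    b2_hat = np.sum(x * (Y - Y.mean())) / np.sum(x ** 2)
    b1_hat = Y.mean() - b2_hat * X.mean()
    b1_draws.append(b1_hat)
    b2_draws.append(b2_hat)

emp_cov = np.cov(b1_draws, b2_draws)[0, 1]
theory_cov = -X.mean() * sigma ** 2 / np.sum(x ** 2)   # -X_bar * var(beta2_hat)
print(f"empirical cov = {emp_cov:.5f}, theoretical cov = {theory_cov:.5f}")
```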
3-4. Properties of least-squares estimators: The Gauss-Markov theorem
An OLS estimator is said to be BLUE if:
+ It is linear, that is, a linear function of a random variable such as the dependent variable Y in the regression model.
+ It is unbiased, that is, its average or expected value, $E(\hat\beta_2)$, is equal to the true value $\beta_2$.
+ It has minimum variance in the class of all such linear unbiased estimators.
An unbiased estimator with the least variance is known as an efficient estimator.
3-4. Properties of least-squares estimators: The Gauss-Markov theorem (continued)
Gauss-Markov theorem: Given the assumptions of the classical linear regression model, the least-squares estimators, in the class of linear unbiased estimators, have minimum variance; that is, they are BLUE.
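The theorem can be illustrated (not proved) by simulation. Below, a hypothetical comparison with invented parameters pits the OLS slope against another linear unbiased slope estimator, the line through the first and last observations: both are unbiased, but OLS has the smaller sampling variance.

```python
import numpy as np

rng = np.random.default_rng(1)

# Fixed X and invented true parameters (illustration only)
X = np.arange(1.0, 21.0)
beta1, beta2, sigma = 3.0, 1.5, 2.0
x = X - X.mean()

ols_slopes, alt_slopes = [], []
for _ in range(20000):
    u = rng.normal(0.0, sigma, size=X.size)
    Y = beta1 + beta2 * X + u
    ols_slopes.append(np.sum(x * (Y - Y.mean())) / np.sum(x ** 2))  # OLS slope
    alt_slopes.append((Y[-1] - Y[0]) / (X[-1] - X[0]))              # another linear unbiased estimator

print(f"mean OLS slope  = {np.mean(ols_slopes):.4f}, var = {np.var(ols_slopes):.5f}")
print(f"mean alt. slope = {np.mean(alt_slopes):.4f}, var = {np.var(alt_slopes):.5f}")
# Both means are close to the true slope (unbiasedness), but the OLS variance is smaller,
# consistent with the Gauss-Markov theorem.
```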
3-5. The coefficient of determination r²: A measure of “goodness of fit”
$Y_i = \hat Y_i + \hat u_i$, or in deviation form $y_i = \hat y_i + \hat u_i$ (note that $\bar Y = \bar{\hat Y}$, so both sets of deviations are taken about the same mean).
Squaring both sides and summing over the sample (the cross-product term $\sum \hat y_i \hat u_i$ vanishes) gives
$\sum y_i^2 = \hat\beta_2^2 \sum x_i^2 + \sum \hat u_i^2$, or TSS = ESS + RSS.
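A quick numerical check of the decomposition on made-up data:

```python
import numpy as np

# Made-up data for illustration
X = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0])
Y = np.array([2.1, 2.9, 4.2, 4.8, 6.1, 6.9, 8.2, 8.8])

x, y = X - X.mean(), Y - Y.mean()
b2 = np.sum(x * y) / np.sum(x ** 2)
b1 = Y.mean() - b2 * X.mean()
u_hat = Y - (b1 + b2 * X)

TSS = np.sum(y ** 2)
ESS = b2 ** 2 * np.sum(x ** 2)
RSS = np.sum(u_hat ** 2)
print(f"TSS = {TSS:.4f}, ESS + RSS = {ESS + RSS:.4f}")   # the two agree
```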
3-5. The coefficient of determination r²: A measure of “goodness of fit” (continued)
TSS = $\sum y_i^2$ = total sum of squares
ESS = $\sum \hat y_i^2 = \hat\beta_2^2 \sum x_i^2$ = explained sum of squares
RSS = $\sum \hat u_i^2$ = residual sum of squares
$1 = \dfrac{ESS}{TSS} + \dfrac{RSS}{TSS}$, so $r^2 = \dfrac{ESS}{TSS} = 1 - \dfrac{RSS}{TSS}$.
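A minimal sketch computing r² and r on made-up data; in the two-variable model, r obtained this way coincides with the Pearson correlation between X and Y:

```python
import numpy as np

# Made-up data for illustration
X = np.array([2.0, 3.0, 5.0, 7.0, 9.0, 11.0, 13.0, 14.0])
Y = np.array([3.5, 4.1, 6.2, 7.8, 10.1, 11.0, 13.4, 14.2])

x, y = X - X.mean(), Y - Y.mean()
b2 = np.sum(x * y) / np.sum(x ** 2)
b1 = Y.mean() - b2 * X.mean()
u_hat = Y - (b1 + b2 * X)

TSS = np.sum(y ** 2)
RSS = np.sum(u_hat ** 2)
r2 = 1.0 - RSS / TSS           # r^2 = ESS/TSS = 1 - RSS/TSS
r = np.sign(b2) * np.sqrt(r2)  # sample correlation coefficient (sign follows the slope)
print(f"r^2 = {r2:.4f}, r = {r:.4f}")
print(np.isclose(r, np.corrcoef(X, Y)[0, 1]))  # matches the Pearson correlation of X and Y
```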
3-5. The coefficient of determination r²: A measure of “goodness of fit” (continued)
$r^2$ = ESS/TSS is the coefficient of determination; it measures the proportion or percentage of the total variation in Y explained by the regression model.
$0 \leq r^2 \leq 1$; $r = \sqrt{r^2}$ is the sample correlation coefficient.
Some properties of r:
Remaining sections of Chapter 3:
3-6. A numerical example (pages 80-83)
3-7. Illustrative examples (pages 83-85)
3-8. Coffee demand function
3-9. Monte Carlo experiments (page 85)
3-10. Summary and conclusions (pages 86-87)