Ch3 The Two-Variable Regression Model

1 Ch3 The Two-Variable Regression Model
Curve fitting: the line of best fit
The least-squares criterion
The two-variable model and its assumptions
BLUE and the Gauss-Markov theorem
Goodness of fit
Tests of regression coefficients
Testing the regression equation

2 1. Curve Fitting Given linearity, our objective is to specify a rule by which the “best” straight line relating X to Y can be determined Several procedures of minimizing the deviations between actual values and fitted values

3 2. The Least-squares Criterion
The method of least squares is computationally simple and penalizes larger errors relatively more than it penalizes small errors. The "line of best fit" is the line that minimizes the sum of the squared deviations of the observed points from the points on the straight line.
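A minimal numerical sketch of this criterion (the data points here are hypothetical, chosen only for illustration): any line other than the least-squares line has a larger sum of squared deviations.

```python
import numpy as np

# Hypothetical data points (not from the chapter).
X = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
Y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])

def sse(a, b):
    """Sum of squared deviations of the points from the line Y = a + b*X."""
    return np.sum((Y - (a + b * X)) ** 2)

# Closed-form least-squares estimates of intercept and slope.
b_hat = np.sum((X - X.mean()) * (Y - Y.mean())) / np.sum((X - X.mean()) ** 2)
a_hat = Y.mean() - b_hat * X.mean()
```

Nudging either coefficient away from (a_hat, b_hat), for example sse(a_hat + 0.1, b_hat), can only increase the sum of squared errors.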

4 The Least-squares Criterion
Choose the estimates α̂ and β̂ to minimize the sum of squared residuals: Σ êi² = Σ (Yi − α̂ − β̂Xi)².

5 3. Two-Variable model and Its Assumptions
The implications of the error term:
1. Two sources of error
2. The difference between the error and the residual
Six important assumptions

6 Six Assumptions
1. The relationship between Y and X is linear: Yi = α + βXi + εi.
2. The X's are nonstochastic variables whose values are fixed.
3. The error term has zero expected value: E(εi) = 0 for all i.
4. The error term has constant variance for all observations, i.e., Var(εi) = σ² for all i.
5. The random variables εi are statistically independent. Thus, Cov(εi, εj) = 0 for all i ≠ j.
6. The error term is normally distributed.
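The assumptions can be made concrete by simulating a data set that satisfies all six of them (the parameter values below are arbitrary illustrations, not from the chapter):

```python
import numpy as np

rng = np.random.default_rng(0)

alpha, beta, sigma = 1.0, 2.0, 0.5         # hypothetical true parameters
X = np.linspace(0.0, 10.0, 200)            # fixed, nonstochastic X's (assumption 2)
eps = rng.normal(0.0, sigma, size=X.size)  # errors: normal (6), mean zero (3),
                                           # constant variance (4), independent (5)
Y = alpha + beta * X + eps                 # linear relationship (assumption 1)
```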

7 4. BLUE and Gauss-Markov Theorem
BLUE: Best Linear Unbiased Estimator. Gauss-Markov theorem: given assumptions 1 through 5, the least-squares estimators α̂ and β̂ are the best (most efficient) linear unbiased estimators of α and β, in the sense that they have minimum variance among all linear unbiased estimators; that is, they are BLUE.
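A small Monte Carlo sketch of the "unbiased" part of the theorem, under hypothetical parameter values: across repeated samples drawn with the same fixed X's, the least-squares slope estimates average out to the true β.

```python
import numpy as np

rng = np.random.default_rng(1)
alpha, beta, sigma = 1.0, 2.0, 1.0   # hypothetical true values
X = np.linspace(0.0, 5.0, 50)        # X's held fixed across replications

def ols_slope(X, Y):
    """Least-squares slope estimate for the two-variable model."""
    return np.sum((X - X.mean()) * (Y - Y.mean())) / np.sum((X - X.mean()) ** 2)

slopes = [ols_slope(X, alpha + beta * X + rng.normal(0.0, sigma, X.size))
          for _ in range(2000)]
mean_slope = np.mean(slopes)  # close to the true beta = 2.0
```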

8 5. Relaxing The Assumptions
Heteroscedasticity: the error variance is nonconstant. Serial correlation: the error terms are correlated across observations.
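Both violations are easy to simulate (the error processes below are illustrative choices, not from the chapter): an error whose spread grows with X, and an AR(1) error whose successive values are positively correlated.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 500
X = np.linspace(1.0, 10.0, n)

# Heteroscedasticity: the error standard deviation grows with X.
het = rng.normal(0.0, 0.3 * X)

# Positive serial correlation: AR(1) errors e_t = rho * e_{t-1} + u_t.
rho = 0.8
u = rng.normal(0.0, 1.0, n)
e = np.zeros(n)
for t in range(1, n):
    e[t] = rho * e[t - 1] + u[t]
```

The sample spread of het rises across the range of X, and adjacent values of e are positively correlated, matching the two patterns sketched in the figures.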

9 Heteroscedasticity
[Figure: two scatter plots of Y against X, showing the spread of the points around the line changing with X]

10 Serial Correlation
[Figure: two scatter plots of Y against X, labeled "Negative Serial Correlation" and "Positive Serial Correlation"]

11 6. Goodness of Fit
TSS = RSS + ESS: the total variation of Y (total sum of squares) equals the unexplained variation of Y (residual sum of squares) plus the explained variation of Y (sum of squares due to regression). In symbols, Σ(Yi − Ȳ)² = Σ(Yi − Ŷi)² + Σ(Ŷi − Ȳ)², and goodness of fit is measured by R² = ESS/TSS.
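The decomposition can be verified directly on a small hypothetical data set:

```python
import numpy as np

# Hypothetical data (illustrative only).
X = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
Y = np.array([1.2, 2.1, 2.8, 4.3, 4.9, 6.2])

b = np.sum((X - X.mean()) * (Y - Y.mean())) / np.sum((X - X.mean()) ** 2)
a = Y.mean() - b * X.mean()
Y_hat = a + b * X

TSS = np.sum((Y - Y.mean()) ** 2)      # total variation of Y
RSS = np.sum((Y - Y_hat) ** 2)         # unexplained (residual) variation
ESS = np.sum((Y_hat - Y.mean()) ** 2)  # explained variation
R2 = ESS / TSS                         # goodness of fit
```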

12 7. Tests of Regression Coefficients
To test the null hypothesis that β = 0, we use the t statistic t = β̂ / s(β̂), where s(β̂) is the estimated standard error of β̂; under the null hypothesis this statistic follows a t distribution with N − 2 degrees of freedom.
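A sketch of the t test on the slope, using the same kind of hypothetical data (the 5% two-tailed critical value of the t distribution with 4 degrees of freedom is about 2.776):

```python
import numpy as np

# Hypothetical data (illustrative only).
X = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
Y = np.array([1.2, 2.1, 2.8, 4.3, 4.9, 6.2])
n = X.size

b = np.sum((X - X.mean()) * (Y - Y.mean())) / np.sum((X - X.mean()) ** 2)
a = Y.mean() - b * X.mean()
resid = Y - (a + b * X)

s2 = np.sum(resid ** 2) / (n - 2)                 # error-variance estimate
se_b = np.sqrt(s2 / np.sum((X - X.mean()) ** 2))  # standard error of b
t_stat = b / se_b                                 # t statistic for H0: beta = 0
```

Here t_stat far exceeds 2.776, so the null hypothesis β = 0 is rejected at the 5% level.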

13 8. Testing the Regression Equation
The significance of the regression equation as a whole is judged with an F test.
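The formulas on this slide did not survive the transcript. The standard test of the equation as a whole is the F test, and in the two-variable model it reduces to the square of the t statistic on the slope; a sketch on hypothetical data:

```python
import numpy as np

# Hypothetical data (illustrative only).
X = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
Y = np.array([1.2, 2.1, 2.8, 4.3, 4.9, 6.2])
n = X.size

b = np.sum((X - X.mean()) * (Y - Y.mean())) / np.sum((X - X.mean()) ** 2)
a = Y.mean() - b * X.mean()
Y_hat = a + b * X

RSS = np.sum((Y - Y_hat) ** 2)
ESS = np.sum((Y_hat - Y.mean()) ** 2)
F = ESS / (RSS / (n - 2))  # F statistic with (1, n - 2) degrees of freedom

s2 = RSS / (n - 2)
t_stat = b / np.sqrt(s2 / np.sum((X - X.mean()) ** 2))
# In the two-variable model, F equals t squared.
```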

14 Example: Interest Rate
We use least squares to estimate a model that explains the movement of monthly interest rates from January 1960 through August 1995. Interest rates are believed to be determined by the aggregate demand for and supply of liquid assets.

15 Example: Interest Rate
The variables that underlie the regression model are as follows:
R: 3-month U.S. Treasury bill rate, in percent per year
IP: Federal Reserve Board index of industrial production (1987 = 100)
M2: nominal money supply, in billions of dollars
PW: producer price index for all commodities (1982 = 100)
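The slides do not include the data themselves, so the sketch below fits the stated specification to synthetic stand-in series; every series and coefficient here is fabricated for illustration only.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 428  # months, January 1960 through August 1995

# Synthetic stand-ins for the actual series (illustrative trends plus noise).
t = np.arange(n)
IP = 50.0 + 0.15 * t + rng.normal(0.0, 2.0, n)
M2 = 300.0 + 8.0 * t + rng.normal(0.0, 50.0, n)
PW = 30.0 + 0.2 * t + rng.normal(0.0, 1.0, n)
R = 4.0 + 0.02 * IP + 0.0005 * M2 + 0.01 * PW + rng.normal(0.0, 0.5, n)

# Least-squares fit of R on IP, M2, and PW with an intercept.
Z = np.column_stack([np.ones(n), IP, M2, PW])
coef, *_ = np.linalg.lstsq(Z, R, rcond=None)
```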

