1
Econ 3790: Business and Economics Statistics
Instructor: Yogesh Uppal
2
Sampling Distribution of b1
Expected value of b1: E(b1) = β1. Variance of b1: Var(b1) = σ²/SSx, where SSx = Σ(xi − x̄)².
3
Estimate of σ2
The mean square error (MSE) provides the estimate of σ²: s² = MSE = SSE/(n − 2), where SSE = Σ(yi − ŷi)² is the sum of squared errors.
4
Sample variance of b1
The estimated variance of b1 is s²b1 = s²/SSx, and the standard error of b1 is sb1 = s/√SSx, where s = √MSE is called the standard error of the estimate.
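As a concrete illustration, a minimal Python sketch of these formulas on a small made-up dataset (the x and y values below are hypothetical, not from any example in these slides):

```python
# Minimal sketch of the slope, MSE and standard-error formulas above
import math

x = [1, 2, 3, 4, 5]          # hypothetical predictor values
y = [3, 5, 4, 8, 9]          # hypothetical response values
n = len(x)

x_bar = sum(x) / n
y_bar = sum(y) / n
SSx = sum((xi - x_bar) ** 2 for xi in x)
b1 = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y)) / SSx
b0 = y_bar - b1 * x_bar

SSE = sum((yi - (b0 + b1 * xi)) ** 2 for xi, yi in zip(x, y))
s2 = SSE / (n - 2)           # MSE, the estimate of sigma^2
s_b1 = math.sqrt(s2 / SSx)   # standard error of b1
print(b1, s2, s_b1)
```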
5
Interval Estimate of b1:
The (1 − α)100% confidence interval for β1 is b1 ± tα/2 · sb1, where tα/2 is the value from the t distribution with (n − 2) degrees of freedom such that the probability in the upper tail is α/2.
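A small helper implementing this interval, assuming the slope estimate b1, its standard error sb1, and the sample size n have already been computed:

```python
# (1 - alpha)100% confidence interval for the slope: b1 ± t_{alpha/2} * s_b1
from scipy import stats

def slope_confidence_interval(b1, s_b1, n, alpha=0.05):
    """Return the (1 - alpha)100% CI for the slope in simple linear regression."""
    t_crit = stats.t.ppf(1 - alpha / 2, n - 2)   # upper-tail critical value, n - 2 d.f.
    margin = t_crit * s_b1
    return b1 - margin, b1 + margin
```

For instance, with hypothetical inputs b1 = 4.5, sb1 = 0.83 and n = 5 it returns roughly (1.86, 7.14).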
6
Example: Reed Auto Sales
s² = MSE = SSE/(n − 2) = 8.2/3 = 2.73. 95% confidence interval for β1: b1 ± t.025 · sb1. We can say with 95% confidence that β1 lies between 1.87 and 7.13.
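Since the Reed Auto data themselves are not reproduced on these slides, the quoted interval can at least be checked backwards: its midpoint and half-width imply the slope estimate and its standard error.

```python
# Back-solving from the stated interval (1.87, 7.13) with t.025 (3 d.f.) = 3.182
lower, upper = 1.87, 7.13
t_crit = 3.182

b1 = (lower + upper) / 2          # implied slope estimate: 4.5
margin = (upper - lower) / 2      # half-width: 2.63
s_b1 = margin / t_crit            # implied standard error of b1, about 0.83
print(b1, margin, round(s_b1, 3))
```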
7
Testing for Significance: t Test
Hypotheses: H0: β1 = 0 versus Ha: β1 ≠ 0. Test statistic: t = b1/SE(b1), where b1 is the slope estimate and SE(b1) is the standard error of b1.
8
Testing for Significance: t Test
Rejection rule: Reject H0 if p-value < α, or if t < −tα/2 or t > tα/2, where tα/2 is based on a t distribution with n − 2 degrees of freedom.
9
Testing for Significance: t Test
1. Determine the hypotheses: H0: β1 = 0, Ha: β1 ≠ 0. 2. Specify the level of significance: α = .05. 3. Select the test statistic: t = b1/sb1. 4. State the rejection rule: Reject H0 if p-value < .05, or if t ≤ −3.182 or t ≥ 3.182.
10
Testing for Significance: t Test
5. Compute the value of the test statistic: t = 5.42. 6. Determine whether to reject H0: since t = 5.42 > tα/2 = 3.182, we can reject H0.
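The same decision can be reproduced with scipy's t distribution (d.f. = n − 2 = 3 in this example):

```python
# t test decision for the example above: t = 5.42, alpha = .05, d.f. = 3
from scipy import stats

t_stat, alpha, df = 5.42, 0.05, 3
t_crit = stats.t.ppf(1 - alpha / 2, df)           # about 3.182
p_value = 2 * (1 - stats.t.cdf(abs(t_stat), df))  # two-tailed p-value
print(round(t_crit, 3), round(p_value, 4), t_stat > t_crit)  # True -> reject H0
```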
11
Some Cautions about the Interpretation of Significance Tests
Rejecting H0: β1 = 0 and concluding that the relationship between x and y is significant does not enable us to conclude that a cause-and-effect relationship is present between x and y. Likewise, being able to reject H0: β1 = 0 and demonstrate statistical significance does not enable us to conclude that the relationship between x and y is linear.
12
Multiple Regression Model
The equation that describes how the dependent variable y is related to the independent variables x1, x2, …, xp and an error term is called the multiple regression model: y = β0 + β1x1 + β2x2 + … + βpxp + ε, where β0, β1, β2, …, βp are the parameters and ε is a random variable called the error term.
13
Estimated Multiple Regression Equation
A simple random sample is used to compute sample statistics b0, b1, b2, …, bp that are used as the point estimators of the parameters β0, β1, β2, …, βp. The estimated multiple regression equation is: ŷ = b0 + b1x1 + b2x2 + … + bpxp.
14
Interpreting the Coefficients
In multiple regression analysis, we interpret each regression coefficient as follows: bi represents an estimate of the change in y corresponding to a 1-unit increase in xi when all other independent variables are held constant.
15
Multiple Regression Model
Example: Car Sales. Suppose we believe that the number of cars sold (y) is related not only to the number of ads (x1), but also to the minimum down payment required (x2). The regression model can be given by: y = β0 + β1x1 + β2x2 + ε, where y = number of cars sold, x1 = number of ads, and x2 = minimum down payment required ($1000s).
16
Estimated Regression Equation
The estimated regression equation is ŷ = b0 + b1x1 + b2x2. Interpretation? Estimated values of y? Error? Prediction?
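A minimal sketch of how these questions are answered once estimates are in hand; the coefficient values and data below are made up for illustration, not the fitted car-sales values:

```python
# Hypothetical fitted equation: y_hat = b0 + b1*x1 + b2*x2 (coefficients are made up)
b0, b1, b2 = 10.0, 3.0, -2.0

def y_hat(x1, x2):
    """Estimated number of cars sold for x1 ads and an x2 ($1000s) down payment."""
    return b0 + b1 * x1 + b2 * x2

# Estimated value and error (residual) for one observed week, also hypothetical
x1, x2, y_observed = 4, 1.5, 18
estimate = y_hat(x1, x2)          # 10 + 12 - 3 = 19 cars
error = y_observed - estimate     # residual = 18 - 19 = -1
print(estimate, error)

# Prediction for a planned week with 5 ads and a $1,000 minimum down payment
print(y_hat(5, 1.0))              # 10 + 15 - 2 = 23 cars
```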
17
Multiple Coefficient of Determination
Relationship Among SST, SSR, SSE: SST = SSR + SSE, where SST = total sum of squares, SSR = sum of squares due to regression, and SSE = sum of squares due to error.
18
Multiple Coefficient of Determination
R² = SSR/SST = 84.63/89.2 ≈ 0.949. Adjusted multiple coefficient of determination: Ra² = 1 − (1 − R²)(n − 1)/(n − p − 1). Standard error of the estimate: s = √MSE.
19
Testing for Significance: t Test
Hypotheses: H0: βi = 0 versus Ha: βi ≠ 0. Test statistic: t = bi/sbi. Rejection rule: Reject H0 if p-value < α, or if t < −tα/2 or t > tα/2, where tα/2 is based on a t distribution with n − p − 1 degrees of freedom.
20
Example: Testing for significance of coefficients
Hypotheses: H0: βi = 0 versus Ha: βi ≠ 0. For α = .05 and d.f. = ?, t.025 = ? Rejection rule: reject H0 if p-value < .05, or if t < −t.025 or t > t.025. Test statistic: t = bi/sbi for each coefficient.
21
Testing for Significance of Regression: F Test
Hypotheses: H0: β1 = β2 = … = βp = 0; Ha: one or more of the parameters is not equal to zero. Test statistic: F = MSR/MSE. Rejection rule: Reject H0 if p-value < α or if F > Fα, where Fα is based on an F distribution with p d.f. in the numerator and n − p − 1 d.f. in the denominator.
22
Multiple Regression Model
Example 2: Programmer Salary Survey. A software firm collected data for a sample of 20 computer programmers. A suggestion was made that regression analysis could be used to determine whether salary was related to the years of experience and the score on the firm’s programmer aptitude test. The years of experience, score on the aptitude test, and corresponding annual salary ($1000s) for the sample of 20 programmers are shown on the next slide.
23
Multiple Regression Model
Exper.  Score  Salary     Exper.  Score  Salary
4       78     24.0       9       88     38.0
7       100    43.0       2       73     26.6
1       86     23.7       10      75     36.2
5       82     34.3       5       81     31.6
8       86     35.8       6       74     29.0
10      84     38.0       8       87     34.0
0       75     22.2       4       79     30.1
1       80     23.1       6       94     33.9
6       83     30.0       3       70     28.2
6       91     33.0       3       89     30.0
24
Multiple Regression Model
Suppose we believe that salary (y) is related to the years of experience (x1) and the score on the programmer aptitude test (x2) by the following regression model: y = β0 + β1x1 + β2x2 + ε, where y = annual salary ($1000s), x1 = years of experience, and x2 = score on the programmer aptitude test.
25
Solving for b0, b1 and b2:
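A minimal numpy sketch of the least-squares fit, assuming the data table above is transcribed correctly; the estimates should come out close to the coefficients quoted on the following slides:

```python
# Least-squares fit of salary on experience and aptitude score (data from the table above)
import numpy as np

exper = [4, 7, 1, 5, 8, 10, 0, 1, 6, 6, 9, 2, 10, 5, 6, 8, 4, 6, 3, 3]
score = [78, 100, 86, 82, 86, 84, 75, 80, 83, 91,
         88, 73, 75, 81, 74, 87, 79, 94, 70, 89]
salary = [24.0, 43.0, 23.7, 34.3, 35.8, 38.0, 22.2, 23.1, 30.0, 33.0,
          38.0, 26.6, 36.2, 31.6, 29.0, 34.0, 30.1, 33.9, 28.2, 30.0]

X = np.column_stack([np.ones(len(exper)), exper, score])  # intercept column plus x1, x2
b, *_ = np.linalg.lstsq(X, np.array(salary), rcond=None)
print(b)   # b0, b1, b2 -- roughly 3.17, 1.40, 0.251
```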
26
ANOVA Table

Source of Variation    Sum of Squares    Degrees of Freedom    Mean Square    F-statistic
Regression             500.34            ……                    ……             ……
Error                  ……                ……                    ……
Total                  599.8             ……
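The blank entries follow by arithmetic from the two sums of squares shown, together with n = 20 observations and p = 2 predictors:

```python
# Filling in the ANOVA table from SSR = 500.34, SST = 599.8, n = 20, p = 2
SSR, SST = 500.34, 599.8
n, p = 20, 2

SSE = SST - SSR                                # about 99.46
df_reg, df_err, df_tot = p, n - p - 1, n - 1   # 2, 17, 19
MSR = SSR / df_reg                             # about 250.17
MSE = SSE / df_err                             # about 5.85 (the slides round to 5.86)
F = MSR / MSE                                  # about 42.8, matching the F test slide
print(SSE, MSR, MSE, F)
```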
27
Estimated Regression Equation
SALARY = b0 + 1.404(EXPER) + 0.251(SCORE). b1 = 1.404 implies that salary is expected to increase by $1,404 for each additional year of experience (when the score on the programmer aptitude test is held constant). b2 = 0.251 implies that salary is expected to increase by $251 for each additional point scored on the programmer aptitude test (when years of experience is held constant).
28
Prediction
Suppose Bob had 4 years of experience and a score of 78 on the aptitude test. What would you estimate (or expect) his salary to be? ŷ = b0 + 1.404(4) + 0.251(78) ≈ 28.358, so Bob’s estimated salary is $28,358.
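The intercept b0 is not quoted on these slides; as a rough check, it can be backed out from this prediction (simple arithmetic on the rounded slopes, so the result is approximate):

```python
# Backing out the intercept implied by the quoted prediction (approximate)
b1, b2 = 1.404, 0.251          # slope estimates quoted on the previous slide
x1, x2 = 4, 78                 # Bob's experience and aptitude score
y_hat = 28.358                 # predicted salary in $1000s, as given

b0 = y_hat - b1 * x1 - b2 * x2 # about 3.16 (the least-squares fit gives about 3.17)
print(round(b0, 3))
```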
29
Error
Bob’s actual salary (from the data table) is $24,000. How much error did we make in estimating his salary based on his experience and score? The residual is y − ŷ = 24.000 − 28.358 = −4.358; that is, we overestimate Bob’s salary by about $4,358.
30
Multiple Coefficient of Determination
Relationship Among SST, SSR, SSE: SST = SSR + SSE, where SST = total sum of squares, SSR = sum of squares due to regression, and SSE = sum of squares due to error.
31
Multiple Coefficient of Determination
R² = SSR/SST = 500.34/599.8 ≈ 0.834. Adjusted multiple coefficient of determination: Ra² = 1 − (1 − R²)(n − 1)/(n − p − 1) ≈ 0.815.
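A short sketch of both computations, using the sums of squares from the ANOVA table and n = 20, p = 2:

```python
# R-squared and adjusted R-squared from the ANOVA sums of squares
SSR, SST = 500.34, 599.8
n, p = 20, 2

r2 = SSR / SST                                 # about 0.834
adj_r2 = 1 - (1 - r2) * (n - 1) / (n - p - 1)  # about 0.815
print(round(r2, 3), round(adj_r2, 3))
```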
32
Testing for Significance: t Test
Hypotheses: H0: βi = 0 versus Ha: βi ≠ 0. Test statistic: t = bi/sbi. Rejection rule: Reject H0 if p-value < α, or if t < −tα/2 or t > tα/2, where tα/2 is based on a t distribution with n − p − 1 degrees of freedom.
33
Example
Hypotheses: H0: βi = 0 versus Ha: βi ≠ 0. For α = .05 and d.f. = 17, t.025 = 2.11. Rejection rule: reject H0 if p-value < .05, or if t < −2.11 or t > 2.11. Test statistic: since t = 7.07 > t.025 = 2.11, we reject H0.
34
Testing for Significance of Regression: F Test
Hypotheses: H0: β1 = β2 = … = βp = 0; Ha: one or more of the parameters is not equal to zero. Test statistic: F = MSR/MSE. Rejection rule: Reject H0 if p-value < α or if F > Fα, where Fα is based on an F distribution with p d.f. in the numerator and n − p − 1 d.f. in the denominator.
35
Example
Hypotheses: H0: β1 = β2 = 0; Ha: one or both of the parameters is not equal to zero. For α = .05 and d.f. = (2, 17), F.05 = 3.59. Rejection rule: reject H0 if p-value < .05 or F > 3.59. Test statistic: F = MSR/MSE = 250.17/5.86 ≈ 42.8. Since F = 42.8 > F.05 = 3.59, we can reject H0.
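A hedged check of this F test with scipy, rebuilding the mean squares from the ANOVA sums of squares:

```python
# F test check using the values from the ANOVA table slide
from scipy import stats

SSR, SST = 500.34, 599.8
n, p = 20, 2
MSR = SSR / p                      # about 250.17
MSE = (SST - SSR) / (n - p - 1)    # about 5.85
F = MSR / MSE                      # about 42.8

F_crit = stats.f.ppf(0.95, p, n - p - 1)          # about 3.59 for d.f. = (2, 17)
print(round(F, 1), round(F_crit, 2), F > F_crit)  # True -> reject H0
```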