Published by Anna Parks. Modified over 9 years ago.
1 Multivariate Linear Regression Models
Shyh-Kang Jeng, Department of Electrical Engineering / Graduate Institute of Communication / Graduate Institute of Networking and Multimedia
2 Regression Analysis
A statistical methodology for predicting the value of one or more response (dependent) variables from a collection of predictor (independent) variable values
3 Example 7.1 Fitting a Straight Line
Observed data (linear regression model):
z1: 0 1 2 3 4
y:  1 4 3 8 9
4 Example 7.1 Fitting a Straight Line
[Scatter plot of y versus z (z from 0 to 5, y from 0 to 10) with the fitted straight line]
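The fit in Example 7.1 can be sketched in plain Python with the closed-form simple-regression formulas. This is an illustrative sketch, not the deck's matrix derivation; the R² value anticipates the coefficient of determination discussed later in the deck.

```python
# Least-squares fit of y = b0 + b1*z to the Example 7.1 data,
# using the closed-form simple-regression formulas.
z = [0, 1, 2, 3, 4]
y = [1, 4, 3, 8, 9]

n = len(z)
z_bar = sum(z) / n
y_bar = sum(y) / n

s_zy = sum((zi - z_bar) * (yi - y_bar) for zi, yi in zip(z, y))
s_zz = sum((zi - z_bar) ** 2 for zi in z)

b1 = s_zy / s_zz          # slope
b0 = y_bar - b1 * z_bar   # intercept

y_hat = [b0 + b1 * zi for zi in z]
sse = sum((yi - yh) ** 2 for yi, yh in zip(y, y_hat))
sst = sum((yi - y_bar) ** 2 for yi in y)
r2 = 1 - sse / sst        # coefficient of determination

print(b0, b1, r2)  # fitted line: y_hat = 1 + 2*z
```

For these data the fitted line works out to y_hat = 1 + 2z, with R² = 40/46 ≈ 0.87.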
5 Classical Linear Regression Model
6
7 Example 7.1
8 Examples 6.6 & 6.7
9 Example 7.2 One-Way ANOVA
10 Method of Least Squares
11 Result 7.1
12 Proof of Result 7.1
13 Proof of Result 7.1
14 Example 7.1 Fitting a Straight Line
Observed data (linear regression model):
z1: 0 1 2 3 4
y:  1 4 3 8 9
15 Example 7.3
16 Coefficient of Determination
17 Geometry of Least Squares
18 Geometry of Least Squares
19 Projection Matrix
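The projection (hat) matrix H = Z(Z'Z)⁻¹Z' behind the geometry slides can be sketched with numpy on the Example 7.1 design matrix. This assumes numpy is available; the design matrix includes an intercept column.

```python
import numpy as np

# Projection ("hat") matrix H = Z (Z'Z)^{-1} Z' for the Example 7.1
# design matrix with an intercept column.
z1 = np.array([0., 1., 2., 3., 4.])
Z = np.column_stack([np.ones_like(z1), z1])   # n x (r+1) design matrix
y = np.array([1., 4., 3., 8., 9.])

H = Z @ np.linalg.inv(Z.T @ Z) @ Z.T

# H is symmetric and idempotent, and H y is the fitted vector y_hat,
# i.e. the projection of y onto the column space of Z.
y_hat = H @ y
print(np.allclose(H, H.T), np.allclose(H @ H, H))
print(y_hat)
print(np.trace(H))   # trace(H) = r + 1 = number of parameters
```

The diagonal entries of H are the leverages discussed later in the deck.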
20 Result 7.2
21 Proof of Result 7.2
22 Proof of Result 7.2
23 Result 7.3: Gauss' Least Squares Theorem
24 Proof of Result 7.3
25 Result 7.4
26 Proof of Result 7.4
27 Proof of Result 7.4
28 Proof of Result 4.11
29 Proof of Result 7.4
30 Proof of Result 7.4
31 Proof of Result 7.4
32 χ2 Distribution
33 Result 7.5
34 Proof of Result 7.5
35 Example 7.4 (Real Estate Data)
20 homes in a Milwaukee, Wisconsin, neighborhood. Regression model
36 Example 7.4
37 Result 7.6
38 Effect of Rank
In situations where Z is not of full rank, rank(Z) replaces r + 1 and rank(Z1) replaces q + 1 in Result 7.6
39 Proof of Result 7.6
40 Proof of Result 7.6
41 Wishart Distribution
42 Generalization of Result 7.6
43 Example 7.5 (Service Ratings Data)
44 Example 7.5: Design Matrix
45 Example 7.5
46 Result 7.7
47 Proof of Result 7.7
48 Result 7.8
49 Proof of Result 7.8
50 Example 7.6 (Computer Data)
51 Example 7.6
52 Adequacy of the Model
53 Residual Plots
54 Q-Q Plots and Histograms
Used to detect unusual observations or severe departures from normality that may require special attention in the analysis.
If n is large, minor departures from normality will not greatly affect inferences about the regression coefficients.
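The Q-Q computation itself can be sketched with only the standard library: pair the sorted residuals with standard-normal quantiles at plotting positions (i − 0.5)/n. The residual values below are illustrative, and drawing the actual plot would need a plotting library.

```python
from statistics import NormalDist

# Normal Q-Q pairs for a set of residuals: sorted residuals against
# standard-normal quantiles at plotting positions (i - 0.5)/n.
residuals = [0.0, 1.0, -2.0, 1.0, 0.0]   # illustrative residuals

n = len(residuals)
sample_q = sorted(residuals)
theory_q = [NormalDist().inv_cdf((i - 0.5) / n) for i in range(1, n + 1)]

for tq, sq in zip(theory_q, sample_q):
    print(f"{tq:+.3f}  {sq:+.1f}")
# Points lying near a straight line suggest no severe departure
# from normality.
```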
55 Test of Independence of Time
56 Example 7.7: Residual Plot
57 Leverage
“Outliers” in either the response or the explanatory variables may have a considerable effect on the analysis and determine the fit.
Leverage for simple linear regression with one explanatory variable z
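For simple linear regression the leverage of observation j has the closed form h_jj = 1/n + (z_j − z̄)² / Σᵢ(z_i − z̄)². A plain-Python sketch on the Example 7.1 predictor values:

```python
# Leverage h_jj for simple linear regression with one explanatory
# variable z: h_jj = 1/n + (z_j - z_bar)^2 / sum_i (z_i - z_bar)^2.
z = [0, 1, 2, 3, 4]
n = len(z)
z_bar = sum(z) / n
s_zz = sum((zi - z_bar) ** 2 for zi in z)

leverage = [1 / n + (zj - z_bar) ** 2 / s_zz for zj in z]
print(leverage)       # extreme z values get the largest leverage
print(sum(leverage))  # leverages sum to the number of parameters (2)
```

The endpoints z = 0 and z = 4 get leverage 0.6, three times that of the central point, which is why outliers in the explanatory variable can pull the fit so strongly.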
58 Mallows' Cp Statistic
Select variables from all possible combinations
59 Usage of Mallows' Cp Statistic
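A sketch of the Cp comparison, using Cp = SSE_p / s² − (n − 2p) with s² taken from the full model. With the single predictor of Example 7.1 there are only two candidate models, intercept-only and intercept plus z1, which keeps the enumeration trivial:

```python
# Mallows' Cp = SSE_p / s^2 - (n - 2p); subsets with Cp small and
# close to p are preferred. s^2 is the error variance estimate from
# the full model.
z = [0, 1, 2, 3, 4]
y = [1, 4, 3, 8, 9]
n = len(y)

y_bar = sum(y) / n
sse_intercept = sum((yi - y_bar) ** 2 for yi in y)        # p = 1

# Full model: least-squares line (closed form).
z_bar = sum(z) / n
b1 = (sum((zi - z_bar) * (yi - y_bar) for zi, yi in zip(z, y))
      / sum((zi - z_bar) ** 2 for zi in z))
b0 = y_bar - b1 * z_bar
sse_full = sum((yi - (b0 + b1 * zi)) ** 2 for zi, yi in zip(z, y))  # p = 2

s2 = sse_full / (n - 2)                                   # full-model s^2

cp_intercept = sse_intercept / s2 - (n - 2 * 1)
cp_full = sse_full / s2 - (n - 2 * 2)
print(cp_intercept, cp_full)   # the full model always has Cp = p
```

Here the intercept-only model scores Cp = 20 against p = 1, flagging substantial bias, while the full model sits at Cp = p = 2 (as the full model always does by construction).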
60 Stepwise Regression
1. The predictor variable that explains the largest significant proportion of the variation in Y is the first to enter.
2. The next to enter is the variable that makes the largest contribution to the regression sum of squares; use Result 7.6 to determine its significance (F-test).
61 Stepwise Regression
3. Once a new variable is included, the individual contributions to the regression sum of squares of the variables already in the equation are rechecked using F-tests; if an F-statistic is small, that variable is deleted.
4. Steps 2 and 3 are repeated until all possible additions are nonsignificant and all possible deletions are significant.
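The forward part of the procedure (steps 1–2) can be sketched as follows. The data, the candidate names, and the F-to-enter threshold of 4.0 are all illustrative assumptions, and numpy is assumed available:

```python
import numpy as np

# Forward stepwise selection with partial F-tests: at each stage, add
# the candidate giving the largest extra regression sum of squares,
# provided its partial F exceeds a threshold.

def sse(Z, y):
    """Residual sum of squares of the least-squares fit of y on Z."""
    beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
    resid = y - Z @ beta
    return float(resid @ resid)

z1 = np.array([0., 1., 2., 3., 4., 5., 6., 7.])
z2 = np.array([1., 0., 1., 0., 1., 0., 1., 0.])   # unrelated to y
y = 2.0 * z1 + np.array([0.1, -0.2, 0.0, 0.2, -0.1, 0.1, -0.2, 0.1])

candidates = {"z1": z1, "z2": z2}
n = len(y)
Z = np.ones((n, 1))                 # start with the intercept only
selected = []

while candidates:
    best = max(candidates,
               key=lambda k: sse(Z, y) - sse(np.column_stack([Z, candidates[k]]), y))
    Z_new = np.column_stack([Z, candidates[best]])
    extra_ss = sse(Z, y) - sse(Z_new, y)
    df_resid = n - Z_new.shape[1]
    f_stat = extra_ss / (sse(Z_new, y) / df_resid)   # partial F-to-enter
    if f_stat < 4.0:                # illustrative threshold
        break
    Z = Z_new
    selected.append(best)
    del candidates[best]

print(selected)   # z1 enters; z2's partial F is too small
```

A full implementation would also recheck already-entered variables for deletion (step 3); this sketch stops at forward entry.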
62 Treatment of Collinearity
If Z is not of full rank, Z'Z does not have an inverse (the columns of Z are collinear).
Exact collinearity is unlikely in practice, but a linear combination of the columns of Z may be nearly 0.
This can be overcome somewhat by:
– Deleting one of a pair of predictor variables that are strongly correlated
– Relating the response Y to the principal components of the predictor variables
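The second remedy, principal-component regression, can be sketched with numpy: center the predictors, take their SVD, and regress Y on the leading component scores. The data here are illustrative, with z2 constructed to be nearly collinear with z1:

```python
import numpy as np

# Principal-component regression as a collinearity workaround:
# replace nearly collinear predictors by their leading principal
# components before regressing.
z1 = np.array([0., 1., 2., 3., 4., 5.])
z2 = 2.0 * z1 + np.array([0.0, 0.01, -0.01, 0.02, 0.0, -0.02])  # ~ 2*z1
y = np.array([1., 3., 5., 7., 9., 11.])

Z = np.column_stack([z1, z2])
Zc = Z - Z.mean(axis=0)                 # center the predictors

# Principal components via SVD of the centered predictor matrix.
U, s, Vt = np.linalg.svd(Zc, full_matrices=False)
scores = Zc @ Vt.T                      # component scores

print(s)  # one dominant singular value => columns nearly collinear

# Regress y on the first principal component only (plus an intercept).
X = np.column_stack([np.ones(len(y)), scores[:, 0]])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
y_hat = X @ beta
print(y_hat)   # close to y: one component carries nearly all the signal
```

The near-zero second singular value is exactly the "linear combination of columns of Z that is nearly 0" mentioned above; dropping that component stabilizes the fit.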
63 Bias Caused by a Misspecified Model
64 Example 7.3
Observed data (regression model):
z1:  0  1  2  3  4
y1:  1  4  3  8  9
y2: -1 -1  2  3  2
65 Multivariate Multiple Regression
66 Multivariate Multiple Regression
67 Multivariate Multiple Regression
68 Multivariate Multiple Regression
69 Multivariate Multiple Regression
70 Multivariate Multiple Regression
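The multivariate least-squares estimator B̂ = (Z'Z)⁻¹Z'Y, with the m responses collected as columns of Y, can be sketched on the two-response data of Example 7.3 (numpy assumed available). Each column of B̂ is just the univariate least-squares fit for the corresponding response:

```python
import numpy as np

# Multivariate multiple regression: B_hat = (Z'Z)^{-1} Z'Y, computed
# column-by-column as ordinary least squares on a shared design matrix.
z1 = np.array([0., 1., 2., 3., 4.])
Y = np.column_stack([
    np.array([1., 4., 3., 8., 9.]),    # y1
    np.array([-1., -1., 2., 3., 2.]),  # y2
])

Z = np.column_stack([np.ones_like(z1), z1])   # shared design matrix
B_hat = np.linalg.inv(Z.T @ Z) @ Z.T @ Y
print(B_hat)
# Each column of B_hat fits one response:
#   y1_hat = 1 + 2*z1,   y2_hat = -1 + 1*z1
```

The gain over m separate regressions comes not in B̂ itself but in the joint error covariance, which the following results use for inference.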
71 Example 7.8
72 Example 7.8
73 Result 7.9
74 Proof of Result 7.9
75 Proof of Result 7.9
76 Proof of Result 7.9
77 Forecast Error
78 Forecast Error
79 Result 7.10
80 Result 7.11
81 Example 7.9
82 Other Multivariate Test Statistics
83 Predictions from Regressions
84 Predictions from Regressions
85 Predictions from Regressions
86 Example 7.10
87 Example 7.10
88 Example 7.10
89 Linear Regression
90 Result 7.12
91 Proof of Result 7.12
92 Proof of Result 7.12
93 Population Multiple Correlation Coefficient
94 Example 7.11
95 Linear Predictors and Normality
96 Result 7.13
97 Proof of Result 7.13
98 Invariance Property
99 Example 7.12
100 Example 7.12
101 Prediction of Several Variables
102 Result 7.14
103 Example 7.13
104 Example 7.13
105 Partial Correlation Coefficient
106 Example 7.14
107 Mean Corrected Form of the Regression Model
108 Mean Corrected Form of the Regression Model
109 Mean Corrected Form for Multivariate Multiple Regressions
110 Relating the Formulations
111 Example 7.15
Example 7.6: classical linear regression model.
Example 7.12: joint normal distribution, with the best predictor taken as the conditional mean.
Both approaches yield the same predictor of Y1.
112 Remarks on Both Formulations
The two formulations are conceptually different.
Classical model:
– Input variables are set by the experimenter
– Optimal among linear predictors
Conditional mean model:
– Predictor values are random variables observed together with the response values
– Optimal among all choices of predictors
113 Example 7.16: Natural Gas Data
114 Example 7.16: First Model
115 Example 7.16: Second Model