Simple and multiple regression analysis in matrix form — presentation transcript

1 Simple and multiple regression analysis in matrix form: least squares; beta estimation; simple linear regression; multiple regression with two predictors; multiple regression with three predictors; sum of squares; R²; test on β parameters; covariance matrix of the β; standard error of the β.

2 Simple and multiple regression analysis in matrix form: tests on individual predictors; variance of individual predictors; correlation between predictors; standardized matrices; correlation matrices; sum of squares in Z; R² in Z; R² between independent variables; standard error of β in Z.

3 Least squares. Starting from the general model y = Xβ + ε, the method of least squares estimates the β parameters by minimizing the sum of squares due to error. In fact, if ε = y − Xβ, then SS_err = ε'ε = (y − Xβ)'(y − Xβ).

4 Least squares. Minimizing SS_err with respect to β, you can estimate: β̂ = (X'X)⁻¹X'y.
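
A minimal numpy sketch of this estimate; the data here are synthetic and illustrative (nothing in it comes from the slides' example):

```python
import numpy as np

# Illustrative synthetic data: n observations, 2 predictors plus an intercept column.
rng = np.random.default_rng(0)
n = 20
X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])
y = X @ np.array([2.0, 1.5, -0.8]) + rng.normal(scale=0.5, size=n)

# Least squares: solve (X'X) b = X'y rather than forming the inverse explicitly.
beta = np.linalg.solve(X.T @ X, X.T @ y)
print(beta)  # close to the true parameters [2.0, 1.5, -0.8]
```

Solving the normal equations directly mirrors the formula on the slide; in practice np.linalg.lstsq is numerically safer when X'X is ill-conditioned.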

5 Simple linear regression: y = β₀ + β₁x + ε.


8 In the simple regression equation y = β₀ + β₁x + ε, β₀ is the intercept and β₁ is the slope.

9 Multiple regression. Similar to the simple case: a single dependent variable (Y); two or more independent variables (X); multiple correlation (rather than simple); estimation by least squares.

10 Multiple regression. Simple linear regression (variables: 1 dependent, 1 independent): y = β₀ + β₁x + ε. Multiple linear regression (variables: 1 dependent, 2 independent): y = β₀ + β₁x₁ + β₂x₂ + ε, where β₀ is the intercept, β₁ and β₂ are the slopes on the independent variables, and ε is the error.

11 Multiple regression in matrix form: y = Xβ + ε, where y is the n×1 vector of observations, X is the n×(k+1) matrix with a column of 1s followed by the predictor columns, β is the (k+1)×1 vector of parameters, and ε is the n×1 vector of errors.


14 Multiple regression in matrix form: the inverse (X'X)⁻¹.


17 Multiple regression with three predictors. In matrix notation this is expressed briefly as y = Xβ + ε.


19 Matrix form


23 General scheme


25 Sum of squares. The least squares method allows one to verify the following equality: y'y = ŷ'ŷ + e'e.

26 Sum of squares. Since in general y = ŷ + e with ŷ'e = 0, it is possible to derive that the sum of squared deviations of y from its mean can be decomposed into the sum of squares due to regression and the sum of squares due to error: Σ(y − ȳ)² = Σ(ŷ − ȳ)² + Σ(y − ŷ)², i.e. SS_tot = SS_reg + SS_res.
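
A quick numerical check of this decomposition, on the same kind of illustrative synthetic data as above:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20
X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])
y = X @ np.array([2.0, 1.5, -0.8]) + rng.normal(scale=0.5, size=n)
beta = np.linalg.solve(X.T @ X, X.T @ y)
y_hat = X @ beta

ss_tot = np.sum((y - y.mean()) ** 2)      # total sum of squares
ss_reg = np.sum((y_hat - y.mean()) ** 2)  # sum of squares due to regression
ss_res = np.sum((y - y_hat) ** 2)         # sum of squares due to error
assert np.isclose(ss_tot, ss_reg + ss_res)
```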

27 Note the equivalence of:

28 Sum of squares.


30 Sum of squares. In summary: SS_tot = SS_reg + SS_res.

31 R². The coefficient of determination is the proportion of the total sum of squares explained by the regression: R² = SS_reg / SS_tot.

32 Adjusted R²_YY'. Because the coefficient of determination depends on both the number of observations (n) and the number of independent variables (k), it is convenient to correct it by the degrees of freedom: adjusted R² = 1 − (1 − R²)(n − 1)/(n − k − 1). In our example:
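
A short sketch computing both coefficients on synthetic data (n, k, and all values are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 20, 2
X = np.column_stack([np.ones(n), rng.normal(size=(n, k))])
y = X @ np.array([2.0, 1.5, -0.8]) + rng.normal(scale=0.5, size=n)
y_hat = X @ np.linalg.solve(X.T @ X, X.T @ y)

r2 = np.sum((y_hat - y.mean())**2) / np.sum((y - y.mean())**2)
r2_adj = 1 - (1 - r2) * (n - 1) / (n - k - 1)  # correct for n and k degrees of freedom
print(r2, r2_adj)
```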

33 Test on β parameters. Once a regression model has been constructed, it may be important to confirm the goodness of fit (R²) of the model and the statistical significance of the estimated parameters. Statistical significance can be checked by an F-test of the overall fit, followed by t-tests of the individual parameters.

34 Test on β parameters. You can test the hypothesis that the parameters β_i, taken together, differ from 0: H₀: β₁ = β₂ = … = β_k = 0.

35 Test on β parameters. F = (SS_reg / k) / (SS_res / (n − k − 1)), where k = number of columns of the matrix X excluding X₀ and n = number of observations in y.

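
A sketch of the overall F-test on the same illustrative data; scipy's F distribution supplies the p-value:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n, k = 20, 2
X = np.column_stack([np.ones(n), rng.normal(size=(n, k))])
y = X @ np.array([2.0, 1.5, -0.8]) + rng.normal(scale=0.5, size=n)
y_hat = X @ np.linalg.solve(X.T @ X, X.T @ y)

ss_reg = np.sum((y_hat - y.mean())**2)
ss_res = np.sum((y - y_hat)**2)
F = (ss_reg / k) / (ss_res / (n - k - 1))   # overall F statistic
p = stats.f.sf(F, k, n - k - 1)             # right-tail p-value
print(F, p)
```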

37 Covariance matrix of the β. An estimate of the covariance matrix of the beta values is given by V(β̂) = MS_res (X'X)⁻¹. We denote C = (X'X)⁻¹, with elements c_ij.

38 Covariance matrix of the β, where the diagonal elements are an estimate of the variance of the individual β_i.

39 Standard error of the β. The standard error of the parameters can be calculated with the following formula: se(β_i) = √(MS_res · c_ii), where c_ii is the diagonal element of the matrix (X'X)⁻¹ corresponding to the parameter β_i.
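
A sketch of the covariance matrix and the standard errors, again on illustrative data:

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 20, 2
X = np.column_stack([np.ones(n), rng.normal(size=(n, k))])
y = X @ np.array([2.0, 1.5, -0.8]) + rng.normal(scale=0.5, size=n)
beta = np.linalg.solve(X.T @ X, X.T @ y)

ms_res = np.sum((y - X @ beta)**2) / (n - k - 1)   # MS_res = SS_res / (n - k - 1)
cov_beta = ms_res * np.linalg.inv(X.T @ X)         # estimated covariance matrix of the betas
se_beta = np.sqrt(np.diag(cov_beta))               # se(b_i) = sqrt(MS_res * c_ii)
print(se_beta)
```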

40 Standard error of the β. Note: when the value of c_ii is high, the value of se(β_i) grows, indicating that the variable X_i has a high multiple correlation coefficient with the other X variables.

41 Standard error of the β. The standard error of the β_i can also be calculated in the following way: se(β_i) = √(MS_res / (S²_i (n − 1)(1 − R²_i))), where R²_i is the squared multiple correlation of X_i with the other predictors. An increase in R²_i decreases the denominator of the ratio and, consequently, increases the value of the standard error of the parameter β_i.

42 Tests on individual predictors. With the standard error associated with each β_i you can run a t-test to verify H₀: β_i = 0, with t = β_i / se(β_i) on n − k − 1 degrees of freedom.

43 With the standard error associated with each β_i it is also possible to estimate the confidence interval for each parameter: β_i ± t_(α/2, n−k−1) se(β_i).
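
A sketch of the t-tests and confidence intervals (the 0.975 quantile is an assumption corresponding to a 95% interval, α = 0.05):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n, k = 20, 2
X = np.column_stack([np.ones(n), rng.normal(size=(n, k))])
y = X @ np.array([2.0, 1.5, -0.8]) + rng.normal(scale=0.5, size=n)
beta = np.linalg.solve(X.T @ X, X.T @ y)
ms_res = np.sum((y - X @ beta)**2) / (n - k - 1)
se = np.sqrt(ms_res * np.diag(np.linalg.inv(X.T @ X)))

t = beta / se                                 # test statistic for H0: beta_i = 0
p = 2 * stats.t.sf(np.abs(t), n - k - 1)      # two-sided p-values
t_crit = stats.t.ppf(0.975, n - k - 1)        # critical value for the 95% interval
ci = np.column_stack([beta - t_crit * se, beta + t_crit * se])
print(t, p, ci, sep="\n")
```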

44 Tests on individual predictors. In order to conduct a statistical test on the regression coefficients it is necessary to: 1. calculate SS_reg for the model containing all the independent variables; 2. calculate SS_reg for the model excluding the variable whose significance is being tested (SS_-i); 3. perform an F-test with numerator equal to the difference SS_reg − SS_-i, weighted by the difference between the degrees of freedom of the two models, and with denominator SS_res / (n − k − 1).

45 Tests on individual predictors. To test, for example, only the weight of the first predictor against the total model, it is necessary to construct a new matrix X_-1 from the matrix X by removing the column belonging to the first predictor. From this the calculation of SS_-1 follows immediately.
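
A sketch of this full-versus-reduced comparison; the helper ss_reg and the choice to drop column 1 (the first predictor, since column 0 is the intercept) are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 20, 2
X = np.column_stack([np.ones(n), rng.normal(size=(n, k))])
y = X @ np.array([2.0, 1.5, -0.8]) + rng.normal(scale=0.5, size=n)

def ss_reg(X, y):
    # Regression sum of squares for a given design matrix.
    y_hat = X @ np.linalg.solve(X.T @ X, X.T @ y)
    return np.sum((y_hat - y.mean())**2)

ss_full = ss_reg(X, y)
ss_drop1 = ss_reg(np.delete(X, 1, axis=1), y)      # model without the first predictor
ss_res = np.sum((y - X @ np.linalg.solve(X.T @ X, X.T @ y))**2)
F = (ss_full - ss_drop1) / (ss_res / (n - k - 1))  # 1 df in the numerator, so F = t^2
print(F)
```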

46 Tests on individual predictors

47 The same procedure is followed to test any subset of predictors. Similarly we have:

48 Tests on individual predictors. It is interesting to note that this test on a single predictor is equivalent to the t-test of β₁ = 0. When the numerator has only one degree of freedom, the equivalence F = t² in fact holds:

49 Summary table. On this occasion, none of the estimated parameters reached statistical significance under the hypothesis β_i ≠ 0.

50 Variance of individual predictors X_i. Using the matrix X'X we can calculate the variance of each variable X_i: S²_i = [(X'X)_ii − ((X'X)_0i)² / n] / (n − 1), since the diagonal element (X'X)_ii is Σx²_i and the first-row element (X'X)_0i is Σx_i.
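
A small check of this identity on synthetic data:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20
X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])

XtX = X.T @ X
i = 1                                             # any predictor column
var_i = (XtX[i, i] - XtX[0, i]**2 / n) / (n - 1)  # (sum x^2 - (sum x)^2 / n) / (n - 1)
assert np.isclose(var_i, X[:, i].var(ddof=1))
```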

51 Variance of individual predictors X_i.

52 Covariance between predictors and the dependent variable. It is possible to calculate the covariance between the independent variables and the dependent variable according to: cov(X_i, y) = [(X'y)_i − (Σx_i)(Σy) / n] / (n − 1).

53 Covariance between predictors and the dependent variable. The correlation between the independent variables and the dependent variable is given by r_(X_i, y) = cov(X_i, y) / (S_i S_y). As we will see later, the use of standardized matrices simplifies this calculation considerably.
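
A sketch computing both quantities directly from X'y and X'X on illustrative data:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20
X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])
y = X @ np.array([2.0, 1.5, -0.8]) + rng.normal(scale=0.5, size=n)

Xty = X.T @ y
XtX = X.T @ X
cov_xy = (Xty[1:] - XtX[0, 1:] * y.sum() / n) / (n - 1)  # cov(X_i, y) from X'y and column sums
r_xy = cov_xy / (X[:, 1:].std(axis=0, ddof=1) * y.std(ddof=1))
print(r_xy)
```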

54 Test on multiple predictors. You can perform a statistical test on a group of predictors in order to verify their significance. To do this, use the formula given above. To test, for example, the weight of only the first and second predictors with respect to the total model, it is necessary to construct a new matrix X_-12 from the matrix X by removing the columns belonging to these predictors. From this the calculation of SS_-12 follows immediately.

55 Test on multiple predictors.

56 Correlation between predictors. Standard condition: independence between the variables X_i.

57 Correlation between predictors. Condition of dependence between the variables X_i; completely standardized solution.

58 Correlation between predictors. We denote by R_i the multiple correlation of the variable X_i with the remaining variables, denoted X_j. The element c_ii is the diagonal element of the matrix (X'X)⁻¹, while S²_i is the variance of the variable X_i.

59 Correlation between predictors. If you do not have the X'X matrix but do have MS_res and the standard error of the parameter β_i, the multiple correlation of one X with the others can be calculated as follows: R²_i = 1 − MS_res / (se(β_i)² S²_i (n − 1)).


61 Standardized matrices. The X matrix and the y vector can be converted into standardized scores by dividing the deviation of each element from its mean by the appropriate standard deviation: z = (x − x̄) / s_x.
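
A minimal standardization sketch (ddof=1 assumes the sample standard deviation, consistent with the n − 1 divisors used below):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20
X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])
y = X @ np.array([2.0, 1.5, -0.8]) + rng.normal(scale=0.5, size=n)

# Standardize: subtract the mean, divide by the standard deviation (no column of 1s in Z).
Z = (X[:, 1:] - X[:, 1:].mean(axis=0)) / X[:, 1:].std(axis=0, ddof=1)
y_z = (y - y.mean()) / y.std(ddof=1)
print(Z.mean(axis=0), Z.std(axis=0, ddof=1))  # means ~0, standard deviations 1
```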

62 Standardized matrices. In our example we have:

63 With standardized variables it is not necessary to include the column of 1s in the matrix Z, as the parameter β₀ is equal to 0.

64 Standardized matrices. The standardized coefficients β_z can be obtained from the non-standardized ones using the formula β_zi = β_i (S_xi / S_y). The equation of the regression line becomes ẑ_y = β_z1 z₁ + β_z2 z₂ + … + β_zk z_k.
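
A sketch of this conversion, starting from unstandardized coefficients on synthetic data:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20
X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])
y = X @ np.array([2.0, 1.5, -0.8]) + rng.normal(scale=0.5, size=n)
beta = np.linalg.solve(X.T @ X, X.T @ y)

# beta_z,i = beta_i * (S_xi / S_y); the intercept drops out of the standardized equation.
beta_z = beta[1:] * X[:, 1:].std(axis=0, ddof=1) / y.std(ddof=1)
print(beta_z)
```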

65 In our example we have:

66 Standardized matrices. Using standardized matrices allows the parameter β₀ to be set to 0. In fact, if the variables are standardized, the intercept value for Y is 0, since all the means are equal to 0. Moreover, the correlation between any two standardized variables is r_ij = Σ z_i z_j / (n − 1), with i, j between 1 and k.

67 Correlation matrices. If we multiply the matrix Z'Z by the scalar 1/(n − 1), we obtain the correlation matrix R between the independent variables: R = Z'Z / (n − 1).
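
A quick check of this identity against numpy's own correlation function:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20
X = rng.normal(size=(n, 3))                 # predictors only, no intercept column
Z = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)

R = Z.T @ Z / (n - 1)                       # correlation matrix of the predictors
assert np.allclose(R, np.corrcoef(X, rowvar=False))
```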

68 Correlation matrices. In our example we have:

69 Correlation of Y with individual predictors. Similarly, if the variable Y is also standardized, multiplying the product Z'y_z by the scalar 1/(n − 1) yields the vector r_yi of correlations of the variable Y with its predictors X_i.

70 Correlation of Y with individual predictors

71 Correlation of Y with individual predictors. The solution of the system of normal equations leads to the following equation: β_z = R⁻¹ r_y. The estimated values can then be obtained using the equation ẑ_y = Z β_z.
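
A sketch of the standardized normal equations on illustrative data:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20
X = rng.normal(size=(n, 2))
y = 2.0 + X @ np.array([1.5, -0.8]) + rng.normal(scale=0.5, size=n)
Z = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)
y_z = (y - y.mean()) / y.std(ddof=1)

r_y = Z.T @ y_z / (n - 1)          # correlations of y with each predictor
R = Z.T @ Z / (n - 1)              # correlation matrix of the predictors
beta_z = np.linalg.solve(R, r_y)   # normal equations in Z: beta_z = R^{-1} r_y
y_hat_z = Z @ beta_z               # estimated standardized values
print(beta_z)
```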

72 Sum of squares in Z. With standardized variables, starting from the general formulas, the following simplified formulas hold: SS_tot = n − 1, SS_reg = β_z' r_y (n − 1), SS_res = (1 − β_z' r_y)(n − 1).


74 Calculation of R²_y.123. Having decomposed the variance into the component due to the regression and the component due to the residuals, it is immediate to calculate: R²_y.123 = SS_reg / SS_tot = β_z' r_y.

75 Multiple correlation between the X_i. If in general the squared multiple correlation of a variable with the other independent variables X_i is R²_i = 1 − 1/(c_ii S²_i (n − 1)), in the presence of standardized variables it becomes R²_i = 1 − 1/a_ii, where the element a_ii belongs to the diagonal of the matrix R⁻¹.
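
A sketch on synthetic, deliberately correlated predictors; note that a_ii is exactly the variance inflation factor of X_i:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50
X = rng.normal(size=(n, 3))
X[:, 2] += 0.8 * X[:, 0]                 # make the predictors correlated
Z = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)
R = Z.T @ Z / (n - 1)

a = np.diag(np.linalg.inv(R))            # a_ii = 1 / (1 - R^2_i), the VIF of X_i
r2_each = 1 - 1 / a                      # squared multiple correlation of each X_i
print(r2_each)
```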

76 Multiple correlation between the X_i. For example, the squared multiple correlation between the first variable X₁ and the other two can be calculated as R²₁ = 1 − 1/a₁₁; to calculate the other two coefficients, proceed in the same way with a₂₂ and a₃₃.

77 Standard error of β_z. The standard error of the standardized parameters is obtained from the general formula: se(β_zi) = √((1 − R²_y) a_ii / (n − k − 1)).

78 Standard error of β_z. You now have all the elements to test the difference of each individual predictor from 0, obtaining the same results as with the non-standardized variables.

