Equations in Simple Regression Analysis


1 Equations in Simple Regression Analysis

2 The Variance
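A standard form of the sample variance, with notation assumed here (\bar{y} for the sample mean, N for the number of observations; some texts divide by N rather than N - 1):

s_y^2 = \frac{\sum_{i=1}^{N} (y_i - \bar{y})^2}{N - 1}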

3 The standard deviation
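The standard deviation is the positive square root of the variance (standard definition, notation as above):

s_y = \sqrt{s_y^2} = \sqrt{\frac{\sum_{i=1}^{N} (y_i - \bar{y})^2}{N - 1}}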

4 The covariance
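A standard form of the sample covariance of x and y (notation assumed):

s_{xy} = \frac{\sum_{i=1}^{N} (x_i - \bar{x})(y_i - \bar{y})}{N - 1}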

5 The Pearson product moment correlation
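The usual definition, the covariance standardized by the two standard deviations (notation assumed):

r_{xy} = \frac{s_{xy}}{s_x s_y} = \frac{\sum (x_i - \bar{x})(y_i - \bar{y})}{\sqrt{\sum (x_i - \bar{x})^2 \, \sum (y_i - \bar{y})^2}}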

6 The normal equations (for the regression of y on x)
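For the line \hat{y} = a + bx, the normal equations in their standard textbook form (symbols assumed) are:

\sum y = Na + b\sum x
\sum xy = a\sum x + b\sum x^2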

7 The structural model (for an observation on individual i)
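A standard way to write the model for individual i, with \alpha and \beta the population intercept and slope and \varepsilon_i a random error term (symbols assumed):

y_i = \alpha + \beta x_i + \varepsilon_i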

8 The regression equation
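Solving the normal equations gives the familiar sample regression equation, sketched here in standard notation with SP_{xy} = \sum (x_i - \bar{x})(y_i - \bar{y}) and SS_x = \sum (x_i - \bar{x})^2:

\hat{y}_i = a + b_{yx} x_i, \qquad b_{yx} = \frac{SP_{xy}}{SS_x}, \qquad a = \bar{y} - b_{yx}\bar{x}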

9 Partitioning a deviation score, y
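The standard identity: each total deviation splits into a part due to regression and a residual part:

(y_i - \bar{y}) = (\hat{y}_i - \bar{y}) + (y_i - \hat{y}_i)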

10 Partitioning the sum of squared deviations (sum of squares, SS_y)
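Squaring and summing the deviations above gives the usual partition (the cross-product term sums to zero):

SS_y = \sum (y_i - \bar{y})^2 = \sum (\hat{y}_i - \bar{y})^2 + \sum (y_i - \hat{y}_i)^2 = SS_{reg} + SS_{res}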

11 Calculation of proportions of sums of squares due to regression and due to error (or residual)
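In the notation above, the proportions are:

\frac{SS_{reg}}{SS_y} = r_{xy}^2, \qquad \frac{SS_{res}}{SS_y} = 1 - r_{xy}^2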

12 Alternative formulas for computing the sums of squares due to regression
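Common equivalent expressions, using SP_{xy} and SS_x as defined earlier (the exact forms on the slide may differ):

SS_{reg} = b_{yx}\, SP_{xy} = \frac{(SP_{xy})^2}{SS_x} = b_{yx}^2\, SS_x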

13 Test of the regression coefficient, b_yx (i.e., test the null hypothesis that b_yx = 0). First compute the variance of estimate:
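A standard form of the variance of estimate for simple regression (k = 1 predictor, so N - k - 1 = N - 2 degrees of freedom):

s_{y.x}^2 = \frac{\sum (y_i - \hat{y}_i)^2}{N - 2}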

14 Test of the regression coefficient, b_yx (i.e., test the null hypothesis that b_yx = 0). Then obtain the standard error of estimate, and compute the standard error of the regression coefficient, s_b:
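Standard forms (notation assumed):

s_{y.x} = \sqrt{s_{y.x}^2}, \qquad s_b = \frac{s_{y.x}}{\sqrt{SS_x}} = \frac{s_{y.x}}{\sqrt{\sum (x_i - \bar{x})^2}}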

15 The test of significance of the regression coefficient (b_yx). The significance of the regression coefficient is tested using a t test with (N - k - 1) degrees of freedom:
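The usual test statistic:

t = \frac{b_{yx}}{s_b}, \qquad df = N - k - 1 = N - 2 \ \text{(simple regression)}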

16 Computing regression using correlations. The correlation in the population is given by the expression below; the population correlation coefficient, ρ_xy, is estimated by the sample correlation coefficient, r_xy.
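Standard definitions (σ denotes population quantities, s their sample estimates):

\rho_{xy} = \frac{\sigma_{xy}}{\sigma_x \sigma_y}, \qquad r_{xy} = \frac{s_{xy}}{s_x s_y}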

17 Sums of squares, regression (SS_reg). Recalling that r² gives the proportion of variance of Y accounted for (or explained) by X, we can obtain SS_reg = r² SS_y; in other words, SS_reg is that portion of SS_y predicted or explained by the regression of Y on X.

18 Standard error of estimate. From SS_res we can compute the variance of estimate and the standard error of estimate as shown below. (Note: alternative formulas were given earlier.)
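Expressed in terms of SS_res (and, using the earlier result, SS_{res} = (1 - r_{xy}^2) SS_y):

s_{y.x}^2 = \frac{SS_{res}}{N - 2} = \frac{(1 - r_{xy}^2)\, SS_y}{N - 2}, \qquad s_{y.x} = \sqrt{s_{y.x}^2}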

19 Testing the Significance of r. The significance of a correlation coefficient, r, is tested using a t test with N - 2 degrees of freedom:
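The standard statistic:

t = \frac{r\sqrt{N - 2}}{\sqrt{1 - r^2}}, \qquad df = N - 2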

20 Testing the difference between two correlations To test the difference between two Pearson correlation coefficients, use the “Comparing two correlation coefficients” calculator on my web site.
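One common approach for two independent samples (not necessarily the method the calculator uses; shown only as a sketch) is Fisher's z transformation, with the result referred to the standard normal distribution:

z = \tfrac{1}{2}\ln\frac{1 + r}{1 - r}, \qquad z_{obs} = \frac{z_1 - z_2}{\sqrt{\frac{1}{N_1 - 3} + \frac{1}{N_2 - 3}}}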

21 Testing the difference between two regression coefficients. This, too, is a t test, where the standard error term was given earlier. When the variances are unequal, use the pooled estimate given on page 258 of our textbook.
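A common form of the statistic, sketched here under the assumption of a pooled variance of estimate (the textbook's exact expression on page 258 may differ):

t = \frac{b_1 - b_2}{s_{b_1 - b_2}}, \qquad s_{b_1 - b_2} = \sqrt{s_{y.x(pooled)}^2 \left( \frac{1}{SS_{x_1}} + \frac{1}{SS_{x_2}} \right)}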

22 Other measures of correlation. Chapter 10 in the text gives several alternative measures of correlation: the point-biserial correlation, the phi correlation, the biserial correlation, the tetrachoric correlation, and the Spearman correlation.

23 Point-biserial and Phi correlation. These are both Pearson product-moment correlations. The point-biserial correlation is used when one variable is a scale variable and the other represents a true dichotomy. For instance, the correlation between performance on an item (the dichotomous variable) and the total score on a test (the scaled variable).

24 Point-biserial and Phi correlation The Phi correlation is used when both variables represent a true dichotomy. For instance, the correlation between two test items.

25 Biserial and Tetrachoric correlation These are non-Pearson correlations. Both are rarely used anymore. The biserial correlation is used when one variable is truly a scaled variable and the other represents an artificial dichotomy. The Tetrachoric correlation is used when both variables represent an artificial dichotomy.

26 Spearman’s Rho Coefficient and Kendall’s Tau Coefficient Spearman’s rho is used to compute the correlation between two ordinal (or ranked) variables. It is the correlation between two sets of ranks. Kendall’s tau (see pages 286-288 in the text) is also a measure of the relationship between two sets of ranked data.
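When there are no tied ranks, Spearman's rho has the familiar computational form (d_i is the difference between the two ranks of observation i, n the number of pairs):

r_s = 1 - \frac{6 \sum d_i^2}{n(n^2 - 1)}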

