1
Correlation, Reliability and Regression Chapter 7
2
Correlation Statistic that describes the relationship between scores (Pearson r). The number is the correlation coefficient. Ranges between +1.00 and -1.00. Positive is a direct relationship; negative is an inverse relationship; .00 is no relationship. Does not mean cause and effect. Computed from Z scores. Generally looking for values greater than .5.
3
Reliability Statistic used to determine repeatability. The number ranges between 0 and 1 and is always positive. Closer to 1 means greater reliability; closer to 0 means less reliability. Generally looking for values greater than .8.
4
Scattergram or Scatterplot Designate one variable X and one Y. Draw and label the axes; the lowest scores are at the bottom left. Plot each pair of scores. A positive relationship means high on both scores; a negative relationship means high on one and low on the other. IQ and GPA? r ≈ 0.68.
5
Example (high correlation with systematic bias)
Subject   Trial 1   Trial 2
S1        10        12
S2        9         11
S3        12        14
S4        11        13
S5        13        15
S6        8         10
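As a quick check (not part of the original slides), the bias in this example can be made concrete with a few lines of NumPy: Trial 2 is just Trial 1 shifted up by 2 points, so the Pearson r is a perfect 1.00 even though no pair of raw scores matches.

```python
import numpy as np

# Example data from the slide: Trial 2 equals Trial 1 plus a constant 2-point bias
trial1 = np.array([10, 9, 12, 11, 13, 8])
trial2 = np.array([12, 11, 14, 13, 15, 10])

# Pearson r from the correlation matrix; the systematic bias is invisible to r,
# so it comes out as a perfect 1.00 even though the raw scores never agree
r = np.corrcoef(trial1, trial2)[0, 1]
print(f"Pearson r = {r:.2f}")  # 1.00
```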
6
Positive Data
7
Positive Plot r = .93
8
Negative Data
9
Negative Plot r = -.92
10
Null Data
11
Null Plot (orthogonal) r = .34, r = -.00
12
Pearson (Interclass Correlation) Ignores the systematic bias. Has agreement (rank) but not correspondence (raw score). The order and the SD of the scores remain the same; the mean may differ between the two tests, yet r can still be high (i.e., close to 1.0).
13
Calculation of r
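A standard definitional form of Pearson r, consistent with the Z-score description on the earlier slide:

```latex
r = \frac{\sum (X - \bar{X})(Y - \bar{Y})}
         {\sqrt{\sum (X - \bar{X})^{2} \; \sum (Y - \bar{Y})^{2}}}
```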
14
ICC (Intraclass Correlation) Addresses correspondence and agreement. R = 1.0 is perfect reliability; 0.0 is no reliability.
15
ICC Advantages More than two variables (ratings, raters, etc.). Will detect the systematic bias. Works with interval, ratio, or ordinal data.
16
Example
Subject   Trial 1   Trial 2
S1        10        12
S2        9         11
S3        12        14
S4        11        13
S5        13        15
S6        8         10
17
ICC
ICC = (BMS - EMS) / [BMS + (k - 1)EMS]
18
Trial 1 & 2 BMS - EMS BMS +(k-1)EMS + k [(TMS-EMS)/n] ICC = 92.13 – 3.96 92.13 +(2-1)3.96 + 2 [(70.53-3.96)/15] ICC = = 0.84 Pearson r = 0.91
19
Example
Subject   Trial 1   Trial 2
S1        10        12
S2        9         11
S3        12        14
S4        11        13
S5        13        15
S6        8         10
The between-subjects (BMS), between-trials (TMS), and error (EMS) mean squares are computed from this table.
20
What is a Mean Square? The sum of squared deviations divided by the degrees of freedom (df = values free to vary when the sum is set). SSx = sum of squared deviations about the mean, which is a variance estimator.
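A minimal NumPy sketch (an illustration, not the course's SPSS procedure; the helper name icc_2_1 is mine) of how these mean squares feed the ICC formula above. Applied to the six-subject example, Trial 2 = Trial 1 + 2, so the error mean square is 0 and the +2 bias shows up only through TMS. The mean squares on the worked slide (92.13 and so on) come from a different, larger data set.

```python
import numpy as np

def icc_2_1(scores):
    """ICC using the formula from the slides:
    (BMS - EMS) / (BMS + (k-1)*EMS + k*(TMS - EMS)/n)
    scores: 2-D array, one row per subject, one column per trial."""
    scores = np.asarray(scores, dtype=float)
    n, k = scores.shape                      # subjects, trials
    grand = scores.mean()
    subj_means = scores.mean(axis=1)
    trial_means = scores.mean(axis=0)

    ss_subjects = k * np.sum((subj_means - grand) ** 2)
    ss_trials = n * np.sum((trial_means - grand) ** 2)
    ss_total = np.sum((scores - grand) ** 2)
    ss_error = ss_total - ss_subjects - ss_trials

    bms = ss_subjects / (n - 1)              # between-subjects mean square
    tms = ss_trials / (k - 1)                # between-trials mean square
    ems = ss_error / ((n - 1) * (k - 1))     # error (residual) mean square

    return (bms - ems) / (bms + (k - 1) * ems + k * (tms - ems) / n)

# Six-subject example from the slides (Trial 2 = Trial 1 + 2)
data = np.array([[10, 12], [9, 11], [12, 14], [11, 13], [13, 15], [8, 10]])
print(f"ICC = {icc_2_1(data):.2f}")  # ~0.64, versus a Pearson r of 1.00: the bias lowers the ICC
```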
21
Running ICC on SPSS Analyze > Scale > Reliability Analysis. Choose two or more variables. Click Statistics and check ICC at the bottom. Select two-way mixed, consistency. Use the single measures value on the output.
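If SPSS is not available, a similar table can be produced in Python with the pingouin package (an assumption about available tooling, not part of the slides); the data must first be reshaped into long format.

```python
import pandas as pd
import pingouin as pg  # assumed installed: pip install pingouin

# Long format: one row per (subject, trial) observation, using the six-subject example
df = pd.DataFrame({
    "Subject": ["S1", "S2", "S3", "S4", "S5", "S6"] * 2,
    "Trial":   ["T1"] * 6 + ["T2"] * 6,
    "Score":   [10, 9, 12, 11, 13, 8, 12, 11, 14, 13, 15, 10],
})

# Prints a table of ICC variants (single/average measures, consistency/agreement),
# analogous to the SPSS reliability output described on this slide
print(pg.intraclass_corr(data=df, targets="Subject", raters="Trial", ratings="Score"))
```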
22
Pearson vs. ICC
Trial 1   Trial 2
218       231
243       275
205       210
214       244
226       240
220       226
211       229
267       295
228       233
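Running both statistics on this table (reusing the icc_2_1 helper sketched earlier; again an illustration, not the slide's SPSS output) shows the pattern the slide is after: the rank order is very consistent, but Trial 2 is systematically higher, so the ICC comes out noticeably lower than the Pearson r (roughly .71 versus .94).

```python
import numpy as np

trials = np.array([
    [218, 231], [243, 275], [205, 210], [214, 244], [226, 240],
    [220, 226], [211, 229], [267, 295], [228, 233],
])

r = np.corrcoef(trials[:, 0], trials[:, 1])[0, 1]
print(f"Pearson r = {r:.2f}")                # reflects consistency of rank order only
print(f"ICC       = {icc_2_1(trials):.2f}")  # also penalizes the systematic shift
```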
23
Interpretation (positive or negative): < .20 very low; .20-.39 low; .40-.59 moderate; .60-.79 high; .80-1.00 very high. Must also consider the p-value.
24
Correlation Conclusion Statement 1. Always past tense 2. Include interpretation 3. Include 'significant' 4. Include p or alpha value 5. Include direction 6. Include r value 7. Use variable names. Example: There was a high, significant (p < 0.05) positive correlation (r = .78) between X and Y.
25
Pearson vs. ICC
Trial 1   Trial 2
144.74    115.52
181.09    165.43
85.64     78.14
100.93    74.85
168.30    147.26
98.11     88.12
116.19    94.94
114.43    98.30
187.93    154.81
Mean      133.04    113.04
26
Curvilinear Scores curve around the line of best fit. Also called a trend analysis. Requires more complex statistics.
28
Coefficient of Determination Represents the common variance between scores. The square of the r value. The percentage of variance explained, i.e., how much of one variable is accounted for by the other.
29
R² is the proportion of variance that two measures have in common; this overlap determines the relationship (explained variance).
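For example, taking the r = .78 from the conclusion-statement slide above (an illustration of the arithmetic, not a calculation from the slides):

```latex
R^{2} = r^{2} = (.78)^{2} \approx .61 \quad \text{(about 61\% of the variance is shared)}
```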
30
Partial Correlation The degree of relationship between two variables while ruling out the portion of the correlation attributable to other variables.
31
Simple Linear Regression Predict one variable from another. Used if measurement on one variable is difficult or missing. The prediction is not perfect and contains error. Reliability of the prediction is high if the error is low and R is high.
32
Residual is the vertical distance of any point from the line of best fit (the predicted value); the positive and negative distances balance out (are equal in total). (Plot shown with r = .93.)
33
Prediction Y = (bx) + c. Y is the predicted value, b is the slope of the line, x is the raw value of the predictor, and c is the Y intercept (Y when x = zero). Y is plotted on the vertical axis, X on the horizontal.
34
SPSS
35
SPSS Printout Z-scores
36
Prediction
Yp = (bx) + c
HTp = (.85 × 80) + 108.58 = 68 + 108.58 = 176.58
Residual (error) = difference between the predicted and actual values. The subject must come from that population!
HT        WT
185.00    80.00
185.00    87.00
152.50    52.00
155.00    64.10
172.00    66.00
179.00    81.00
160.00    67.72
174.00    76.00
154.00    60.00
165.00    70.00
37
Standard Error of the Estimate (SEE) is the standard deviation of the distribution of residual scores, i.e., the error associated with the predicted value. Read it as an SD or SEM value (68%, 95%, 99%). Multiply the SEE by 2, then add and subtract that from the predicted score to determine the 95% CI of the predicted score. SEE = √( Σ residuals² / number of pairs )
38
Prediction
Yp = (bx) + c
HTp = (.85 × 80) + 108.58 = 176.58
SEE × 2 = 19.5
95% CI = 157.08 to 196.08
HT        WT
185.00    80.00
185.00    87.00
152.50    52.00
155.00    64.10
172.00    66.00
179.00    81.00
160.00    67.72
174.00    76.00
154.00    60.00
165.00    70.00
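A small sketch of the arithmetic on this slide (the helper name and structure are mine; the slope b = .85, intercept c = 108.58, and SEE × 2 = 19.5 are taken from the slides, so the SEE is assumed to be 9.75):

```python
def predict_with_ci(x, b, c, see):
    """Simple-regression prediction with the slide's 95% CI rule: predicted value ± 2*SEE."""
    y_pred = b * x + c
    half_width = 2 * see
    return y_pred, (y_pred - half_width, y_pred + half_width)

# Slope, intercept, and SEE as reported on the slides; x is the subject's weight
ht, ci = predict_with_ci(x=80, b=0.85, c=108.58, see=9.75)
print(f"Predicted HT = {ht:.2f}")              # 176.58
print(f"95% CI = {ci[0]:.2f} to {ci[1]:.2f}")  # 157.08 to 196.08
```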
39
Multiple Regression Uses multiple X variables to predict Y. Results in beta weights for each X variable. Y = (b1x1) + (b2x2) + (b3x3) + … + c
40
SPSS - R Prediction Y = (WT × .40) - (skinfold × 1.04) + 156.45
41
Equation
Y = (WT × .40) - (skinfold × 1.04) + 156.45
Y = (80 × .40) - (11 × 1.04) + 156.45
Y = 32 - 11.44 + 156.45 = 177.01
SEE = 8.67 (× 2 = 17.34)
95% CI = 159.67 to 194.35
HT        WT       Skin
185.00    80.00    11.00
185.00    87.00    12.00
152.50    52.00    23.00
155.00    64.10    25.00
172.00    66.00    10.00
179.00    81.00    11.00
160.00    67.72    20.00
174.00    76.00    13.00
154.00    60.00    22.00
165.00    70.00    9.00
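The same idea with two predictors, plugging in the coefficients and SEE from this slide (the function name is mine; the beta weights are those the slide reports from the SPSS output):

```python
def predict_ht(wt, skinfold):
    """Multiple-regression prediction using the beta weights reported on the slide."""
    return (wt * 0.40) - (skinfold * 1.04) + 156.45

see = 8.67                               # SEE reported on the slide
y = predict_ht(wt=80, skinfold=11)       # first subject in the table
print(f"Predicted HT = {y:.2f}")                       # 177.01
print(f"95% CI = {y - 2*see:.2f} to {y + 2*see:.2f}")  # 159.67 to 194.35
```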
42
Next Class Chapters 8 & 13: t-tests and chi-square.
43
Homework 1. Make a scatterplot with trendline, r, and r² for two ratio variables. 2. Run Pearson r on four different variables and hand-draw a scatterplot for two. 3. Run ICC between VJ1 and VJ2. 4. Run linear regression on standing long jump and predict stair up time. Work out the equation and CI for subject #2. 5. Run multiple regression on subject #2 and add vjump running, circumference, and weight to the predictors. Also work out the equation and CI.