STATISTICAL TESTS USING SPSS
Dimitrios Tselios / 14-01-2012
Example tests from "Discovering Statistics Using SPSS", Andy Field
Categorical Data (1) - chapter 10
Chi-square statistic (analysing the association between two categorical variables)
- r: number of groups (categories) of the first categorical variable
- c: number of groups (categories) of the second categorical variable
- df = (r - 1)(c - 1); for example, if r = 2 and c = 2 then df = 1
- From the A4 table, for df = 1 the critical value is 3.84 (p = 0.05) and 6.63 (p = 0.01)
Fisher's exact test for small samples
- The chi-square test requires the expected frequency in each cell to be greater than 5; Fisher's exact test overcomes the problem of small samples.
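A minimal sketch of the same two tests in Python with scipy; the 2 x 2 counts below are hypothetical, not taken from the book:

import numpy as np
from scipy.stats import chi2_contingency, fisher_exact

# Hypothetical 2 x 2 contingency table (rows = groups of variable 1, columns = groups of variable 2)
observed = np.array([[30, 10],
                     [50, 110]])

chi2, p, dof, expected = chi2_contingency(observed, correction=False)  # Pearson chi-square
print(f"chi2({dof}) = {chi2:.2f}, p = {p:.4f}")
print(expected)  # check that every expected frequency is greater than 5

# Fisher's exact test for small samples (2 x 2 tables only)
odds_ratio, p_exact = fisher_exact(observed)
print(f"Fisher's exact test: p = {p_exact:.4f}")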
Categorical Data (2)
The likelihood ratio (a better approximation)
- It is preferred when samples are small; for large samples this statistic is roughly the same as Pearson's chi-square.
Yates's correction (reduces the Type I error rate)
- It is used when you have a 2 x 2 contingency table (i.e. two categorical variables with two categories each).
Assumptions of the chi-square test
- Independence of the data
- The expected frequencies should be greater than 5.
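In scipy both statistics come from the same function, via its lambda_ and correction arguments; a brief sketch on the same hypothetical table:

from scipy.stats import chi2_contingency

observed = [[30, 10],
            [50, 110]]

# Likelihood ratio (G) statistic: roughly equal to Pearson's chi-square in large samples
g, p_g, dof, _ = chi2_contingency(observed, lambda_="log-likelihood", correction=False)

# Yates's continuity correction (applied by default to 2 x 2 tables)
chi2_yates, p_yates, _, _ = chi2_contingency(observed, correction=True)

print(f"likelihood ratio G({dof}) = {g:.2f}, p = {p_g:.4f}")
print(f"Yates-corrected chi2 = {chi2_yates:.2f}, p = {p_yates:.4f}")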
Categorical Data (3)
Doing chi-square on SPSS (example from Andy Field's book)
- Weight cases: Data -> Weight Cases… -> Weight cases by -> select the Frequency Variable
- Running the analysis: Analyze -> Descriptive Statistics -> Crosstabs -> select the Row and Column variables -> click Statistics (and tick Chi-square)
- Output for the chi-square test: the crosstabulation and the Chi-Square Tests table
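The weight-cases step has a direct equivalent in Python: build the contingency table from a frequency column with pandas. A sketch with made-up variable names (Training, Dance, Frequency) and counts:

import pandas as pd
from scipy.stats import chi2_contingency

# Hypothetical aggregated data: one row per cell of the table, plus its observed frequency
df = pd.DataFrame({
    "Training":  ["Food", "Food", "Affection", "Affection"],
    "Dance":     ["Yes", "No", "Yes", "No"],
    "Frequency": [30, 10, 50, 110],
})

# Weighting cases by Frequency = summing Frequency within each Training x Dance cell
table = pd.crosstab(df["Training"], df["Dance"],
                    values=df["Frequency"], aggfunc="sum")
print(table)                     # the crosstabulation
print(chi2_contingency(table))   # the chi-square test on that table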
Categorical Data (4)
Reporting the results of the chi-square test
- Example: there was a significant association between the type of training and whether or not the cats danced.
Other statistics for categorical data
- Chi-square as regression
- Loglinear analysis
Correlation (1) - chapter 6
Covariance (not a standardized measure)
Standardization and the correlation coefficient
- Standardizing the covariance gives the correlation coefficient (Pearson's correlation coefficient)
Caution
- The third-variable problem
- Direction of causality
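A short numerical sketch of the link between covariance and Pearson's r, using numpy on made-up paired scores:

import numpy as np

# Hypothetical paired scores
x = np.array([5, 4, 4, 6, 8, 9, 7, 10])
y = np.array([8, 9, 10, 13, 15, 14, 13, 18])

cov_xy = np.cov(x, y, ddof=1)[0, 1]           # sample covariance (not standardized)
r = cov_xy / (x.std(ddof=1) * y.std(ddof=1))  # standardized covariance = Pearson's r

print(f"covariance = {cov_xy:.2f}")
print(f"Pearson r  = {r:.3f}")
print(np.corrcoef(x, y)[0, 1])                # the same value straight from numpy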
Correlation (2)
Bivariate correlation: Analyze -> Correlate -> Bivariate… (example file: Exam anxiety.sav)
Assumptions of Pearson's r
- Data are interval
- Normally distributed
Using R2 for interpretation
Spearman's correlation coefficient (a non-parametric statistic)
- Analyze -> Correlate -> Bivariate… -> check Spearman's rho
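A sketch of both coefficients in Python with scipy; the anxiety and exam-mark arrays below are invented, not the values in Exam anxiety.sav:

from scipy.stats import pearsonr, spearmanr

# Hypothetical exam anxiety scores and exam marks
anxiety = [82, 77, 65, 88, 70, 61, 90, 55]
exam    = [40, 55, 70, 35, 60, 75, 30, 80]

r, p = pearsonr(anxiety, exam)
rho, p_s = spearmanr(anxiety, exam)

print(f"Pearson r = {r:.2f}, p = {p:.3f}, R^2 = {r**2:.2f}")  # R^2 for interpretation
print(f"Spearman rho = {rho:.2f}, p = {p_s:.3f}")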
Correlation (3)
Kendall's tau (non-parametric; used when many scores share the same rank)
How to report a correlation coefficient
- Example: there was a significant relationship between the number of adverts watched and the number of packets of sweets purchased [r = .87, p (one-tailed) < .05].
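Kendall's tau on data with tied ranks, sketched with scipy on made-up counts:

from scipy.stats import kendalltau

# Hypothetical data with many tied scores (adverts watched vs packets of sweets bought)
adverts = [1, 2, 2, 2, 3, 3, 4, 4, 5, 5]
sweets  = [2, 3, 3, 4, 4, 5, 5, 6, 7, 7]

tau, p = kendalltau(adverts, sweets)
print(f"Kendall tau = {tau:.2f}, p = {p:.4f}")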
Regression Analysis (1) - chapter 7
An introduction to regression (prediction)
- Predicting an outcome variable from one predictor variable (simple regression)
- Predicting an outcome variable from several predictor variables (multiple regression)
- outcome_i = (model) + error_i
Doing simple regression on SPSS (example file: Record 1.sav)
- Analyze -> Regression -> Linear
- Select the Dependent variable and the Independent variable
- Click OK
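A minimal simple-regression sketch in Python with statsmodels; the advertising-budget and sales arrays are invented, not the contents of Record 1.sav:

import numpy as np
import statsmodels.api as sm

# Hypothetical advertising budgets and record sales
budget = np.array([10, 40, 120, 300, 500, 750, 900, 1200])
sales  = np.array([140, 160, 150, 180, 190, 210, 220, 250])

X = sm.add_constant(budget)        # adds the intercept b0 to the model
model = sm.OLS(sales, X).fit()     # sales_i = b0 + b1 * budget_i + error_i

print(model.summary())             # R^2, ANOVA F-ratio, coefficients and their t-tests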
Regression Analysis (2)
Interpreting a simple regression
- Model Summary: R is the simple correlation; R2 is the proportion of the variation in the dependent variable that the independent variable can account for.
- Analysis of variance (ANOVA): the F-ratio and its Sig. value.
- Interpretation: if F is significant, the regression model overall predicts the dependent variable significantly well.
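Continuing the hypothetical statsmodels sketch above, the same quantities can be read off the fitted model object individually:

import numpy as np

# R (simple correlation), R^2, and the overall ANOVA F-test of the model
r_squared = model.rsquared                   # proportion of variance accounted for
r = np.sqrt(r_squared)                       # simple correlation when there is one predictor
f_ratio, sig = model.fvalue, model.f_pvalue  # F-ratio and its Sig. value
print(f"R = {r:.3f}, R^2 = {r_squared:.3f}, F = {f_ratio:.2f}, p = {sig:.4f}")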
Regression Analysis (3)
Model parameters
- Because the t-test for each coefficient has p < 0.001, we conclude that b0 and b1 are significantly different from 0, and consequently that advertising budget makes a significant contribution (p < 0.001) to predicting record sales.
- Prediction model: record sales_i = b0 + b1 * advertising budget_i = 134.14 + (0.096 * advertising budget_i)
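A worked prediction from the fitted equation; the choice of budget value below, and the assumption that budget and sales are expressed in the same (thousands) units as in the example, are mine:

# Predicted record sales for an advertising budget of 100 (in the model's units)
b0, b1 = 134.14, 0.096
budget = 100
predicted_sales = b0 + b1 * budget   # 134.14 + 9.6 = 143.74
print(predicted_sales)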
Regression Analysis (4)
How to do multiple regression using SPSS (methods: hierarchical, forced entry, stepwise)
- Analyze -> Regression -> Linear
- Select the Dependent variable and the Independent variables
- Select the Method (default: Enter, i.e. forced entry; hierarchical regression is done by entering predictors in blocks)
- Click Statistics (several options, e.g. diagnostics for outliers)
Interpreting multiple regression: [F = 67.733, df = 2, p < 0.001]
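A multiple-regression sketch with forced entry of two predictors in statsmodels; the outcome, the predictors and their values are hypothetical, not the book's dataset:

import numpy as np
import statsmodels.api as sm

# Hypothetical outcome and two predictors
sales   = np.array([140, 160, 150, 180, 190, 210, 220, 250])
budget  = np.array([10, 40, 120, 300, 500, 750, 900, 1200])
airplay = np.array([5, 10, 8, 15, 20, 25, 28, 35])

X = sm.add_constant(np.column_stack([budget, airplay]))  # enter both predictors at once
model = sm.OLS(sales, X).fit()

print(f"F = {model.fvalue:.3f}, df = {int(model.df_model)}, p = {model.f_pvalue:.4f}")
print(model.params)    # b0, b1, b2
print(model.pvalues)   # t-test p-value for each coefficient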
Regression Analysis (5)
Because the constant b0 has [t = 0.191, p > 0.05], we conclude that b0 is not significantly different from 0; a coefficient with p > 0.05 does not make a significant contribution to the prediction.
One-way ANOVA on SPSS
Running one-way ANOVA on SPSS (example file: Viagra.sav)
- Analyze -> Compare Means -> One-Way ANOVA…
Reporting results from a one-way independent ANOVA
- There was a significant effect of Viagra on levels of libido, [F(2, 12) = 5.12, p < 0.05, ω = .60].
- There was a significant linear trend, [F(1, 12) = 9.97, p < 0.01, ω = .62], indicating that as the dose of Viagra increased, libido increased proportionately.
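A one-way ANOVA sketch with scipy; the three dose groups below are hypothetical libido scores, not the Viagra.sav data:

from scipy.stats import f_oneway

# Hypothetical libido scores for three dose groups (placebo, low dose, high dose)
placebo   = [3, 2, 1, 1, 4]
low_dose  = [5, 2, 4, 2, 3]
high_dose = [7, 4, 5, 3, 6]

f_stat, p = f_oneway(placebo, low_dose, high_dose)
print(f"F(2, 12) = {f_stat:.2f}, p = {p:.4f}")  # df = (k-1, N-k) = (2, 12) for 3 groups of 5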
Cronbach's α test (1)
- Checks the reliability of a questionnaire; more precisely, it assesses the internal consistency of a scale, i.e. a group of questions (subscale).
- An example: Analyze -> Scale -> Reliability Analysis… -> select the questions of the group.
Cronbach's α test (2)
- Reverse-phrased items must be reverse-scored before the analysis.
- Scale if item deleted: shows the reliability statistics with each item removed in turn.
- α > 0.8 indicates good reliability.
- How to report: the fear of computers, fear of statistics and fear of maths subscales of the SAQ all had high reliabilities, all Cronbach's α = .82. However, the fear of negative peer evaluation subscale had relatively low reliability, Cronbach's α = .57.
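Cronbach's α has a simple closed form, α = k/(k-1) * (1 - Σ item variances / variance of the total score). A small numpy sketch with made-up item scores:

import numpy as np

def cronbach_alpha(items):
    # items: 2-D array, one row per respondent, one column per question
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)      # variance of each question
    total_var = items.sum(axis=1).var(ddof=1)  # variance of the summed scale score
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical 5-point responses to a 4-item subscale (reverse-phrased items already recoded)
scores = [[4, 5, 4, 5],
          [3, 4, 3, 4],
          [2, 2, 3, 2],
          [5, 5, 4, 5],
          [1, 2, 2, 1]]

print(round(cronbach_alpha(scores), 2))  # values above 0.8 indicate good reliability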