Correlations and T-tests


Correlations and T-tests Matching level of measurement to statistical procedures

We can match statistical methods to the level of measurement of the two variables that we want to assess:
Nominal – Chi-square
Ordinal – Chi-square
Interval or Ratio – T-test, ANOVA, Correlation, Regression

However, we should only use these tests when: the interval or ratio level variable has a normal distribution; the dependent variable (for Correlation, T-test, ANOVA, and Regression) is interval or ratio; and our sample has been randomly selected or we have data for the entire population.

Interpreting a Correlation from an SPSS Printout

A correlation is an association between two interval or ratio variables. It can be positive or negative. It measures the strength of the association between the two variables and whether that association is large enough to be statistically significant. The correlation coefficient can range from -1.00 to +1.00.

Example: Types of Relationships – Positive, Negative, No Relationship. (Illustrative table pairing Income ($20,000 to $75,000) with Education (10 to 18 years).)

The stronger the correlation, the closer it will be to 1.00 or -1.00. Weak correlations will be close to 0.00 (either positive or negative).

You can see the degree of correlation (association) by using a scatterplot graph

Looking at a scatterplot of beginning salary and current salary from the same data set, we can see a stronger correlation
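
As a minimal sketch outside SPSS, a scatterplot like the one described above can be drawn in Python with matplotlib. The salary values and variable names below are invented for illustration, not the data set from the slides.

```python
# A hedged sketch (not the SPSS chart): drawing a scatterplot to eyeball the
# association between two interval/ratio variables. Values are made up.
import matplotlib.pyplot as plt

beginning_salary = [18000, 21000, 24000, 27000, 30000, 36000, 42000, 50000]
current_salary   = [26000, 30000, 33000, 40000, 44000, 52000, 61000, 75000]

plt.scatter(beginning_salary, current_salary)
plt.xlabel("Beginning salary ($)")
plt.ylabel("Current salary ($)")
plt.title("The tighter the points cluster around a line, the stronger the correlation")
plt.show()
```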

If we run the correlation between these two variables in SPSS, we find

For these two variables, suppose we test a hypothesis at a confidence level of .01.
Alternative hypothesis: There is a positive association between beginning and current salary.
Null hypothesis: There is no association between beginning and current salary.
Decision: r (correlation) = .88 at p = .000. Since .000 is less than .01, we reject the null hypothesis and accept the alternative hypothesis!
(Bonus question): Why would we expect this correlation to be statistically significant below the p = .01 level?
Answer: This is a large data set (N = 474), which makes it likely that if there is a correlation, it will be statistically significant at a low significance (p) level. Larger data sets are less likely to be affected by sampling or random error!
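
The same decision rule can be sketched in Python with scipy rather than SPSS. The salary values below are the invented ones from the scatterplot sketch above, so r and p will not match the .88 and .000 reported for the real data set.

```python
# Hedged sketch of the decision rule: compute r and p, then compare p to alpha.
from scipy.stats import pearsonr

beginning_salary = [18000, 21000, 24000, 27000, 30000, 36000, 42000, 50000]
current_salary   = [26000, 30000, 33000, 40000, 44000, 52000, 61000, 75000]

r, p = pearsonr(beginning_salary, current_salary)
alpha = 0.01  # the confidence (significance) level chosen for the study

print(f"r = {r:.2f}, p = {p:.4f}")
if p < alpha:
    print("Reject the null hypothesis: the association is statistically significant.")
else:
    print("Fail to reject the null hypothesis.")
```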

Other important information on correlation: Correlation does not tell us whether one variable “causes” the other – so there really isn’t an independent or dependent variable. With correlation, you should be able to draw a straight line between the lowest and highest points in the distribution; points that fall off this “best fit” line indicate that the correlation is less than perfect (-1/+1). Regression is the statistical method that allows us to determine whether the value of one interval/ratio variable can be used to predict or determine the value of another.
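
To make the regression idea concrete, here is a hedged, non-SPSS sketch that fits a best-fit line and uses it to predict one interval/ratio variable from another. The data and variable names are hypothetical.

```python
# Minimal sketch of simple regression: predicting income from years of education.
from scipy.stats import linregress

years_of_education = [10, 12, 12, 14, 16, 16, 18, 20]
income             = [22000, 26000, 28000, 35000, 42000, 45000, 55000, 64000]

result = linregress(years_of_education, income)
# result.slope and result.intercept define the best-fit line;
# result.rvalue is the correlation between the two variables.
predicted_income = result.intercept + result.slope * 15  # predicted income at 15 years
print(f"slope = {result.slope:.1f}, r = {result.rvalue:.2f}, "
      f"predicted income at 15 yrs = {predicted_income:.0f}")
```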

Another measure of association is the t-test. T-tests measure the association between a nominal level variable and an interval or ratio level variable. A t-test looks at whether the nominal level variable causes a change in the interval/ratio variable. Therefore the nominal level variable is always the independent variable and the interval/ratio variable is always the dependent variable.

Example of t-test – Self-Esteem Scores for two groups, Men and Women. (Table of individual scores: 32, 34, 44, 18, 56, 52, 16, 21, 33, 39, 26, 25, 35, 28, 20; group means: Men = 32.875, Women = 29.25.)

Important things to know about an independent samples t-test It can only be used when the nominal variable has only two categories. Most often the nominal variable pertains to membership in a specific demographic group or a sample. The association examined by the independent samples t-test is whether the mean of the interval/ratio variable differs significantly between the two groups. If it does, that means that group membership “causes” the change or difference in the mean score.

Looking at the difference in means between the two groups, can we tell if the difference is large enough to be statistically significant?

T-test results

Positive and Negative t-tests Your t-test will be positive when the lowest value category ((1,2) or (0,1)) is entered into the grouping menu first and the mean of that first group is higher than the mean of the second group. Your t-test will be negative when the lowest value category is entered into the grouping menu first and the mean of the second group is higher than the mean of the first group.
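
As a rough Python illustration of the SPSS behavior described above, the sign of t depends on which group is entered first, because the statistic is based on (mean of first group − mean of second group). The scores are invented.

```python
# Swapping the order of the groups flips the sign of t but not its magnitude or p value.
from scipy.stats import ttest_ind

group_1 = [30, 35, 28, 40, 33]  # e.g., the category coded with the lowest value
group_2 = [25, 29, 31, 27, 26]

t_forward, p = ttest_ind(group_1, group_2)
t_reversed, _ = ttest_ind(group_2, group_1)
print(t_forward, t_reversed, p)  # same magnitude, opposite signs, same p
```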

Paired Samples T-Test Used when respondents have taken both a pre-test and a post-test using the same measurement tool (usually a standardized test). It supplements the result obtained when the mean post-test score for all respondents is subtracted from the mean pre-test score. If there is a change in scores from the pre-test to the post-test, it usually means that the intervention is effective. A statistically significant paired samples t-test usually means that the change from pre-test to post-test is large enough that it cannot simply be due to random or sampling error. An important caveat is that the change in pre- and post-test scores must be in the direction (positive or negative) specified in the hypothesis.

Paired samples t-test (continued) For example, if our hypothesis states that participation in the welfare reform experiment is associated with a positive change in welfare recipients' wages from work, but participation in the experiment actually decreased wages, then our hypothesis would not be confirmed. We would accept the null hypothesis and reject the alternative hypothesis. Pre-test wages: mean = $400 per month for each participant. Post-test wages: mean = $350 per month for each participant. However, we need to know the t-test value to know whether the difference in means is large enough to be statistically significant. What are the alternative and null hypotheses for this study?

Let’s test a hypothesis for an independent t-test We want to know if women have higher scores on a test of exam-related anxiety than men. The researcher has set the confidence level for this study at p = .05. On the SPSS printout, t = 2.6, p = .03. What are the alternative and null hypotheses? Can we accept or reject the null hypothesis?

Answer Alternative hypothesis: Women have higher levels of exam-related anxiety than men as measured by a standardized test. Null hypothesis: There will be no difference between men and women on the standardized test of exam-related anxiety. Reject the null hypothesis (p = .03 is less than the confidence level of .05) and accept the alternative hypothesis. There is a relationship.

Computing a Correlation
Select Analyze
Select Correlate
Select two or more variables and click add
Click OK
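
For comparison, a rough non-SPSS equivalent of these menu steps can be written with pandas. The column names below are hypothetical placeholders for variables in your own data set.

```python
# Hedged sketch: correlation matrix for two or more interval/ratio variables.
import pandas as pd

df = pd.DataFrame({
    "beginning_salary": [18000, 21000, 24000, 27000, 30000],
    "current_salary":   [26000, 30000, 33000, 40000, 44000],
})

print(df[["beginning_salary", "current_salary"]].corr())
```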

Computing an independent t-test
Select Analyze
Select Compare Means
Select Independent-Samples T-test
Select the Test Variable (the dependent variable – must be interval/ratio)
Select the Grouping Variable (must be nominal – only two categories)
Define the numerical category for each group (usually group 1 = 1, group 2 = 2)
Click OK
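
A hedged, non-SPSS sketch of the same test in Python follows, echoing the exam-anxiety example; the scores are invented and would come from your data set in practice.

```python
# Independent samples t-test with scipy: compare the mean score of two groups.
from scipy.stats import ttest_ind

women_scores = [72, 68, 75, 80, 66, 74, 71, 78]
men_scores   = [65, 70, 62, 60, 68, 64, 66, 61]

t, p = ttest_ind(women_scores, men_scores)
alpha = 0.05
print(f"t = {t:.2f}, p = {p:.3f}")
if p < alpha:
    print("Reject the null hypothesis: the group means differ significantly.")
```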

Computing a paired samples t-test
Select Analyze
Select Compare Means
Select Paired-Samples T-test
Highlight two interval/ratio variables – these should be the pre-test and post-test scores
Click on the arrow
Click OK
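
A hedged, non-SPSS sketch of a paired samples t-test in Python is shown below. The monthly wage values are invented to echo the welfare reform example; each position in the two lists is the same participant measured before and after the intervention.

```python
# Paired samples t-test with scipy: compare pre-test and post-test scores.
from scipy.stats import ttest_rel

pre_test_wages  = [400, 420, 380, 410, 390, 400]
post_test_wages = [350, 360, 340, 370, 345, 355]

t, p = ttest_rel(pre_test_wages, post_test_wages)
print(f"t = {t:.2f}, p = {p:.3f}")
# A significant result suggests the pre/post change is unlikely to be random error,
# but also check that the direction of change matches the hypothesis.
```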

Data from Paired Sample T-test

More data from paired samples t-test

Analysis of Variance (ANOVA) is used when you want to compare means for three or more groups. It requires a normal distribution and a random sample (or data for the entire population). It can be used to determine causation. It has an independent variable that is nominal and a dependent variable that is interval/ratio.
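
As a final hedged, non-SPSS sketch, a one-way ANOVA in Python compares the mean of an interval/ratio variable across three (or more) categories of a nominal variable. The group data below are invented.

```python
# One-way ANOVA with scipy across three hypothetical groups.
from scipy.stats import f_oneway

group_a = [23, 25, 28, 22, 26]
group_b = [30, 33, 29, 35, 31]
group_c = [27, 24, 29, 26, 28]

f_stat, p = f_oneway(group_a, group_b, group_c)
print(f"F = {f_stat:.2f}, p = {p:.3f}")
# A significant F indicates that at least one group mean differs from the others.
```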