Correlation I have two variables that play practically equal roles (traditionally labeled X and Y). I ask whether they are independent and, if they are "correlated", how strongly.

(Pearson) Correlation coefficient If positive deviations from the mean in X are associated with positive deviations in Y, and negative ones with negative ones, then the sum of cross-products is positive. r is a dimensionless number (the covariance standardized by the standard deviations of the two variables); −1 means a deterministic negative dependence, +1 a deterministic positive dependence.
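The slide's formula is not reproduced in the transcript; the standard sample definition (a reconstruction, not copied from the slide) is:

```latex
r \;=\; \frac{\sum_{i=1}^{n}(x_i-\bar{x})(y_i-\bar{y})}
             {\sqrt{\sum_{i=1}^{n}(x_i-\bar{x})^{2}\;\sum_{i=1}^{n}(y_i-\bar{y})^{2}}}
  \;=\; \frac{\mathrm{cov}(X,Y)}{s_X\, s_Y}
```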

We assume a linear relationship, or a two-dimensional (bivariate) normal distribution.

Even here r ≈ 0, although the values are not independent. But note that Y does not have a normal distribution for this X.
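A minimal sketch of this situation (my illustration; I assume the slide's example is something like Y = X² on a symmetric range):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 1000)       # values symmetric around zero
y = x ** 2                         # y is fully determined by x, but not linearly

r, p = stats.pearsonr(x, y)
print(f"Pearson r = {r:.3f}")      # close to 0 despite complete dependence
```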

[Example scatterplots: pairs of panels with r = +0.99 / −0.99, r = +0.83 / −0.83, and r = +0.45 / −0.45, illustrating progressively weaker linear association.]

Test of the null hypothesis H0: ρ = 0. r is an estimate of the population parameter ρ. The test again translates to a t-test, and we can again use either a one- or a two-tailed test. It is even possible to test the null hypothesis that ρ equals some non-zero value, but the procedure is more complicated.
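The usual test statistic (the standard form, presumably what the slide shows) is

```latex
t \;=\; \frac{r\sqrt{n-2}}{\sqrt{1-r^{2}}}, \qquad \nu = n-2 \text{ degrees of freedom.}
```

For a null hypothesis with a non-zero ρ₀ the usual route is Fisher's z-transformation, z = arctanh(r), which is approximately normal with standard error 1/√(n−3).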

There are also tabled critical values of r (for different sample sizes)
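These tables follow directly from the t-test above; solving the t statistic for r (my rearrangement, not from the slides) gives the critical value

```latex
r_{\mathrm{crit}} \;=\; \frac{t_{\alpha,\,n-2}}{\sqrt{t_{\alpha,\,n-2}^{2} + n - 2}}
```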

Comparison with regression The coefficient of determination in regression (R²) equals the square of the correlation coefficient computed from the same two variables. The p-value of the significance test of independence is exactly the same for the regression and for the correlation coefficient.
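A quick numerical check of both claims (a sketch with arbitrary simulated data, not from the slides):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
x = rng.normal(size=50)
y = 2 * x + rng.normal(size=50)            # arbitrary linearly related data

reg = stats.linregress(x, y)               # simple linear regression of y on x
r, p_corr = stats.pearsonr(x, y)

print(np.isclose(reg.rvalue ** 2, r ** 2)) # True: R^2 is the square of r
print(np.isclose(reg.pvalue, p_corr))      # True: identical p-values
```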

Only a manipulative experiment can prove causality (correlation alone does not).

Power of the test The regression is significant exactly when the correlation coefficient is significant. The power of the test increases (for both) with the strength of the relationship and with the number of observations. When I want to estimate how many observations I need, I must have an idea of how tight the relationship is (how high R² or ρ is in the population).

Power of the test: from the table of critical values of r it is possible to look up how many observations I need to have roughly a 50% chance of rejecting H0 at a given significance level (for a known ρ). More precise calculations are possible, but in any case I need to have an idea of what the correlation in the population is.
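One such "more precise calculation" can be sketched with the Fisher z-approximation (my sketch, not the method shown on the slides):

```python
import numpy as np
from scipy import stats

def power_corr(rho, n, alpha=0.05):
    """Approximate power of the two-tailed test of H0: rho = 0,
    based on the Fisher z-transformation (normal approximation)."""
    z_rho = np.arctanh(rho)              # Fisher z of the assumed true correlation
    se = 1.0 / np.sqrt(n - 3)            # approximate standard error of z
    z_crit = stats.norm.ppf(1 - alpha / 2)
    shift = z_rho / se                   # non-centrality of the test statistic
    return stats.norm.sf(z_crit - shift) + stats.norm.cdf(-z_crit - shift)

# e.g. power for an assumed population correlation of 0.4 at several sample sizes
for n in (20, 30, 40, 50, 60):
    print(n, round(power_corr(0.4, n), 2))
```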

Coefficient of rank correlation (Spearman) [there is also Kendall's] I replace each value of both variables with its rank and compute the correlation coefficient from the ranks. For larger samples, even the critical values for the ordinary (Pearson) correlation coefficient hold. We can use the formula r_s = 1 − 6Σd² / (n(n² − 1)), where d is the difference in ranks for each observation.
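A minimal sketch (my illustration) showing that Spearman's coefficient is simply the Pearson coefficient computed from the ranks:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
x = rng.normal(size=30)
y = np.exp(x) + rng.normal(scale=0.1, size=30)   # monotonic but nonlinear relation

rho_direct, _ = stats.spearmanr(x, y)
# the same value "by hand": Pearson correlation of the ranks
rho_from_ranks, _ = stats.pearsonr(stats.rankdata(x), stats.rankdata(y))
print(round(rho_direct, 6), round(rho_from_ranks, 6))   # identical
```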

But the Spearman coefficient will also be 0 in this case. We can say that the Pearson correlation coefficient is a measure of linear dependence, while the Spearman coefficient is a measure of monotonic dependence.

Another possibility is to use a permutation test: I randomly shuffle the values of one variable and count how often the resulting correlation is at least as extreme ("as nice") as the one obtained from our data.
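A minimal sketch of such a permutation test for the correlation coefficient (my illustration of the idea):

```python
import numpy as np
from scipy import stats

def perm_test_corr(x, y, n_perm=9999, seed=0):
    """Two-tailed permutation p-value for the Pearson correlation:
    shuffle one variable, recompute r, and count permutations whose
    |r| is at least as large as the observed one."""
    rng = np.random.default_rng(seed)
    r_obs = stats.pearsonr(x, y)[0]
    hits = sum(abs(stats.pearsonr(x, rng.permutation(y))[0]) >= abs(r_obs)
               for _ in range(n_perm))
    return (hits + 1) / (n_perm + 1)   # the observed ordering counts as one permutation

# usage with arbitrary simulated data
rng = np.random.default_rng(3)
x = rng.normal(size=25)
y = 0.5 * x + rng.normal(size=25)
print(perm_test_corr(x, y))
```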