Relationship between two continuous variables: correlation and linear regression

Presentation transcript:

Relationship between two continuous variables: correlation and linear regression. Both variables are continuous. Correlation: larger values of one variable correspond to larger (or smaller) values of the other variable. r measures the strength of the relationship and runs from +1 to –1: zero means no relationship, and ±1 means the points lie exactly on a straight line. p measures statistical significance, i.e. the significance of r differing from zero. Parametric (Pearson) correlation assumes a normal distribution of both variables.
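For illustration, a minimal Python sketch (NumPy and SciPy assumed available, data invented) of how r and its p-value are obtained in practice:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
x = rng.normal(size=50)                       # hypothetical variable 1
y = 0.8 * x + rng.normal(scale=0.5, size=50)  # hypothetical variable 2, positively related to x

# pearsonr gives both r (strength, between -1 and +1) and
# the p-value for testing that r differs from zero
r, p = stats.pearsonr(x, y)
print(f"r = {r:.2f}, p = {p:.3g}")
```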

We start calculating Pearson's r from the covariance, cov(x, y) = Σ(xi – x̄)(yi – ȳ) / (n – 1), which is not convenient because its value depends on the measurement units, so let's rescale it by the two standard deviations, r = cov(x, y) / (sx · sy), and the result is between –1 and +1.
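The same calculation by hand, as a sketch (NumPy assumed, numbers invented): covariance first, then rescaling by the two standard deviations:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])   # hypothetical data
y = np.array([2.1, 2.9, 3.2, 4.8, 5.1])

n = len(x)
cov_xy = np.sum((x - x.mean()) * (y - y.mean())) / (n - 1)   # covariance: units of x times units of y
r = cov_xy / (x.std(ddof=1) * y.std(ddof=1))                 # rescaled: dimensionless, between -1 and +1

print(cov_xy, r)
print(np.corrcoef(x, y)[0, 1])   # NumPy's built-in gives the same r
```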

[Scatterplot examples: r = 0.96, r = 0.53, r = 0.43, r = –1, r = –0.83, r = –0.96]

Non-parametric correlation relies on ranks, so single observations far from the rest do not distort it. Usually Spearman's (rank) correlation is used. Its power is lower, but there are also real differences: what should we think about a non-linear relationship? It also suits ordinal variables. A philosophical aspect: we can describe the same thing in different mathematical terms!
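A small sketch (invented data) of how the two correlations differ on a monotonic but non-linear relationship:

```python
import numpy as np
from scipy import stats

x = np.arange(1, 11, dtype=float)
y = x ** 3                        # monotonic but strongly non-linear

print(stats.pearsonr(x, y)[0])    # clearly below 1: penalised for non-linearity
print(stats.spearmanr(x, y)[0])   # exactly 1: the ranks increase perfectly together
```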

We report the result: "between …. there was a correlation (r= , N=, p= )", or, if non-parametric, "..... (rs= ; N=, p= )". Correlation is symmetrical and dimensionless. To approximate the relationship by a function we use regression. The least-squares method minimises the residuals, the differences between observed and predicted values, and the fitted line is then used for predicting: it gives a predicted value of y for each x. The fitted line has two parameters: the intercept and the slope (b). The slope has a unit, so its value depends on the units of the axes.
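A minimal least-squares sketch with scipy.stats.linregress (the weight/eggs numbers are invented, not taken from the slides):

```python
import numpy as np
from scipy import stats

weight = np.array([1.2, 1.5, 1.9, 2.3, 2.8, 3.1])   # kg (hypothetical)
eggs = np.array([1.0, 2.0, 2.5, 3.5, 4.5, 5.0])     # eggs laid (hypothetical)

fit = stats.linregress(weight, eggs)
print(f"intercept = {fit.intercept:.2f}, slope b = {fit.slope:.2f} eggs per kg")

predicted = fit.intercept + fit.slope * weight   # predicted values from the fitted line
residuals = eggs - predicted                     # least squares minimises sum(residuals**2)
```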

[Example scatterplot: eggs laid vs. weight (kg); fitted line y = 2.04x – 1.2]

[Example scatterplot: wool production (kg) vs. hours basked; fitted line y = –0.195x + 7.1]

The test follows the path of ANOVA: F = MSmodel / MSerror, where SStotal = SSmodel + SSerror and R2 = SSmodel / SStotal, so the model accounts for ... % of the variance. There are two ways to express the strength of the relationship: the slope and R2. p does not measure the strength of the relationship.
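The decomposition written out as a sketch, continuing the hypothetical fit above:

```python
import numpy as np
from scipy import stats

x = np.array([1.2, 1.5, 1.9, 2.3, 2.8, 3.1])   # hypothetical data as above
y = np.array([1.0, 2.0, 2.5, 3.5, 4.5, 5.0])

fit = stats.linregress(x, y)
y_hat = fit.intercept + fit.slope * x

ss_total = np.sum((y - y.mean()) ** 2)    # SStotal
ss_error = np.sum((y - y_hat) ** 2)       # SSerror
ss_model = ss_total - ss_error            # SSmodel

df_model, df_error = 1, len(x) - 2
F = (ss_model / df_model) / (ss_error / df_error)   # F = MSmodel / MSerror
R2 = ss_model / ss_total                            # share of variance accounted for
p = stats.f.sf(F, df_model, df_error)               # p from the F distribution

print(f"R2 = {R2:.2f}, F = {F:.1f}, p = {p:.3g}")
```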

Presenting the results: "weight depended on length (b=..., R2= ....., df=....., F= ..., p<0.001)", or as an equation: length = 3.78*temperature + 47.6. Report also the standard error of the slope. If the intercept is zero, the relationship is proportional: if x changes k times, then y also changes k times. Regression is not symmetrical!
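The standard error of the slope comes directly from the same fit; a reporting sketch (hypothetical numbers, not the slide's example):

```python
import numpy as np
from scipy import stats

x = np.array([1.2, 1.5, 1.9, 2.3, 2.8, 3.1])   # hypothetical data as above
y = np.array([1.0, 2.0, 2.5, 3.5, 4.5, 5.0])
fit = stats.linregress(x, y)

df = len(x) - 2
print(f"b = {fit.slope:.2f} +/- {fit.stderr:.2f} (SE), "
      f"R2 = {fit.rvalue**2:.2f}, df = {df}, p = {fit.pvalue:.3g}")
```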

Assumptions of regression analysis are as follows:
- residuals should be normally distributed;
- the variance of the residuals must be independent of the values of x, otherwise the data are heteroscedastic;
- there is no other systematic dependence on x.
The distribution of the x variable itself is not important. Transformations can be used, but do not forget them when writing out the equation. Regression can also be forced through the origin.
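A diagnostic sketch for these assumptions, plus a regression forced through the origin (SciPy's Shapiro-Wilk test and NumPy's least squares, assumed available):

```python
import numpy as np
from scipy import stats

x = np.array([1.2, 1.5, 1.9, 2.3, 2.8, 3.1])   # hypothetical data as above
y = np.array([1.0, 2.0, 2.5, 3.5, 4.5, 5.0])

fit = stats.linregress(x, y)
residuals = y - (fit.intercept + fit.slope * x)

# Normality of the residuals (Shapiro-Wilk); plotting residuals against x
# is the usual check for heteroscedasticity and remaining dependence on x.
print(stats.shapiro(residuals))

# Regression through the origin: fit y = b*x with no intercept term.
slope_origin, *_ = np.linalg.lstsq(x.reshape(-1, 1), y, rcond=None)
print(slope_origin[0])
```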