Regression analysis: linear regression and logistic regression

Relationship and association

Straight line

Best straight line?

Best straight line! Least-squares estimation
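The least-squares idea can be sketched in a few lines of Python; the hip and BMI values below are made up purely for illustration and are not the lecture's data.

```python
import numpy as np

# Hypothetical example data (not the lecture's dataset)
hip = np.array([90.0, 95.0, 100.0, 105.0, 110.0, 120.0])   # hip circumference
bmi = np.array([20.1, 21.5, 23.0, 24.2, 25.9, 28.4])       # body mass index, kg/m^2

# Least-squares estimates of slope (b1) and intercept (b0):
# they minimise the sum of squared vertical distances from the points to the line.
b1, b0 = np.polyfit(hip, bmi, deg=1)
print(f"BMI = {b0:.2f} + {b1:.3f} * Hip   (least-squares fit)")

# The fitted line gives predictions; residuals are the vertical distances.
predicted = b0 + b1 * hip
residuals = bmi - predicted
print("Sum of squared residuals:", np.sum(residuals**2))
```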

Simple linear regression
1. Is the association linear?

Simple linear regression
1. Is the association linear?
2. Describe the association: what are b0 and b1?
BMI = -12.6 kg/m² + 0.35 kg/m³ · Hip

Simple linear regression
1. Is the association linear?
2. Describe the association.
3. Is the slope significantly different from 0? Help, SPSS!

Coefficients (dependent variable: BMI)
                 B        Std. Error   Beta     t        Sig.
(Constant)    -12.581      2.331                -5.396   .000
Hip              .345       .023        .565    15.266   .000
(B and Std. Error are the unstandardized coefficients; Beta is the standardized coefficient.)
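For comparison, the same kind of coefficient table can be produced outside SPSS; a minimal sketch with statsmodels, again using the made-up hip and BMI values from the sketch above.

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical data, same made-up values as above (not the lecture's dataset)
hip = np.array([90.0, 95.0, 100.0, 105.0, 110.0, 120.0])
bmi = np.array([20.1, 21.5, 23.0, 24.2, 25.9, 28.4])

X = sm.add_constant(hip)          # adds the intercept column
model = sm.OLS(bmi, X).fit()      # ordinary least squares

# summary() reports B, std. error, t and the p-value for the constant and the slope,
# i.e. the same quantities as the SPSS "Coefficients" table.
print(model.summary())
print("Slope p-value:", model.pvalues[1])
```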

Simple linear regression
1. Is the association linear?
2. Describe the association.
3. Is the slope significantly different from 0?
4. How good is the fit? How far are the data points from the line, on average?

The correlation coefficient, r
[Figure: example scatter plots with r = 0, r = 1, r = 0.7 and r = -0.5]

r² – Goodness of fit: how much of the variation can be explained by the model?
[Figure: example scatter plots with R² = 0, R² = 1, R² = 0.5 and R² = 0.2]
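Both r and R² can be computed directly from the data and the fitted line; a small sketch with the same made-up values as before.

```python
import numpy as np

hip = np.array([90.0, 95.0, 100.0, 105.0, 110.0, 120.0])
bmi = np.array([20.1, 21.5, 23.0, 24.2, 25.9, 28.4])

# Pearson correlation coefficient r: strength and direction of the linear association
r = np.corrcoef(hip, bmi)[0, 1]

# R^2: share of the variation in BMI explained by the fitted line
b1, b0 = np.polyfit(hip, bmi, deg=1)
predicted = b0 + b1 * hip
ss_res = np.sum((bmi - predicted) ** 2)      # unexplained variation
ss_tot = np.sum((bmi - bmi.mean()) ** 2)     # total variation
r_squared = 1 - ss_res / ss_tot

print(f"r = {r:.3f}, R^2 = {r_squared:.3f}")  # for simple regression, R^2 = r^2
```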

Multiple linear regression
Could waist measure describe some of the variation in BMI?
BMI = 1.3 kg/m² + … kg/m³ · Waist
Or even better:

Multiple linear regression
Adding age: adj. R² = …  Adding thigh: adj. R² = 0.352?

Coefficients (dependent variable: BMI)
                 B       Std. Error   Beta     t        Sig.    95% CI for B
(Constant)    -9.001      2.449                -3.676   .000    -13.813 to -4.190
Waist           .168       .043       .201      3.923   .000       .084 to   .252
Hip             .252       .031       .411      8.012   .000       .190 to   .313
Age            -.064       .018      -.126     -3.492   .001      -.101 to  -.028

Coefficients (dependent variable: BMI)
                 B       Std. Error   Beta     t        Sig.    95% CI for B
(Constant)     3.581      1.784                 2.007   .045       .075 to  7.086
Waist           .168       .043       .201      3.923   .000       .084 to   .252
Age            -.064       .018      -.126     -3.492   .001      -.101 to  -.028
Thigh           .252       .031       .411      8.012   .000       .190 to   .313
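A multiple regression like the ones above could also be fitted outside SPSS; a minimal sketch with statsmodels, assuming a hypothetical file bmi_data.csv with columns BMI, Waist, Hip and Age (the file name and column names are assumptions, not the lecture's data).

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical dataset with columns BMI, Waist, Hip and Age
df = pd.read_csv("bmi_data.csv")

# Multiple linear regression: BMI explained by several predictors at once
model = smf.ols("BMI ~ Waist + Hip + Age", data=df).fit()

print(model.params)        # B for each predictor, as in the SPSS table
print(model.conf_int())    # 95% confidence intervals for B
print(model.rsquared_adj)  # adjusted R^2: penalises adding useless predictors
```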

Assumptions
1. The dependent variable must be metric (continuous).
2. The independent variables must be continuous or ordinal.
3. There is a linear relationship between the dependent variable and each independent variable.
4. The residuals have constant spread.
5. The residuals are normally distributed.
6. The independent variables are not perfectly correlated with each other.
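Assumptions 4 and 5 are usually checked with residual plots; a small sketch, reusing the hypothetical bmi_data.csv from the previous sketch.

```python
import pandas as pd
import matplotlib.pyplot as plt
import statsmodels.formula.api as smf
from scipy import stats

# Hypothetical dataset and model, as in the previous sketch
df = pd.read_csv("bmi_data.csv")
model = smf.ols("BMI ~ Waist + Hip + Age", data=df).fit()

residuals = model.resid
fitted = model.fittedvalues

# Assumption 4: residuals should scatter with roughly constant spread around zero
plt.scatter(fitted, residuals)
plt.axhline(0, color="grey")
plt.xlabel("Fitted BMI")
plt.ylabel("Residual")
plt.show()

# Assumption 5: residuals should be approximately normally distributed (Q-Q plot)
stats.probplot(residuals, plot=plt)
plt.show()
```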

Multiple linear regression in SPSS

Click ‘Statistics’ and ‘Plots’

Logistic regression

Logistic regression
What if the dependent variable is categorical, and especially binary?
Could we use some interpolation method? Linear regression cannot help us.

The sigmoidal curve

The sigmoidal curve
The intercept basically just shifts the curve along the input variable.

The sigmoidal curve
The intercept basically just shifts the curve along the input variable.
A large regression coefficient → the risk factor strongly influences the probability.

The sigmoidal curve
The intercept basically just shifts the curve along the input variable.
A large regression coefficient → the risk factor strongly influences the probability.
A positive regression coefficient → the risk factor increases the probability.
Logistic regression uses maximum likelihood estimation, not least-squares estimation.
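To see how the intercept and the regression coefficient change the curve, here is a small sketch that plots the sigmoid p = 1/(1 + e^-(b0 + b1·x)) for a few made-up values of b0 and b1 (illustration only).

```python
import numpy as np
import matplotlib.pyplot as plt

def sigmoid(z):
    # Maps any z to a probability between 0 and 1
    return 1.0 / (1.0 + np.exp(-z))

x = np.linspace(-10, 10, 200)

# Made-up intercepts and coefficients to illustrate the points on the slide:
# the intercept shifts the curve along x, a large coefficient makes it steeper,
# a positive coefficient makes the probability increase with the risk factor.
for b0, b1 in [(0.0, 0.5), (0.0, 2.0), (2.0, 0.5), (0.0, -0.5)]:
    plt.plot(x, sigmoid(b0 + b1 * x), label=f"b0={b0}, b1={b1}")

plt.xlabel("Risk factor x")
plt.ylabel("Probability")
plt.legend()
plt.show()
```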

Does age influence the diagnosis? (continuous independent variable)

Variables in the Equation (variable(s) entered on step 1: Age)
              B       S.E.    Wald      df   Sig.   Exp(B)   95% C.I. for Exp(B)
Age           .109    .010    108.745   1    .000   1.115    1.092 to 1.138
Constant    -4.213    .423     99.097   1    .000    .015
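Outside SPSS, the same kind of model could be fitted by maximum likelihood with statsmodels; a minimal sketch, assuming a hypothetical file tumor_data.csv with a 0/1 outcome column Malignant and a predictor column Age (names and file are assumptions, not the lecture's data).

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical dataset: a 0/1 column 'Malignant' and a continuous column 'Age'
tumor = pd.read_csv("tumor_data.csv")

# Logistic regression fitted by maximum likelihood
model = smf.logit("Malignant ~ Age", data=tumor).fit()

print(model.params)               # B, as in the SPSS "Variables in the Equation" table
print(np.exp(model.params))       # Exp(B): the odds ratio per one-year increase in age
print(np.exp(model.conf_int()))   # 95% C.I. for Exp(B)
```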

Simple logistic regression in SPSS

Does previous intake of OCP influence the diagnosis? (categorical independent variable)

Variables in the Equation (variable(s) entered on step 1: OCP)
              B       S.E.    Wald    df   Sig.   Exp(B)   95% C.I. for Exp(B)
OCP(1)       -.311    .180    2.979   1    .084    .733    .515 to 1.043
Constant      .233    .123    3.583   1    .058   1.263

Odds ratio
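The odds ratio SPSS reports is simply Exp(B), i.e. e raised to the regression coefficient. A worked example using the OCP coefficient from the table above:

\[ \mathrm{OR} = e^{B} = e^{-0.311} \approx 0.73 \]

So previous OCP use is associated with roughly 27% lower odds of a malignant diagnosis, although the p-value of .084 in the table means the association is not significant at the 5% level.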

Simple logistic regression with a categorical predictor in SPSS

Multiple logistic regression

Variables in the Equation (variable(s) entered on step 1: Age, BMI, OCP)
              B       S.E.    Wald      df   Sig.   Exp(B)   95% C.I. for Exp(B)
Age           .123    .011    115.343   1    .000   1.131    1.106 to 1.157
BMI           .083    .019     18.732   1    .000   1.087    1.046 to 1.128
OCP           .528    .219      5.808   1    .016   1.695    1.104 to 2.603
Constant    -6.974    .762     83.777   1    .000    .001

Predicting the diagnosis by logistic regression
What is the probability that the tumor of a 50-year-old woman who has been using OCP and has a BMI of 26 is malignant? Using the coefficients from the table above:
z = -6.974 + 0.123·50 + 0.083·26 + 0.528·1 = 1.86
p = 1/(1 + e^(-1.86)) = 0.87
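The same calculation can be scripted; a small sketch using the coefficients from the fitted model above.

```python
import numpy as np

# Coefficients taken from the "Variables in the Equation" table above
b_const, b_age, b_bmi, b_ocp = -6.974, 0.123, 0.083, 0.528

# A 50-year-old woman with BMI 26 who has used OCP (OCP = 1)
z = b_const + b_age * 50 + b_bmi * 26 + b_ocp * 1
p = 1.0 / (1.0 + np.exp(-z))      # logistic (sigmoid) transformation

print(f"z = {z:.2f}, predicted probability of malignancy = {p:.2f}")
# prints roughly z = 1.86, p = 0.87
```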

Multiple logistic regression in SPSS

Exercises: Show that odds = e^z
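A sketch of the derivation the exercise asks for, starting from the logistic model's probability:

\[
p = \frac{1}{1+e^{-z}}
\quad\Rightarrow\quad
1 - p = \frac{e^{-z}}{1+e^{-z}}
\quad\Rightarrow\quad
\text{odds} = \frac{p}{1-p} = \frac{1}{e^{-z}} = e^{z}
\]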