Applied Quantitative Analysis and Practices LECTURE#28 By Dr. Osman Sadiq Paracha

Previous Lecture Summary
- Further SPSS application of simple linear regression
- Multiple regression: introduction
- The multiple regression equation
- Sample size considerations

What is Multiple Regression?
Linear regression is a model for predicting the value of one variable from another. Multiple regression is a natural extension of this model: we use it to predict values of an outcome from several predictors. It is a hypothetical model of the relationship between several variables.

Regression: An Example
A record company boss was interested in predicting album sales from advertising.
Data: 200 different album releases.
- Outcome variable: sales (CDs and downloads) in the week after release.
- Predictor variables: the amount spent promoting the album before release, and the number of plays on the radio.

Multiple Regression as an Equation
With multiple regression, the relationship is described using a variation of the equation of a straight line.
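The equation itself appears to have been an image on the original slide and is not in the transcript; in the notation of the following slides (b0 for the intercept, b1 to bn for the regression coefficients), the standard form is:

Y_i = b_0 + b_1 X_{1i} + b_2 X_{2i} + \cdots + b_n X_{ni} + \varepsilon_i

where ε_i is the residual for case i.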

b0
b0 is the intercept. The intercept is the value of the Y variable when all Xs = 0. This is the point at which the regression plane crosses the Y-axis (vertical).

Beta Values
- b1 is the regression coefficient for variable 1.
- b2 is the regression coefficient for variable 2.
- bn is the regression coefficient for the nth variable.

The Model with Two Predictors
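The equation for this slide is likewise missing from the transcript; as a sketch based on the album-sales example above, with advertising budget and airplay as the two predictors, it would read:

\text{Sales}_i = b_0 + b_1\,\text{Adverts}_i + b_2\,\text{Airplay}_i + \varepsilon_i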

Sample Size Considerations
- Simple regression can be effective with a sample size of 20, but maintaining power at .80 in multiple regression requires a minimum sample of 50, and preferably 100 observations, for most research situations.
- The minimum ratio of observations to variables is 5 to 1, but the preferred ratio is 15 or 20 to 1, and this should increase when stepwise estimation is used. For example, at the preferred 15-to-1 ratio, a model with five independent variables needs at least 75 observations.
- Maximizing the degrees of freedom improves generalizability and addresses both model parsimony and sample size concerns.

Estimating the Regression Model and Assessing Overall Model Fit
The analyst must accomplish three basic tasks:
1. Select a method for specifying the regression model to be estimated,
2. Assess the statistical significance of the overall model in predicting the dependent variable, and
3. Determine whether any of the observations exert an undue influence on the results.

Methods of Regression
- Hierarchical: the experimenter decides the order in which variables are entered into the model.
- Forced Entry: all predictors are entered simultaneously.
- Stepwise: predictors are selected using their semi-partial correlation with the outcome.

Hierarchical Regression
- Known predictors (based on past research) are entered into the regression model first.
- New predictors are then entered in a separate step/block.
- The experimenter makes the decisions.

Hierarchical Regression
Good points:
- It is the best method for theory testing.
- You can see the unique predictive influence of a new variable on the outcome, because the known predictors are held constant in the model.
Bad point:
- It relies on the experimenter knowing what they are doing!
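A minimal sketch of a hierarchical (blockwise) analysis in Python with statsmodels, on hypothetical album-sales data; the data frame and the variable names (adverts, airplay, sales) and values are illustrative assumptions, not from the lecture. The known predictor enters in block 1, the new predictor in block 2, and the improvement is tested with an F-test on the R-squared change.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

# Hypothetical album-sales data (illustrative values only)
rng = np.random.default_rng(0)
albums = pd.DataFrame({"adverts": rng.uniform(0, 2000, 200),
                       "airplay": rng.integers(0, 60, 200)})
albums["sales"] = 50 + 0.1 * albums["adverts"] + 3 * albums["airplay"] + rng.normal(0, 50, 200)

# Block 1: known predictor (based on past research) entered first
block1 = smf.ols("sales ~ adverts", data=albums).fit()

# Block 2: the new predictor is added in a separate block
block2 = smf.ols("sales ~ adverts + airplay", data=albums).fit()

# Unique contribution of the new predictor: R-squared change plus an F-test
print("R2 change:", block2.rsquared - block1.rsquared)
print(anova_lm(block1, block2))  # compares the two nested models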

Forced Entry Regression
All variables are entered into the model simultaneously. The results obtained depend on the variables entered into the model. It is important, therefore, to have good theoretical reasons for including a particular variable.
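For comparison, forced entry is a single simultaneous fit; a minimal sketch with the same hypothetical data (again, the names and values are assumptions for illustration):

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical album-sales data (illustrative values only)
rng = np.random.default_rng(0)
albums = pd.DataFrame({"adverts": rng.uniform(0, 2000, 200),
                       "airplay": rng.integers(0, 60, 200)})
albums["sales"] = 50 + 0.1 * albums["adverts"] + 3 * albums["airplay"] + rng.normal(0, 50, 200)

# Forced entry: all predictors go into the model at the same time
model = smf.ols("sales ~ adverts + airplay", data=albums).fit()
print(model.summary())  # coefficients, R-squared, overall F-test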

Stepwise Regression I
Variables are entered into the model based on mathematical criteria; the computer selects variables in steps.
Step 1: SPSS looks for the predictor that can explain the most variance in the outcome variable.

Stepwise Regression II
Step 2: Having selected the 1st predictor, a second one is chosen from the remaining predictors. The semi-partial correlation is used as a criterion for selection.

Semi-Partial Correlation
- Partial correlation measures the relationship between two variables, controlling for the effect that a third variable has on them both.
- Semi-partial correlation measures the relationship between two variables, controlling for the effect that a third variable has on only one of them.

Semi-Partial Correlation in Regression
The semi-partial correlation measures the relationship between a predictor and the outcome, controlling for the relationship between that predictor and any others already in the model. It measures the unique contribution of a predictor to explaining the variance of the outcome.
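A useful way to make this concrete (a standard result, not from the original slide): the squared semi-partial correlation of a predictor equals the increase in R² obtained by adding that predictor to a model that already contains the others,

sr_i^2 = R^2_{\text{with } X_i} - R^2_{\text{without } X_i}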

Problems with Stepwise Methods
- They rely on a mathematical criterion.
- Variable selection may depend upon only slight differences in the semi-partial correlation.
- These slight numerical differences can lead to major theoretical differences.
- Stepwise methods should be used only for exploration.
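A minimal forward-selection sketch in Python with statsmodels, on hypothetical data that includes a deliberate pure-noise predictor; at each step it adds the candidate giving the largest R-squared increase (equivalently, the largest squared semi-partial correlation). SPSS's stepwise procedure additionally applies entry and removal significance tests, which this sketch replaces with a simple minimum-gain rule (min_gain is an assumed illustrative threshold).

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical data: two real predictors plus one pure-noise predictor
rng = np.random.default_rng(2)
albums = pd.DataFrame({"adverts": rng.uniform(0, 2000, 200),
                       "airplay": rng.integers(0, 60, 200),
                       "noise":   rng.normal(0, 1, 200)})
albums["sales"] = 50 + 0.1 * albums["adverts"] + 3 * albums["airplay"] + rng.normal(0, 50, 200)

def forward_select(data, outcome, candidates, min_gain=0.01):
    """Greedy forward selection: add the predictor with the largest R-squared gain."""
    selected, best_r2 = [], 0.0
    while candidates:
        gains = {}
        for var in candidates:
            formula = f"{outcome} ~ " + " + ".join(selected + [var])
            gains[var] = smf.ols(formula, data=data).fit().rsquared - best_r2
        best_var = max(gains, key=gains.get)
        if gains[best_var] < min_gain:   # stop when the improvement is negligible
            break
        selected.append(best_var)
        candidates.remove(best_var)
        best_r2 += gains[best_var]
        print(f"added {best_var}: R2 = {best_r2:.3f}")
    return selected

print(forward_select(albums, "sales", ["adverts", "airplay", "noise"]))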

R and R²
- R: the correlation between the observed values of the outcome and the values predicted by the model.
- R²: the proportion of variance accounted for by the model.
- Adjusted R²: reduces R² to take into account the sample size and the number of independent variables in the regression model (it becomes smaller as there are fewer observations per independent variable).
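The standard adjustment, with n observations and k independent variables, is:

\text{Adj. } R^2 = 1 - (1 - R^2)\,\frac{n - 1}{n - k - 1}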

Analysis of Variance: ANOVA
The F-test looks at whether the variance explained by the model (SSR) is significantly greater than the error within the model (SSE). It tells us whether using the regression model is significantly better at predicting values of the outcome than using the mean.
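In terms of these sums of squares, with k predictors and n observations, the test statistic is:

F = \frac{SSR / k}{SSE / (n - k - 1)} = \frac{MSR}{MSE}

which follows an F distribution with k and n − k − 1 degrees of freedom under the null hypothesis that all slope coefficients are zero.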

How to Interpret Beta Values
- Beta values: the change in the outcome associated with a unit change in the predictor.
- Standardised beta values: tell us the same thing, but expressed in standard deviation units (the number of standard deviations the outcome changes for a one standard deviation change in the predictor).
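The two are related by a standard rescaling, where s_x is the standard deviation of a predictor and s_y the standard deviation of the outcome:

\beta_i^{\text{standardised}} = b_i \cdot \frac{s_{x_i}}{s_y}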

Lecture Summary
- Methods of multiple regression: hierarchical regression, forced entry, stepwise
- Problems with stepwise methods
- R and R²
- Beta values