
Multiple Regression

Simple Regression in detail
Yᵢ = β₀ + β₁Xᵢ + εᵢ
where
Y => Dependent variable
X => Independent variable
β₀ => Model parameter: the mean value of the dependent variable (Y) when the independent variable (X) is zero

Simple Regression in detail
β₁ => Model parameter: the slope, which measures the change in the mean value of the dependent variable associated with a one-unit increase in the independent variable
εᵢ => Error term that describes the effects on Yᵢ of all factors other than the value of Xᵢ

Assumptions of the Regression Model
- The error term is normally distributed (normality assumption)
- The mean of the error term is zero: E{εᵢ} = 0
- The variance of the error term is constant and does not depend on the values of X (constant variance assumption)
- The error terms are independent of each other (independence assumption)
- The values of the independent variable X are fixed (no error in the X values)
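As a rough illustration of these assumptions (not part of the original slides), the sketch below simulates data from the simple regression model; the values of β₀, β₁, σ and the fixed X values are all hypothetical.

```python
# A minimal sketch: simulate data satisfying the stated assumptions,
# with hypothetical values for beta0, beta1 and the error std. deviation.
import numpy as np

rng = np.random.default_rng(0)

beta0, beta1, sigma = 50.0, 1.5, 10.0                       # hypothetical parameters
x = np.array([100, 200, 300, 400, 500, 600], dtype=float)   # fixed X values (no error in X)

# Error terms: independent, normally distributed, mean zero, constant variance
eps = rng.normal(loc=0.0, scale=sigma, size=x.size)

y = beta0 + beta1 * x + eps    # Y_i = beta0 + beta1 * X_i + eps_i
print(np.c_[x, y])
```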

Estimating the Model Parameters
- Calculate point estimates b₀ and b₁ of the unknown parameters β₀ and β₁
- Obtain a random sample and use the sample information to estimate β₀ and β₁
- Obtain the line of best "fit" for the sample data points: the least squares line Ŷᵢ = b₀ + b₁Xᵢ, where Ŷᵢ is the predicted value of Y

Values of the Least Squares Estimates b₀ and b₁
b₁ = [n Σxᵢyᵢ − (Σxᵢ)(Σyᵢ)] / [n Σxᵢ² − (Σxᵢ)²]
b₀ = ȳ − b₁x̄, where ȳ = Σyᵢ / n and x̄ = Σxᵢ / n
b₀ and b₁ vary from sample to sample. Their variation is given by their standard errors S_b0 and S_b1.
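A minimal, self-contained sketch of these least squares formulas, using a small hypothetical sample (x could be advertising, y store traffic). The formulas used for S_b0 and S_b1 are the conventional ones; they are not spelled out on the slide.

```python
# Least squares estimates b0, b1 and their standard errors for a hypothetical sample.
import numpy as np

x = np.array([100.0, 200.0, 300.0, 400.0, 500.0, 600.0])   # hypothetical data
y = np.array([210.0, 380.0, 490.0, 660.0, 810.0, 940.0])

n = x.size
b1 = (n * np.sum(x * y) - np.sum(x) * np.sum(y)) / (n * np.sum(x**2) - np.sum(x)**2)
b0 = np.mean(y) - b1 * np.mean(x)

# Standard errors (conventional formulas, assumed here rather than taken from the slide)
resid = y - (b0 + b1 * x)
s = np.sqrt(np.sum(resid**2) / (n - 2))          # residual standard error
s_xx = np.sum((x - np.mean(x))**2)
s_b1 = s / np.sqrt(s_xx)
s_b0 = s * np.sqrt(1.0 / n + np.mean(x)**2 / s_xx)

print(f"b0 = {b0:.2f}, b1 = {b1:.2f}, S_b0 = {s_b0:.3f}, S_b1 = {s_b1:.3f}")
```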

Example 1
To see the relationship between Advertising and Store Traffic: Store Traffic is the dependent variable and Advertising is the independent variable.
Using the formulae, we find that b₀ = … and b₁ = 1.54.
Are b₀ and b₁ significant? What is Store Traffic when Advertising is 600?

Example 2
Consider the following data on Advertising (X) and Sales (Y).
Using the formulae, we find that b₀ = … and b₁ = 1.05.

Example 2
The regression model is therefore Ŷ = b₀ + 1.05Xᵢ.
r² = (0.74)² = 0.54, the proportion of variance in sales (Y) explained by advertising (X).
Assume that S_b0 (the standard error of b₀) = 0.51 and S_b1 = 0.26, with α = 0.05 and df = 4.
Is b₀ significant? Is b₁ significant?
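A worked sketch of the significance check for b₁ in Example 2, using only the values given above (b₁ = 1.05, S_b1 = 0.26, α = 0.05, df = 4); the same comparison would apply to b₀ once its value and t = b₀ / S_b0 are in hand.

```python
# Two-tailed t test for the slope in Example 2.
from scipy import stats

b1, s_b1 = 1.05, 0.26
t_stat = b1 / s_b1                        # test statistic under H0: beta1 = 0
t_crit = stats.t.ppf(1 - 0.05 / 2, df=4)  # two-tailed critical value (about 2.776)

print(f"t = {t_stat:.2f}, critical value = {t_crit:.3f}")
print("reject H0" if abs(t_stat) > t_crit else "fail to reject H0")
```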

Idea behind Estimation: Residuals
The differences between the actual and predicted values are called residuals; they estimate the error terms in the population:
eᵢ = yᵢ − ŷᵢ = yᵢ − (b₀ + b₁xᵢ)
(Quantities with hats are predicted quantities.)
b₀ and b₁ minimize the residual or error sum of squares (SSE):
SSE = Σeᵢ² = Σ(yᵢ − ŷᵢ)² = Σ[yᵢ − (b₀ + b₁xᵢ)]²
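A short sketch, reusing the hypothetical sample from the earlier sketch, showing that the least squares estimates do minimize SSE: perturbing b₀ or b₁ away from the fitted values increases the residual sum of squares.

```python
# SSE at the least squares estimates vs. SSE at nearby perturbed values.
import numpy as np

x = np.array([100.0, 200.0, 300.0, 400.0, 500.0, 600.0])   # same hypothetical data
y = np.array([210.0, 380.0, 490.0, 660.0, 810.0, 940.0])

def sse(b0, b1):
    return np.sum((y - (b0 + b1 * x)) ** 2)

n = x.size
b1 = (n * np.sum(x * y) - np.sum(x) * np.sum(y)) / (n * np.sum(x**2) - np.sum(x)**2)
b0 = np.mean(y) - b1 * np.mean(x)

print(sse(b0, b1))        # SSE at the least squares estimates
print(sse(b0 + 5, b1))    # perturbing either estimate gives a larger SSE
print(sse(b0, b1 * 1.05))
```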

Testing the Significance of the Independent Variables
Null hypothesis: there is no linear relationship between the independent and dependent variables.
Alternative hypothesis: there is a linear relationship between the independent and dependent variables.

Testing the Significance of the Independent Variables
Test statistic: t = (b₁ − β₁) / S_b1
Degrees of freedom: v = n − 2
Hypotheses (two-tailed test): H₀: β₁ = 0 versus H₁: β₁ ≠ 0
Decision rule: reject H₀: β₁ = 0 if α > p-value
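A minimal sketch of this test and decision rule. The numbers are illustrative: the slope 1.54 comes from the Store Traffic example, while the standard error 0.21 is an assumed value chosen so that the resulting t statistic matches the 7.33 reported on the next slide.

```python
# t test for the slope with the "reject H0 if alpha > p-value" decision rule.
from scipy import stats

b1, s_b1, n, alpha = 1.54, 0.21, 20, 0.05   # slope from Example 1; SE and n assumed

t_stat = (b1 - 0) / s_b1                                  # under H0: beta1 = 0
p_value = 2 * (1 - stats.t.cdf(abs(t_stat), df=n - 2))    # two-tailed p-value

print(f"t = {t_stat:.2f}, p = {p_value:.6f}")
print("reject H0: linear relationship" if alpha > p_value else "fail to reject H0")
```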

Significance Test for the Store Traffic Example
Null hypothesis, H₀: β₁ = 0
Alternative hypothesis, H_A: β₁ ≠ 0
The test statistic is t = b₁ / S_b1 = 7.33.
With α = 0.05 and degrees of freedom v = n − 2 = 18, the critical value of t from the table is 2.10.
Since 7.33 > 2.10, we reject the null hypothesis of no linear relationship. Therefore Advertising affects Store Traffic.

Predicting the Dependent Variable
How well does the model ŷᵢ = b₀ + b₁xᵢ predict?
The error of prediction without the independent variable is yᵢ − ȳ.
The error of prediction with the independent variable is yᵢ − ŷᵢ.
Thus, by using the independent variable, the error in prediction is reduced by (yᵢ − ȳ) − (yᵢ − ŷᵢ) = (ŷᵢ − ȳ).
It can be shown that Σ(yᵢ − ȳ)² = Σ(ŷᵢ − ȳ)² + Σ(yᵢ − ŷᵢ)².

Predicting the Dependent Variable
Total variation (SST) = Explained variation (SSM) + Unexplained variation (SSE)
A measure of the model's ability to predict is the coefficient of determination (r²):
r² = SSM / SST = 1 − SSE / SST
For our example, r² = 0.74, i.e., 74% of the variation in Y is accounted for by X.
r² is the square of the correlation between X and Y.
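A short sketch of the SST = SSM + SSE decomposition and r², again using the hypothetical (x, y) sample from the earlier sketches.

```python
# Variation decomposition and coefficient of determination for the hypothetical sample.
import numpy as np

x = np.array([100.0, 200.0, 300.0, 400.0, 500.0, 600.0])
y = np.array([210.0, 380.0, 490.0, 660.0, 810.0, 940.0])

n = x.size
b1 = (n * np.sum(x * y) - np.sum(x) * np.sum(y)) / (n * np.sum(x**2) - np.sum(x)**2)
b0 = np.mean(y) - b1 * np.mean(x)
y_hat = b0 + b1 * x

sst = np.sum((y - np.mean(y)) ** 2)       # total variation
ssm = np.sum((y_hat - np.mean(y)) ** 2)   # explained variation
sse = np.sum((y - y_hat) ** 2)            # unexplained variation

r2 = ssm / sst                            # equivalently 1 - sse / sst
print(f"SST = {sst:.1f}, SSM = {ssm:.1f}, SSE = {sse:.1f}, r^2 = {r2:.3f}")
print(np.isclose(sst, ssm + sse))         # SST = SSM + SSE
```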

Multiple Regression
Used when more than one independent variable affects the dependent variable.
General model: Y = β₀ + β₁X₁ + β₂X₂ + … + βₙXₙ + ε
where
Y: Dependent variable
X₁, …, Xₙ: Independent variables
β₁, …, βₙ: Coefficients of the n independent variables
β₀: A constant (intercept)
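A minimal sketch of fitting this general model by least squares with a design matrix; the two independent variables and all numeric values are hypothetical.

```python
# Multiple regression Y = b0 + b1*X1 + b2*X2 fit by ordinary least squares.
import numpy as np

rng = np.random.default_rng(1)
n = 50
X1 = rng.normal(40, 10, n)                        # hypothetical independent variables
X2 = rng.normal(60, 15, n)
Y = 5 + 0.8 * X1 + 0.3 * X2 + rng.normal(0, 2, n) # hypothetical dependent variable

# Design matrix with a leading column of ones for the intercept b0
X = np.column_stack([np.ones(n), X1, X2])
coef, *_ = np.linalg.lstsq(X, Y, rcond=None)

print("b0, b1, b2 =", np.round(coef, 3))
```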

Issues in Multiple Regression
- Which variables to include?
- Is the relationship between the dependent variable and each of the independent variables linear?
- Is the dependent variable normally distributed for all values of the independent variables?
- Is each of the independent variables normally distributed (without regard to the dependent variable)?
- Are there interaction variables?
- Are the independent variables themselves highly correlated? (see the sketch below)
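A hedged sketch of one diagnostic from this list, checking whether the independent variables are highly correlated with each other. The variable names AGE and INCOME anticipate Example 3 below; the third variable and all data values are hypothetical.

```python
# Correlation matrix among candidate independent variables (multicollinearity check).
import numpy as np
import pandas as pd

rng = np.random.default_rng(2)
age = rng.normal(45, 12, 200)
income = 1.5 * age + rng.normal(0, 10, 200)   # deliberately correlated with age
other = rng.normal(0, 1, 200)

X = pd.DataFrame({"AGE": age, "INCOME": income, "OTHER": other})
print(X.corr().round(2))   # large off-diagonal values flag potential multicollinearity
```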

Example 3
A cataloger believes that age (AGE) and income (INCOME) can predict the amount spent in the last 6 months (DOLLSPENT).
The regression equation is of the form DOLLSPENT = b₀ + b₁·INCOME + b₂·AGE.
What happens when income (or age) increases? Are the coefficients significant? (A fitting sketch follows below.)
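A hedged sketch of how such a model could be fit and its coefficients tested; only the variable names DOLLSPENT, INCOME and AGE come from the slide, while the data frame, sample size and generating values are hypothetical stand-ins, not the cataloger's actual data.

```python
# Fitting DOLLSPENT ~ INCOME + AGE on simulated stand-in data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 500
df = pd.DataFrame({
    "AGE": rng.integers(20, 75, n),
    "INCOME": rng.normal(55, 15, n),   # hypothetical, e.g. in thousands of dollars
})
df["DOLLSPENT"] = 20 + 1.2 * df["INCOME"] + 0.5 * df["AGE"] + rng.normal(0, 10, n)

model = smf.ols("DOLLSPENT ~ INCOME + AGE", data=df).fit()
print(model.params)    # estimated intercept and coefficients
print(model.pvalues)   # are the coefficients significant?
```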

Example 4
Which customers are most likely to buy? The cataloger believes that the ratio of total orders to total pieces mailed is a good measure of purchase likelihood; call this ratio RESP.
The independent variables are:
- TOTDOLL: total purchase dollars
- AVGORDR: average dollar order
- LASTBUY: number of months since last purchase

Example 4: Analysis of Variance table
- How is the total sum of squares split up?
- How do you get the various degrees of freedom?
- How do you get and interpret R-square?
- How do you interpret the F statistic?
- What is the adjusted R-square?

Example 4: Parameter estimates table
- What are the t-values corresponding to the estimates?
- What are the p-values corresponding to the estimates?
- Which variables are the most important?
- What are standardized estimates?
- What should be done with non-significant variables?
(A sketch that produces these tables follows below.)
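A hedged sketch of producing the ANOVA and parameter-estimate tables discussed in Example 4 with an ordinary least squares fit; only the variable names RESP, TOTDOLL, AVGORDR and LASTBUY come from the slides, and all data values are simulated stand-ins.

```python
# OLS fit of RESP on TOTDOLL, AVGORDR, LASTBUY with summary and ANOVA tables.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

rng = np.random.default_rng(4)
n = 1000
df = pd.DataFrame({
    "TOTDOLL": rng.gamma(2.0, 150.0, n),   # total purchase dollars (simulated)
    "AVGORDR": rng.gamma(2.0, 30.0, n),    # average dollar order (simulated)
    "LASTBUY": rng.integers(1, 36, n),     # months since last purchase (simulated)
})
df["RESP"] = (0.02 + 0.00005 * df["TOTDOLL"] - 0.0002 * df["LASTBUY"]
              + rng.normal(0, 0.01, n)).clip(0, 1)

model = smf.ols("RESP ~ TOTDOLL + AVGORDR + LASTBUY", data=df).fit()
print(model.summary())    # F statistic, R-square, adjusted R-square, t and p values
print(anova_lm(model))    # per-variable sums of squares and degrees of freedom
```

Standardized estimates can be obtained by z-scoring each variable before fitting, which puts the coefficients on a common scale so their relative importance can be compared; non-significant variables are typically candidates for removal followed by a refit.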