Regression model with multiple predictors


Set 8: Regression model with multiple predictors

Multivariate Data
K+1 variables in total:
One response (dependent) variable
  y = price of an item purchased by the individual
K explanatory (independent) variables
  x1 = income of an individual
  x2 = education of an individual
  ...
  xK = age of an individual
Data for individual i: (x1i, x2i, ..., xKi, yi)

Graphs and Summary Measures
Matrix plot
Correlation matrix
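These summaries can be sketched in Python with numpy (assumed available); the data below are made up to mirror the earlier price/income/education/age example, and the matrix plot itself would come from a plotting library, so only the correlation matrix is shown:

```python
import numpy as np

# Hypothetical data mirroring the earlier example (made-up values)
income = np.array([45., 60., 38., 72., 55., 66.])   # x1
educ   = np.array([12., 16., 10., 18., 14., 16.])   # x2
age    = np.array([34., 45., 29., 52., 40., 48.])   # x3
price  = np.array([20., 31., 15., 40., 27., 33.])   # y

# Correlation matrix: each row of the stacked array is one variable
R = np.corrcoef(np.vstack([income, educ, age, price]))
print(R.round(2))
```

Each entry is a pairwise correlation; the diagonal is 1 and the matrix is symmetric.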

Multiple Linear Regression Model
yi = β0 + β1x1i + ... + βKxKi + εi
Response = Model + Error term
xk is an explanatory (independent) variable
β0, β1, ..., βK are unknown parameters:
  β0 is the intercept
  β1, ..., βK are the regression coefficients
εi is the unknown error, with E(εi) = 0 and Var(εi) = σ²
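One way to make the model concrete is to simulate from it. This is a sketch only; numpy is assumed, and the parameter values are arbitrary choices for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
n, K = 200, 2
beta = np.array([1.0, 2.0, -0.5])        # hypothetical beta0, beta1, beta2
sigma = 1.0                              # hypothetical error standard deviation

x = rng.normal(size=(n, K))              # explanatory variables
eps = rng.normal(scale=sigma, size=n)    # errors: E(eps) = 0, Var(eps) = sigma^2

# Response = Model + Error term
y = beta[0] + x @ beta[1:] + eps
```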

Least Squares (LS) Estimation
Estimate the unknown regression mean μy|x's = β0 + β1x1 + ... + βKxK by the LS regression equation ŷ = b0 + b1x1 + ... + bKxK
The estimates are given in the regression output
b0 is the LS estimate of the intercept parameter β0
The LS model passes through the point of mean values (x̄1, ..., x̄K, ȳ)
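The LS estimates can be computed directly; the sketch below uses numpy with made-up data (n = 6, K = 2) and checks the point-of-means property:

```python
import numpy as np

# Hypothetical data: response y with K = 2 predictors
x1 = np.array([1., 2., 3., 4., 5., 6.])
x2 = np.array([2., 1., 4., 3., 6., 5.])
y  = np.array([3., 4., 8., 9., 13., 14.])

# Design matrix: a column of ones (intercept) plus the predictors
X = np.column_stack([np.ones_like(x1), x1, x2])

# LS estimates b0, b1, b2 minimize the sum of squared residuals
b, *_ = np.linalg.lstsq(X, y, rcond=None)

# The fitted model passes through the point of means:
# plugging in the mean of each x reproduces the mean of y
yhat_at_means = b[0] + b[1] * x1.mean() + b[2] * x2.mean()
print(np.isclose(yhat_at_means, y.mean()))  # True
```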

Statistics for Each Coefficient βj
Estimate bj for each coefficient βj
Standard error of the estimate for each coefficient: SE(bj)
T statistic for testing βj = 0 (i.e., whether xj can be dropped from the model): t = bj / SE(bj)
Statistics that can be computed from these:
  Margin of error from the t table: ME(bj) = t × SE(bj), with df = n − K − 1
  Interval estimate for βj: [bj − ME(bj), bj + ME(bj)]
  All values inside the interval are acceptable for βj; any value outside is not
  If zero is inside the interval, xj can be dropped from the model
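A sketch of how these statistics come out of the LS fit (numpy assumed, made-up data; the t table value 3.182 is for α = .025 with df = n − K − 1 = 3):

```python
import numpy as np

# Made-up data: n = 6 observations, K = 2 predictors
x1 = np.array([1., 2., 3., 4., 5., 6.])
x2 = np.array([2., 1., 4., 3., 6., 5.])
y  = np.array([3., 4., 8., 9., 13., 14.])
n, K = len(y), 2

X = np.column_stack([np.ones(n), x1, x2])
b, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ b

# s^2 = MSE estimates sigma^2; Cov(b) = s^2 (X'X)^{-1}
s2 = resid @ resid / (n - K - 1)
se = np.sqrt(np.diag(s2 * np.linalg.inv(X.T @ X)))

t_stats = b / se                        # tests of beta_j = 0
tcrit = 3.182                           # t table, alpha = .025, df = 3
me = tcrit * se                         # margin of error ME(bj)
ci = np.column_stack([b - me, b + me])  # interval estimate for each beta_j
```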

ANOVA Table: Sums of Squares
SS Total = SS Regression + SS Error
Degrees of freedom: n − 1 = K + (n − K − 1)
Given in the regression output

ANOVA Table: Mean Squares and F
Mean Square Error: MSE = SSE / (n − K − 1); MSE = s² is the LS estimate of σ²
Standard error of regression: s = √MSE
Mean Square Regression: MSR = SSR / K
F ratio = MSR / MSE, the test for the regression relationship
  Null hypothesis H0: β1 = β2 = ... = βK = 0
  df1 = K, df2 = n − K − 1
Given in the regression output

Analysis of Variance (ANOVA) Table

Source       df           SS     MS     F ratio
Regression   K            SSR    MSR    MSR/MSE
Error        n − K − 1    SSE    MSE
Total        n − 1        SST
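The table's entries can be computed directly from the fit; a sketch with the same made-up data (numpy assumed):

```python
import numpy as np

x1 = np.array([1., 2., 3., 4., 5., 6.])
x2 = np.array([2., 1., 4., 3., 6., 5.])
y  = np.array([3., 4., 8., 9., 13., 14.])
n, K = len(y), 2

X = np.column_stack([np.ones(n), x1, x2])
b, *_ = np.linalg.lstsq(X, y, rcond=None)
yhat = X @ b

SST = ((y - y.mean()) ** 2).sum()      # total sum of squares,      df = n - 1
SSE = ((y - yhat) ** 2).sum()          # error sum of squares,      df = n - K - 1
SSR = ((yhat - y.mean()) ** 2).sum()   # regression sum of squares, df = K

MSR, MSE = SSR / K, SSE / (n - K - 1)
F = MSR / MSE
print(np.isclose(SST, SSR + SSE))  # True: the decomposition holds
```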

R² and Adjusted R²
R² = SSR / SST, the fraction of the variation in Y explained by the regression model
R² increases whenever predictors are added to the model
The F ratio can also be computed from R²
Adjusted R²: R²adj takes the number of predictors into account
  Useful for comparing models with different numbers of predictors
Given in the regression output

The Special Case of K = 1
When there is only one predictor in the model (K = 1):
  R² = Corr²(Y, X)
  F ratio = (t ratio)²
  With df1 = 1, the F critical value for α = .05 equals the square of the t critical value for α = .025 with df2
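Both identities can be verified numerically on a one-predictor fit (numpy assumed, made-up data):

```python
import numpy as np

x = np.array([1., 2., 3., 4., 5.])
y = np.array([2., 3., 5., 4., 6.])
n = len(y)

X = np.column_stack([np.ones(n), x])
b, *_ = np.linalg.lstsq(X, y, rcond=None)
yhat = X @ b
resid = y - yhat

s2 = resid @ resid / (n - 2)                        # MSE with df = n - 1 - 1
se_b1 = np.sqrt(s2 * np.linalg.inv(X.T @ X)[1, 1])
t = b[1] / se_b1

R2 = ((yhat - y.mean()) ** 2).sum() / ((y - y.mean()) ** 2).sum()
F = ((yhat - y.mean()) ** 2).sum() / s2             # MSR/MSE with df1 = 1

print(np.isclose(R2, np.corrcoef(x, y)[0, 1] ** 2))  # True
print(np.isclose(F, t ** 2))                         # True
```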

Partial F Test
Test for dropping more than one variable from the model
Fit the model without the variables to be dropped (the reduced model)
Partial F = [(SSEreduced − SSEfull) / (Kfull − Kreduced)] / MSEfull
  df1 = Kfull − Kreduced, df2 = n − Kfull − 1 (the same df as SSEfull)
Reject the reduced model if F is large
Sequential SS: place the predictors to be tested last in the model
An alternative formula uses the R² values of the full and reduced models
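A sketch of the partial F computation (numpy assumed; in this made-up example the full model has x1 and x2, and the reduced model drops x2):

```python
import numpy as np

x1 = np.array([1., 2., 3., 4., 5., 6.])
x2 = np.array([2., 1., 4., 3., 6., 5.])
y  = np.array([3., 4., 8., 9., 13., 14.])
n = len(y)
K_full, K_red = 2, 1

def sse(X, y):
    """Error sum of squares of the LS fit of y on X."""
    b, *_ = np.linalg.lstsq(X, y, rcond=None)
    return ((y - X @ b) ** 2).sum()

X_full = np.column_stack([np.ones(n), x1, x2])
X_red  = np.column_stack([np.ones(n), x1])   # reduced model: x2 dropped

SSE_full, SSE_red = sse(X_full, y), sse(X_red, y)
MSE_full = SSE_full / (n - K_full - 1)

# Partial F: does dropping x2 increase the error SS too much?
F = ((SSE_red - SSE_full) / (K_full - K_red)) / MSE_full
```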

Margins of Error for Predictions
Margin of error from the t table, with df = n − K − 1
Prediction of the mean outcome:
  ME = t × (SE of prediction of the mean outcome)
  Interval estimate: prediction ± ME
  Given as an option in the regression output
Prediction of a single outcome:
  ME = t × (SE of prediction of the single outcome)
The interval for predicting a single outcome is wider than the interval for the mean outcome
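A sketch of the two standard errors under the usual LS formulas (numpy assumed; the data, the new point x0, and the t value 3.182 for α = .025, df = 3 are all made up for illustration):

```python
import numpy as np

x1 = np.array([1., 2., 3., 4., 5., 6.])
x2 = np.array([2., 1., 4., 3., 6., 5.])
y  = np.array([3., 4., 8., 9., 13., 14.])
n, K = len(y), 2

X = np.column_stack([np.ones(n), x1, x2])
b, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ b
s = np.sqrt(resid @ resid / (n - K - 1))   # standard error of regression

XtX_inv = np.linalg.inv(X.T @ X)
x0 = np.array([1., 3.5, 3.5])              # hypothetical new point (leading 1 = intercept)

h = x0 @ XtX_inv @ x0
se_mean   = s * np.sqrt(h)                 # SE for the mean outcome at x0
se_single = s * np.sqrt(1 + h)             # SE for a single new outcome at x0

tcrit = 3.182                              # t table, alpha = .025, df = 3
pred = x0 @ b
mean_interval   = (pred - tcrit * se_mean,   pred + tcrit * se_mean)
single_interval = (pred - tcrit * se_single, pred + tcrit * se_single)
print(se_single > se_mean)  # True: the single-outcome interval is wider
```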

Residuals
Fitted values: ŷi = b0 + b1x1i + ... + bKxKi
LS residuals: ei = yi − ŷi

Residual Diagnostics
The distribution of the residuals should be normal:
  Distribution plot
  Normal probability plot
  Tests of normality
The residuals must be patternless:
  Time series (sequence) plot of the residuals
  Plot of residuals against the fitted values
  Plot of residuals against each xj
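Two algebraic facts underlie these plots, and they can be checked directly (numpy assumed, made-up data): LS residuals average to zero and are uncorrelated with the fitted values, so any visible trend in a residual-vs-fitted plot signals a problem with the model rather than an artifact of the fit:

```python
import numpy as np

x1 = np.array([1., 2., 3., 4., 5., 6.])
x2 = np.array([2., 1., 4., 3., 6., 5.])
y  = np.array([3., 4., 8., 9., 13., 14.])
n = len(y)

X = np.column_stack([np.ones(n), x1, x2])
b, *_ = np.linalg.lstsq(X, y, rcond=None)
yhat = X @ b          # fitted values
resid = y - yhat      # LS residuals

# Residuals have mean zero and are orthogonal to the fitted values
print(np.isclose(resid.mean(), 0.0))   # True
print(np.isclose(resid @ yhat, 0.0))   # True
```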