Regression Analysis: Relationship with one independent variable.

Lecture Objectives. You should be able to interpret regression output. Specifically:
1. Interpret the significance of the relationship (Sig. F).
2. Interpret the parameter estimates (write and use the model).
3. Compute and interpret R-square and the Standard Error (ANOVA table).

Basic Equation. ŷ = b0 + b1x, where b0 is the y-intercept and b1 is the slope (∆y/∆x); є marks the deviation of an observation from the line. [Figure: the dependent variable (y) plotted against the independent variable (x), with the fitted straight line.] The straight line represents the linear relationship between y and x.
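As a concrete illustration, here is a minimal sketch of estimating b0 and b1 by ordinary least squares in Python; the (x, y) values are made up for illustration, since the slide's data are not reproduced in this transcript.

```python
import numpy as np

# Hypothetical age (x) and shoe-size (y) data, standing in for the slide's example.
x = np.array([4.0, 5.0, 6.0, 7.0, 8.0, 9.0, 10.0])
y = np.array([9.5, 10.5, 11.0, 12.0, 12.5, 13.5, 14.0])

# Least-squares fit of a straight line: np.polyfit returns [slope, intercept].
b1, b0 = np.polyfit(x, y, deg=1)

y_hat = b0 + b1 * x          # fitted values on the line y-hat = b0 + b1*x
print(f"intercept b0 = {b0:.3f}, slope b1 = {b1:.3f}")
```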

Understanding the equation: What is the equation of this line? [The slide's figure showing the plotted line is not reproduced in this transcript.]

Total Variation: Sum of Squares (SST). What if there were no information on X (and hence no regression)? There would only be the y-axis (the observed y values). The best forecast for Y would then simply be the mean of Y, and the total error in the forecasts would be the total variation of the observations around that mean. [Figure: y values plotted with a horizontal line at the mean of Y; the vertical distances from the mean are the variation from the mean (total variation).]

Sum of Squares Total (SST) Computation. The example tabulates shoe sizes for 13 children, with columns Obs, Age (X), Shoe Size (Y), Deviation from Mean, and Squared Deviation; the total of the squared deviations is the Sum of Squared Mean Deviations (SST). [The table's numeric values are not reproduced in this transcript.] In computing SST, the variable X is irrelevant. This computation tells us the total squared deviation from the mean for y.
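The same computation as a short sketch in Python, using made-up y values since the slide's 13 shoe sizes are not reproduced here; note that x plays no role in SST.

```python
import numpy as np

# Hypothetical shoe sizes (y); x is deliberately absent -- it is irrelevant for SST.
y = np.array([9.5, 10.5, 11.0, 12.0, 12.5, 13.5, 14.0])

deviations = y - y.mean()          # deviation of each observation from the mean of y
sst = np.sum(deviations ** 2)      # Sum of Squares Total
print(f"mean of y = {y.mean():.3f}, SST = {sst:.3f}")
```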

Error after Regression. Information about x gives us the regression model, which does a better job of predicting y than simply the mean of y. Thus some of the total variation in y is explained away by x, leaving some unexplained residual error. [Figure: the dependent variable (y) plotted against the independent variable (x), with the fitted line and the mean of Y; the total variation from the mean splits into a part explained by the regression and a residual (unexplained) error.]

Computing SSE. The same 13 children's shoe sizes are tabulated with columns Obs, Age (X), Shoe Size (Y), Predicted Y, Residual (Error), and Squared Error; the total of the squared errors is the Sum of Squares Error (SSE). The predictions come from the estimated equation ŷ = b0 + b1x, using the intercept (b0) and slope (b1) from the regression. [The table's numeric values are not reproduced in this transcript.]
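The same steps in code, a sketch with made-up data standing in for the slide's values: fit the line, form predicted values, and sum the squared residuals to obtain SSE.

```python
import numpy as np

# Hypothetical age (x) and shoe-size (y) data.
x = np.array([4.0, 5.0, 6.0, 7.0, 8.0, 9.0, 10.0])
y = np.array([9.5, 10.5, 11.0, 12.0, 12.5, 13.5, 14.0])

b1, b0 = np.polyfit(x, y, deg=1)   # slope and intercept of the least-squares line
y_pred = b0 + b1 * x               # predicted y for each observation
residuals = y - y_pred             # prediction errors
sse = np.sum(residuals ** 2)       # Sum of Squares Error
print(f"SSE = {sse:.3f}")
```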

The Regression Sum of Squares. Some of the total variation in y is explained by the regression, while the residual is the error in prediction that remains even after regression: the total sum of squares equals the sum of squares explained by the regression plus the sum of squares of the error still left after regression. SST = SSR + SSE, or equivalently SSR = SST - SSE.
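Numerically, with the SST and SSR figures reported in the ANOVA output further below (SST ≈ 48.8, SSR ≈ 31.1), the identity gives SSE ≈ 48.8 - 31.1 = 17.7 for the shoe size example.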

R-square. The proportion of variation in y that is explained by the regression model is called R². R² = SSR/SST = (SST - SSE)/SST. For the shoe size example, R² = 31.1/48.8 ≈ 0.64 (using the SSR and SST reported in the ANOVA output below). R² ranges from 0 to 1, with 1 indicating a perfect linear relationship between x and y.
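In code, R² follows directly from SSE and SST. For a simple regression with one x it also equals the squared correlation between x and y, which provides a quick check (same made-up data as above):

```python
import numpy as np

x = np.array([4.0, 5.0, 6.0, 7.0, 8.0, 9.0, 10.0])   # hypothetical data, as above
y = np.array([9.5, 10.5, 11.0, 12.0, 12.5, 13.5, 14.0])

b1, b0 = np.polyfit(x, y, deg=1)
sse = np.sum((y - (b0 + b1 * x)) ** 2)
sst = np.sum((y - y.mean()) ** 2)

r_squared = (sst - sse) / sst                   # = SSR / SST
r_squared_check = np.corrcoef(x, y)[0, 1] ** 2  # squared correlation, same value here
print(f"R^2 = {r_squared:.3f} (check: {r_squared_check:.3f})")
```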

Mean Squared Error. MSR = SSR / df(regression) and MSE = SSE / df(error), where df is the degrees of freedom. For the regression, df = k, the number of independent variables; for the error, df = n - k - 1. The degrees of freedom for error is the number of observations in the sample that remain free to contribute to the overall error after the coefficients have been estimated.

Standard Error. Standard Error (SE) = √MSE. Standard Error is a measure of how well the model will be able to predict y. It can be used to construct a confidence interval for the prediction.
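A sketch tying together the last two slides, with the same made-up data as above: MSR and MSE come from dividing SSR and SSE by their degrees of freedom, and SE = √MSE. The rough ±2·SE band at the end only illustrates how SE feeds an interval; an exact prediction interval would also use a t multiplier and account for how far the new x is from the mean of x.

```python
import numpy as np

x = np.array([4.0, 5.0, 6.0, 7.0, 8.0, 9.0, 10.0])   # hypothetical data, as above
y = np.array([9.5, 10.5, 11.0, 12.0, 12.5, 13.5, 14.0])
n, k = len(y), 1                                      # n observations, k = 1 predictor

b1, b0 = np.polyfit(x, y, deg=1)
sse = np.sum((y - (b0 + b1 * x)) ** 2)
sst = np.sum((y - y.mean()) ** 2)
ssr = sst - sse

msr = ssr / k              # mean square for regression, df = k
mse = sse / (n - k - 1)    # mean square error, df = n - k - 1
se = np.sqrt(mse)          # standard error of the regression

x_new = 7.5
y_new = b0 + b1 * x_new
print(f"MSR = {msr:.3f}, MSE = {mse:.3f}, SE = {se:.3f}")
print(f"prediction at x = {x_new}: {y_new:.2f} +/- {2 * se:.2f} (rough band)")
```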

Summary Output & ANOVA. The regression output for the shoe size example has two parts (most numeric entries are not reproduced in this transcript).
Regression Statistics: Multiple R, R Square, Adjusted R Square, Standard Error, and Observations = 13. Here R Square = SSR/SST = 31.1/48.8 ≈ 0.64, and Standard Error = √MSE.
ANOVA (columns df, SS, MS, F, Significance F): Regression df = 1 (k); Residual (Error) df = 11 (n - k - 1); Total df = 12 (n - 1). F = MSR/MSE = 31.1/1.6 ≈ 19.4, and Significance F is the p-value for the regression.
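Output of this form can be reproduced with a regression library. Below is a minimal sketch using Python's statsmodels package (a library choice assumed here, not specified by the slides) and made-up data; the summary reports the coefficients, R-squared, the F statistic, and its p-value (the "Significance F").

```python
import numpy as np
import statsmodels.api as sm

x = np.array([4.0, 5.0, 6.0, 7.0, 8.0, 9.0, 10.0])   # hypothetical data, as above
y = np.array([9.5, 10.5, 11.0, 12.0, 12.5, 13.5, 14.0])

X = sm.add_constant(x)        # adds the intercept column so the model is y = b0 + b1*x
model = sm.OLS(y, X).fit()    # ordinary least squares

print(model.summary())        # coefficients, R-squared, F statistic, p-values
print("R^2 =", model.rsquared)
print("F =", model.fvalue, "  Significance F (p-value) =", model.f_pvalue)
```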

The Hypothesis for Regression. H0: β1 = β2 = β3 = … = 0; Ha: at least one of the βs is not 0. If all the βs are 0, then y is not related to any of the x variables; the alternative we try to support is that there is in fact a relationship. The Significance F is the p-value for this test.
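The F statistic and its p-value can also be computed directly from the sums of squares. A sketch, again with made-up data, using scipy's F distribution (an assumed helper, not part of the slides) for the upper-tail probability:

```python
import numpy as np
from scipy import stats

x = np.array([4.0, 5.0, 6.0, 7.0, 8.0, 9.0, 10.0])   # hypothetical data, as above
y = np.array([9.5, 10.5, 11.0, 12.0, 12.5, 13.5, 14.0])
n, k = len(y), 1

b1, b0 = np.polyfit(x, y, deg=1)
sse = np.sum((y - (b0 + b1 * x)) ** 2)
sst = np.sum((y - y.mean()) ** 2)
ssr = sst - sse

f_stat = (ssr / k) / (sse / (n - k - 1))              # F = MSR / MSE
p_value = stats.f.sf(f_stat, dfn=k, dfd=n - k - 1)    # upper-tail probability = Significance F
print(f"F = {f_stat:.2f}, Significance F = {p_value:.4f}")
```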