
Simple Linear Regression (OLS)

Types of Correlation: positive correlation, negative correlation, no correlation.

Simple linear regression describes the linear relationship between an independent variable (X), plotted on the x-axis, and a dependent variable (Y), plotted on the y-axis.

[Figures: scatter plots of Y against X showing the fitted regression line and the residuals ε]

Fitting data to a linear model: Yᵢ = β₀ + β₁Xᵢ + εᵢ, where β₀ is the intercept, β₁ the slope, and εᵢ the residuals.

How to fit data to a linear model? The Ordinary Least Squares method (OLS).

Least Squares Regression. Model line: Ŷᵢ = a + bXᵢ. Residual: εᵢ = Yᵢ − Ŷᵢ. Sum of squares of residuals: SSE = ∑εᵢ² = ∑(Yᵢ − Ŷᵢ)². We must find the values of a and b that minimise this sum.

Y = a + bX, where b = ∑(Xᵢ − X̄)(Yᵢ − Ȳ) / ∑(Xᵢ − X̄)² and a = Ȳ − bX̄.
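
As a rough sketch, these closed-form formulas can be computed directly in Python with NumPy; the x and y arrays below are made-up illustration data:

```python
import numpy as np

# Hypothetical example data
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

x_bar, y_bar = x.mean(), y.mean()

# b = sum((X - X bar)(Y - Y bar)) / sum((X - X bar)^2)
b = np.sum((x - x_bar) * (y - y_bar)) / np.sum((x - x_bar) ** 2)
# a = Y bar - b * X bar
a = y_bar - b * x_bar

print(f"intercept a = {a:.3f}, slope b = {b:.3f}")
```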

Regression Coefficients

Required Statistics

Descriptive Statistics

Regression Statistics

Variance in Y to be explained by the predictors (SST).

Variance in Y explained by X₁ (SSR) and variance NOT explained by X₁ (SSE); SST = SSR + SSE.

Regression Statistics

Coefficient of Determination, R² = SSR / SST = 1 − SSE / SST, used to judge the adequacy of the regression model.
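
A minimal sketch of this variance decomposition in Python (NumPy), reusing the same kind of hypothetical data as above:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

# Fit by ordinary least squares (same closed-form as above)
b = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
a = y.mean() - b * x.mean()
y_hat = a + b * x

sst = np.sum((y - y.mean()) ** 2)      # total variance to be explained
ssr = np.sum((y_hat - y.mean()) ** 2)  # variance explained by the model
sse = np.sum((y - y_hat) ** 2)         # variance not explained (residual)

r_squared = ssr / sst                  # equivalently 1 - sse/sst
print(f"SST={sst:.3f}  SSR={ssr:.3f}  SSE={sse:.3f}  R^2={r_squared:.3f}")
```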

Regression Statistics: correlation measures the strength of the linear association between two variables.

Regression Statistics: Standard Error of the regression model, s = √(SSE / (n − 2)).

ANOVA to test the significance of the regression:

Source       df       SS    MS         F          P-value
Regression   1        SSR   SSR / df   MSR / MSE  P(F)
Residual     n − 2    SSE   SSE / df
Total        n − 1    SST

If P(F) < α, then we know that we get significantly better prediction of Y from the regression model than by just predicting the mean of Y.
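
One way to obtain this F-test in practice is with the statsmodels package (assuming it is available); the fitted results expose the overall F statistic and its p-value directly. The data here are hypothetical:

```python
import numpy as np
import statsmodels.api as sm

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8, 12.3])

X = sm.add_constant(x)               # adds the intercept column
model = sm.OLS(y, X).fit()

print(model.fvalue, model.f_pvalue)  # F = MSR/MSE and P(F)
print(model.summary())               # full ANOVA-style output
```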

Hypothesis Tests for Regression Coefficients: H₀: β₁ = 0 versus H₁: β₁ ≠ 0. Test statistic t = b₁ / SE(b₁) with n − 2 degrees of freedom; reject H₀ if |t| > t(α/2, n − 2).

Hypothesis Test for the Correlation Coefficient: H₀: ρ = 0. Test statistic t = r√(n − 2) / √(1 − r²); we would reject the null hypothesis if |t| > t(α/2, n − 2).
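
A short sketch of this test using SciPy; pearsonr returns the correlation together with the two-sided p-value, and the equivalent t statistic is computed by hand for comparison (data are hypothetical):

```python
import numpy as np
from scipy import stats

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8, 12.3])

r, p_value = stats.pearsonr(x, y)              # correlation and two-sided p-value

# Equivalent t statistic: t = r * sqrt(n - 2) / sqrt(1 - r^2), df = n - 2
n = len(x)
t = r * np.sqrt(n - 2) / np.sqrt(1 - r**2)
t_crit = stats.t.ppf(1 - 0.05 / 2, df=n - 2)   # two-sided, alpha = 0.05

print(f"r = {r:.3f}, p = {p_value:.4f}, t = {t:.3f}, reject H0: {abs(t) > t_crit}")
```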

Diagnostic Tests For Regressions Expected distribution of residuals for a linear model with normally distributed residuals (errors).

Diagnostic Tests For Regressions Residuals for a non-linear fit

Diagnostic Tests For Regressions Residuals for a quadratic function or polynomial

Diagnostic Tests For Regressions Residuals are not homogeneous (increasing in variance)
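
A sketch of two common residual diagnostic plots with Matplotlib (hypothetical data; the fit is recomputed as in the earlier sketches). Curvature in the first plot suggests a non-linear relationship; a fanning pattern suggests non-homogeneous variance:

```python
import numpy as np
import matplotlib.pyplot as plt

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8, 12.3])
b = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
a = y.mean() - b * x.mean()
residuals = y - (a + b * x)

fig, axes = plt.subplots(1, 2, figsize=(8, 3))

# Residuals vs fitted values: look for curvature or fanning variance
axes[0].scatter(a + b * x, residuals)
axes[0].axhline(0, color="grey")
axes[0].set(xlabel="fitted values", ylabel="residuals")

# Histogram of residuals: should look roughly normal
axes[1].hist(residuals, bins=5)
axes[1].set(xlabel="residuals", ylabel="count")

plt.tight_layout()
plt.show()
```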

Regression – important points 1. Ensure that the range of values sampled for the predictor variable is large enough to capture the full range of responses of the response variable.


Regression – important points 2. Ensure that the distribution of predictor values is approximately uniform within the sampled range.


Assumptions of Regression 1. The linear model correctly describes the functional relationship between X and Y.


Assumptions of Regression 2. The X variable is measured without error.

Assumptions of Regression 3. For any given value of X, the sampled Y values are independent. 4. Residuals (errors) are normally distributed. 5. Variances are constant along the regression line.

Multiple Linear Regression (MLR)

The linear model with a single predictor variable X can easily be extended to two or more predictor variables: Yᵢ = β₀ + β₁X₁ᵢ + β₂X₂ᵢ + … + βₖXₖᵢ + εᵢ.

Venn diagram of Y, X₁ and X₂: unique variance explained by X₁, unique variance explained by X₂, common variance explained by X₁ and X₂, and variance NOT explained by X₁ and X₂.

A "good" model (Venn diagram of Y, X₁ and X₂).

Partial Regression Coefficients (slopes): the regression coefficient of each X after controlling for (holding constant) the influence of the other predictor variables on both that X and Y. Model: Yᵢ = b₀ + b₁X₁ᵢ + … + bₖXₖᵢ + εᵢ, where b₀ is the intercept and εᵢ the residuals.

The matrix algebra of Ordinary Least Squares. Intercept and slopes: b = (XᵀX)⁻¹XᵀY. Predicted values: Ŷ = Xb. Residuals: e = Y − Ŷ.
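
These matrix formulas translate almost literally into NumPy; the design matrix and data below are hypothetical:

```python
import numpy as np

# Hypothetical data with two predictors
X1 = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
X2 = np.array([2.0, 1.0, 4.0, 3.0, 6.0, 5.0])
Y  = np.array([3.1, 3.9, 7.2, 7.8, 11.1, 11.7])

# Design matrix with a leading column of ones for the intercept
X = np.column_stack([np.ones_like(X1), X1, X2])

# b = (X'X)^-1 X'Y  (solving the normal equations is preferable to an explicit inverse)
b = np.linalg.solve(X.T @ X, X.T @ Y)

Y_hat = X @ b          # predicted values
e = Y - Y_hat          # residuals

print("intercept and slopes:", b)
```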

Regression Statistics: how good is our model?

Regression Statistics: Coefficient of Determination, R² = SSR / SST, used to judge the adequacy of the regression model.

Regression Statistics: Adjusted R² is not biased upward by adding predictors. Adjusted R² = 1 − (1 − R²)(n − 1) / (n − k − 1), where n = sample size and k = number of independent variables.
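
A tiny helper illustrating the adjusted R² formula (the example values for R², n and k are made up):

```python
def adjusted_r_squared(r_squared: float, n: int, k: int) -> float:
    """Adjusted R^2 = 1 - (1 - R^2) * (n - 1) / (n - k - 1)."""
    return 1 - (1 - r_squared) * (n - 1) / (n - k - 1)

# e.g. R^2 = 0.90 with n = 30 observations and k = 4 predictors
print(adjusted_r_squared(0.90, n=30, k=4))   # ~0.884
```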

Regression Statistics: Standard Error of the regression model, s = √(SSE / (n − k − 1)).

ANOVA to test the significance of the regression. H₀: β₁ = β₂ = … = βₖ = 0; H₁: at least one βⱼ ≠ 0.

Source       df          SS    MS         F          P-value
Regression   k           SSR   SSR / df   MSR / MSE  P(F)
Residual     n − k − 1   SSE   SSE / df
Total        n − 1       SST

If P(F) < α, then we know that we get significantly better prediction of Y from the regression model than by just predicting the mean of Y.

Hypothesis Tests for Regression Coefficients: for each predictor, H₀: βⱼ = 0 versus H₁: βⱼ ≠ 0, tested with t = bⱼ / SE(bⱼ) on n − k − 1 degrees of freedom.

Diagnostic Tests For Regressions Expected distribution of residuals for a linear model with normally distributed residuals (errors).

Standardized Residuals

Model Selection: avoiding predictors (Xs) that do not contribute significantly to model prediction.

Model Selection
- Forward selection: the 'best' predictor variables are entered, one by one.
- Backward elimination: the 'worst' predictor variables are eliminated, one by one.

Forward Selection

Backward Elimination
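
A rough sketch of forward selection based on individual coefficient p-values, using statsmodels. The function name, the alpha threshold, and the stopping rule are illustrative choices, not a fixed algorithm; backward elimination would mirror this loop by repeatedly removing the least significant predictor instead:

```python
import pandas as pd
import statsmodels.api as sm

def forward_selection(X: pd.DataFrame, y: pd.Series, alpha: float = 0.05) -> list:
    """Enter the 'best' remaining predictor (smallest p-value) one at a time,
    stopping when no candidate is significant at level alpha."""
    selected: list = []
    remaining = list(X.columns)
    while remaining:
        pvals = {}
        for var in remaining:
            model = sm.OLS(y, sm.add_constant(X[selected + [var]])).fit()
            pvals[var] = model.pvalues[var]
        best = min(pvals, key=pvals.get)
        if pvals[best] >= alpha:
            break
        selected.append(best)
        remaining.remove(best)
    return selected
```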

Model Selection: The General Case. Compare a full model with a reduced model that omits q of the predictors using a partial F-test, F = [(SSE_reduced − SSE_full) / q] / MSE_full. Reject H₀ (the omitted predictors contribute nothing) if F > F(α; q, n − k − 1).

Multicollinearity
- The degree of correlation between the Xs.
- A high degree of multicollinearity produces unacceptable uncertainty (large variance) in regression coefficient estimates (i.e., large sampling variation).
- Estimates of the slopes are imprecise, and even the signs of the coefficients may be misleading.
- t-tests may fail to reveal significant factors.

Scatter Plot

Multicollinearity
- If the F-test for significance of regression is significant, but tests on the individual regression coefficients are not, multicollinearity may be present.
- Variance Inflation Factors (VIFs) are very useful measures of multicollinearity. If any VIF exceeds 5, multicollinearity is a problem.
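
VIFs can be computed with statsmodels' variance_inflation_factor (assuming the package is available); the data below are hypothetical, with x2 constructed to be nearly collinear with x1 so that its VIF comes out large:

```python
import pandas as pd
from statsmodels.stats.outliers_influence import variance_inflation_factor
from statsmodels.tools.tools import add_constant

# Hypothetical predictor data
X = pd.DataFrame({
    "x1": [1.0, 2.0, 3.0, 4.0, 5.0, 6.0],
    "x2": [2.1, 3.9, 6.2, 8.1, 9.9, 12.1],   # nearly a multiple of x1
    "x3": [5.0, 3.0, 6.0, 2.0, 7.0, 4.0],
})

Xc = add_constant(X)                          # intercept column for the auxiliary regressions
vifs = {col: variance_inflation_factor(Xc.values, i)
        for i, col in enumerate(Xc.columns) if col != "const"}
print(vifs)   # any VIF above ~5 flags a multicollinearity problem
```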

Thank You!