Bivariate Regression. Bivariate regression analyzes the relationship between two variables.


Bivariate Regression

What is Bivariate Regression? Bivariate regression analyzes the relationship between two variables. It specifies one dependent (response) variable and one independent (predictor) variable. The hypothesized relationship may be linear, quadratic, or some other form. McGraw-Hill/Irwin © 2007 The McGraw-Hill Companies, Inc. All rights reserved.

Bivariate Regression: Model Form. (The model equation appeared as an image on the original slide.)

Regression Terminology: Models and Parameters. The unknown parameters are β₀ (intercept) and β₁ (slope). The assumed model for a linear relationship is yᵢ = β₀ + β₁xᵢ + εᵢ for all observations (i = 1, 2, …, n). The error term εᵢ is not observable and is assumed to be normally distributed with mean 0 and standard deviation σ.

The Simple Linear Regression Model. The population simple linear regression model is Y = β₀ + β₁X + ε, where β₀ + β₁X is the nonrandom (systematic) component and ε is the random component. Y is the dependent variable, the variable we wish to explain or predict; X is the independent variable, also called the predictor variable; and ε is the error term, the only random component in the model and thus the only source of randomness in Y. β₀ is the intercept of the systematic component of the regression relationship, and β₁ is its slope. The conditional mean of Y is E[Y|X] = β₀ + β₁X.
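To make the split between the systematic and random components concrete, here is a minimal Python sketch that generates data from the model Y = β₀ + β₁X + ε. The parameter values, sample size, and seed are made up purely for illustration:

```python
import random

random.seed(1)  # fixed seed so the illustration is reproducible

beta0, beta1, sigma = 2.0, 0.5, 1.0   # hypothetical population parameters

x = [float(i) for i in range(1, 21)]
# Systematic (nonrandom) component: the conditional mean E[Y|X] = beta0 + beta1*X
mean_y = [beta0 + beta1 * xi for xi in x]
# Random component: add a normal error with mean 0 and standard deviation sigma
y = [m + random.gauss(0.0, sigma) for m in mean_y]
```

Every observed y value is its conditional mean plus a normally distributed error, which is exactly the model statement above.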

Picturing the Simple Linear Regression Model. The simple linear regression model gives an exact linear relationship between the expected (average) value of Y, the dependent variable, and X, the independent or predictor variable: E[Yᵢ] = β₀ + β₁Xᵢ. Actual observed values of Y differ from the expected value by an unexplained or random error: Yᵢ = E[Yᵢ] + εᵢ = β₀ + β₁Xᵢ + εᵢ. (Regression plot: the line E[Y] = β₀ + β₁X has intercept β₀ and slope β₁; the error εᵢ is the vertical distance from the observed point (Xᵢ, Yᵢ) to the line.)

Estimation: The Method of Least Squares. Estimation of a simple linear regression relationship involves finding estimated or predicted values of the intercept and slope of the linear regression line. The estimated regression equation is Y = b₀ + b₁X + e, where b₀ estimates the intercept of the population regression line, β₀; b₁ estimates its slope, β₁; and e stands for the observed errors, the residuals from fitting the estimated regression line b₀ + b₁X to a set of n points.

Fitting a Regression Line. (Four panels: the data; three errors from an arbitrary fitted line; three errors from the least squares regression line; the errors from the least squares regression line are minimized.)

Errors in Regression. (Figure: the error for observation i is the vertical distance between the observed point at Xᵢ and the fitted regression line.)

Least Squares Regression

Ordinary Least Squares Formulas. The ordinary least squares (OLS) method estimates the slope and intercept of the regression line so that the residuals are small. The sum of the residuals is 0, and the sum of the squared residuals is SSE.

Ordinary Least Squares Formulas: Slope and Intercept. The OLS estimator for the slope is b₁ = Σ(xᵢ − x̄)(yᵢ − ȳ) / Σ(xᵢ − x̄)², or equivalently b₁ = (Σxᵢyᵢ − n·x̄·ȳ) / (Σxᵢ² − n·x̄²). The OLS estimator for the intercept is b₀ = ȳ − b₁x̄.
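The slope and intercept formulas can be applied directly. A minimal sketch in Python, using a small made-up data set (the x and y values here are illustrative, not from the text):

```python
# Made-up illustrative data
x = [1, 2, 3, 4, 5]
y = [2.1, 3.9, 6.2, 7.8, 10.1]
n = len(x)

xbar = sum(x) / n
ybar = sum(y) / n

# Slope: b1 = sum((xi - xbar)(yi - ybar)) / sum((xi - xbar)^2)
sxy = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
sxx = sum((xi - xbar) ** 2 for xi in x)
b1 = sxy / sxx

# Intercept: b0 = ybar - b1 * xbar
b0 = ybar - b1 * xbar
```

For these data, b₁ works out to 1.99 and b₀ to 0.05, so the fitted line is ŷ = 0.05 + 1.99x.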

Ordinary Least Squares Formulas: Assessing Fit. We want to explain the total variation in Y around its mean, SST (total sum of squares): SST = Σ(yᵢ − ȳ)². The regression sum of squares, SSR = Σ(ŷᵢ − ȳ)², is the explained variation in Y.

Ordinary Least Squares Formulas: Assessing Fit (continued). The error sum of squares, SSE = Σ(yᵢ − ŷᵢ)², is the unexplained variation in Y. If the fit is good, SSE will be relatively small compared to SST; a perfect fit is indicated by SSE = 0. The magnitude of SSE depends on n and on the units of measurement.

Ordinary Least Squares Formulas: Coefficient of Determination. R² = SSR / SST is a measure of relative fit based on a comparison of SSR and SST, with 0 ≤ R² ≤ 1. Often expressed as a percent, an R² of 1 (i.e., 100%) indicates a perfect fit. In a bivariate regression, R² = r², the square of the sample correlation coefficient.
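The decomposition SST = SSR + SSE and the identity R² = r² can both be checked numerically. A sketch using made-up illustrative data (not from the text):

```python
import math

# Made-up illustrative data
x = [1, 2, 3, 4, 5]
y = [2.1, 3.9, 6.2, 7.8, 10.1]
n = len(x)
xbar, ybar = sum(x) / n, sum(y) / n

sxx = sum((xi - xbar) ** 2 for xi in x)
syy = sum((yi - ybar) ** 2 for yi in y)
sxy = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))

b1 = sxy / sxx
b0 = ybar - b1 * xbar
yhat = [b0 + b1 * xi for xi in x]

sst = syy                                             # total variation in Y
ssr = sum((yh - ybar) ** 2 for yh in yhat)            # explained variation
sse = sum((yi - yh) ** 2 for yi, yh in zip(y, yhat))  # unexplained variation

r_squared = ssr / sst
r = sxy / math.sqrt(sxx * syy)  # sample correlation coefficient
# SST = SSR + SSE, and r_squared equals r**2 (up to floating-point rounding)
```

For these data, R² is about 0.997, so nearly all of the variation in Y around its mean is explained by the regression.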

Tests for Significance: Standard Error of Regression. The standard error of the regression, s_yx = √(SSE / (n − 2)), is an overall measure of model fit. If the fitted model's predictions are perfect (SSE = 0), then s_yx = 0; thus, a small s_yx indicates a better fit. It is used to construct confidence intervals. The magnitude of s_yx depends on the units of measurement of Y and on the magnitude of the data.

Tests for Significance: Confidence Intervals for Slope and Intercept. Standard error of the slope: s_b₁ = s_yx / √(Σ(xᵢ − x̄)²). Standard error of the intercept: s_b₀ = s_yx · √(1/n + x̄² / Σ(xᵢ − x̄)²).

Tests for Significance: Confidence Intervals for Slope and Intercept (continued). Confidence interval for the true slope: b₁ ± t·s_b₁. Confidence interval for the true intercept: b₀ ± t·s_b₀. In both, t is the critical value t_α/2 with n − 2 degrees of freedom.
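The standard errors and confidence intervals can be computed from the same quantities as before. A sketch with made-up illustrative data; the critical value 3.182 is t_0.025 with 3 degrees of freedom, taken from a t table rather than computed here:

```python
import math

# Made-up illustrative data
x = [1, 2, 3, 4, 5]
y = [2.1, 3.9, 6.2, 7.8, 10.1]
n = len(x)
xbar, ybar = sum(x) / n, sum(y) / n
sxx = sum((xi - xbar) ** 2 for xi in x)
b1 = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) / sxx
b0 = ybar - b1 * xbar
sse = sum((yi - (b0 + b1 * xi)) ** 2 for xi, yi in zip(x, y))

# Standard error of the regression: s_yx = sqrt(SSE / (n - 2))
s_yx = math.sqrt(sse / (n - 2))

# Standard errors of the slope and intercept
s_b1 = s_yx / math.sqrt(sxx)
s_b0 = s_yx * math.sqrt(1 / n + xbar ** 2 / sxx)

# 95% two-tailed critical value with n - 2 = 3 df (from a t table)
t_crit = 3.182

ci_slope = (b1 - t_crit * s_b1, b1 + t_crit * s_b1)
ci_intercept = (b0 - t_crit * s_b0, b0 + t_crit * s_b0)
```

Note that the interval widens with s_yx: a noisier fit (larger SSE) gives less precise estimates of the slope and intercept.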

Tests for Significance: Hypothesis Tests. If β₁ = 0, then X cannot influence Y and the regression model collapses to a constant β₀ plus random error. The hypotheses to be tested are H₀: β₁ = 0 versus H₁: β₁ ≠ 0.

Tests for Significance: Hypothesis Tests (continued). A t test is used with ν = n − 2 degrees of freedom. The test statistics are t = b₁ / s_b₁ for the slope and t = b₀ / s_b₀ for the intercept. The critical value t_{n−2} is obtained from Appendix D or Excel for a given α. Reject H₀ if |t| exceeds the critical value, or if the p-value < α.
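The t test for the slope can be carried out end to end. A sketch using the same kind of made-up data as above; 3.182 is the two-tailed critical value for α = 0.05 with 3 degrees of freedom, from a t table:

```python
import math

# Made-up illustrative data
x = [1, 2, 3, 4, 5]
y = [2.1, 3.9, 6.2, 7.8, 10.1]
n = len(x)
xbar, ybar = sum(x) / n, sum(y) / n
sxx = sum((xi - xbar) ** 2 for xi in x)
b1 = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) / sxx
b0 = ybar - b1 * xbar
sse = sum((yi - (b0 + b1 * xi)) ** 2 for xi, yi in zip(x, y))
s_yx = math.sqrt(sse / (n - 2))
s_b1 = s_yx / math.sqrt(sxx)

# Test statistic for H0: beta1 = 0 versus H1: beta1 != 0
t_slope = b1 / s_b1

# Two-tailed critical value for alpha = 0.05 with n - 2 = 3 df (t table)
t_crit = 3.182
reject_h0 = abs(t_slope) > t_crit
```

Here the test statistic is far larger than the critical value, so H₀: β₁ = 0 is rejected: the slope is statistically significant for these data.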