Regression Analysis Module 3

Regression Regression is the attempt to explain the variation in a dependent variable using the variation in one or more independent variables. A regression model thus suggests, but does not by itself prove, a causal explanation. If the independent variable(s) sufficiently explain the variation in the dependent variable, the model can be used for prediction. [Figure: scatter plot of the dependent variable (y) against the independent variable (x)]

Simple Linear Regression The output of a regression is a function that predicts the dependent variable (y) based upon values of the independent variable (x). Simple regression fits a straight line to the data:

y′ = b0 + b1x ± ε

where b0 is the y-intercept, b1 is the slope (Δy/Δx), and ε is the error term.
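As a concrete illustration, the slope b1 and intercept b0 can be estimated with the standard closed-form least-squares formulas. A minimal sketch in Python with NumPy; the x and y values here are hypothetical data invented for the example:

```python
import numpy as np

# Hypothetical sample data (invented for illustration).
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])

# Closed-form least-squares estimates:
#   b1 = sum((x - mean(x)) * (y - mean(y))) / sum((x - mean(x))^2)
#   b0 = mean(y) - b1 * mean(x)
b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
b0 = y.mean() - b1 * x.mean()

y_hat = b0 + b1 * x      # fitted values
residuals = y - y_hat    # the error term for each observation
```

With an intercept in the model, the residuals of a least-squares fit always sum to (numerically) zero.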

Simple Linear Regression The function will make a prediction for each observed data point. The observation is denoted by y and the prediction is denoted by ŷ.

Simple Linear Regression For each observation, the variation can be described as:

y = ŷ + ε
(Actual = Explained + Error)

Regression A least squares regression selects the line with the lowest total sum of squared prediction errors. This value is called the Sum of Squares of Error, or SSE.
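This minimization can be checked directly: the least-squares line should yield a smaller SSE than any competing line. A small sketch with hypothetical data, comparing the fitted line against an arbitrarily perturbed one:

```python
import numpy as np

# Hypothetical data (invented for illustration).
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.0, 4.1, 5.9, 8.2, 9.8])

# np.polyfit returns coefficients from highest degree down: [slope, intercept].
b1, b0 = np.polyfit(x, y, 1)

def sse(intercept, slope):
    """Sum of squared prediction errors for a candidate line."""
    return np.sum((y - (intercept + slope * x)) ** 2)

sse_best = sse(b0, b1)               # SSE of the least-squares line
sse_other = sse(b0 + 0.5, b1 - 0.2)  # SSE of an arbitrary competing line
```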

Calculating SSR The Sum of Squares Regression (SSR) is the sum of the squared differences between the prediction for each observation and the population mean, ȳ.

Regression Formulas The Total Sum of Squares (SST) is equal to SSR + SSE. Mathematically:

SSR = Σ(ŷ − ȳ)²  (measure of explained variation)
SSE = Σ(y − ŷ)²  (measure of unexplained variation)
SST = SSR + SSE = Σ(y − ȳ)²  (measure of total variation in y)

The Coefficient of Determination The proportion of total variation (SST) that is explained by the regression (SSR) is known as the Coefficient of Determination, often referred to as R².

R² = SSR / SST = SSR / (SSR + SSE)

The value of R² can range between 0 and 1; the higher its value, the more of the variation in y the regression model explains. It is often expressed as a percentage.
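The decomposition and R² can be verified numerically. A sketch with hypothetical data; SST should equal SSR + SSE, which holds exactly when the model includes an intercept:

```python
import numpy as np

# Hypothetical data (invented for illustration).
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([1.2, 2.9, 3.1, 4.8, 5.1, 6.5])

slope, intercept = np.polyfit(x, y, 1)
y_hat = intercept + slope * x

ssr = np.sum((y_hat - y.mean()) ** 2)  # explained variation
sse = np.sum((y - y_hat) ** 2)         # unexplained variation
sst = np.sum((y - y.mean()) ** 2)      # total variation

r_squared = ssr / sst                  # coefficient of determination
```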

Standard Error of Regression The Standard Error of a regression is a measure of its variability. It can be used in a similar manner to standard deviation, allowing for prediction intervals: ŷ ± 2 standard errors gives approximately a 95% prediction interval, and ŷ ± 3 standard errors approximately 99%. The Standard Error is calculated by taking the square root of the average squared prediction error:

Standard Error = √( SSE / (n − k) )

where n is the number of observations in the sample and k is the total number of parameters estimated in the model.
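A sketch of the standard-error calculation on hypothetical data; here k = 2, counting the intercept and the slope as the model's estimated parameters:

```python
import numpy as np

# Hypothetical data (invented for illustration).
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0])
y = np.array([2.2, 2.8, 4.5, 4.9, 6.3, 6.9, 8.4])

slope, intercept = np.polyfit(x, y, 1)
y_hat = intercept + slope * x

n, k = len(y), 2                    # 7 observations, 2 estimated parameters
sse = np.sum((y - y_hat) ** 2)
std_error = np.sqrt(sse / (n - k))  # standard error of the regression

# Rough ~95% band: prediction plus or minus 2 standard errors.
lower = y_hat - 2 * std_error
upper = y_hat + 2 * std_error
```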

The output of a simple regression is the coefficient β and the constant A. The equation is then:

y = A + βx + ε

where ε is the residual error. β is the per-unit change in the dependent variable for each unit change in the independent variable. Mathematically:

β = Δy / Δx

Multiple Linear Regression More than one independent variable can be used to explain variance in the dependent variable, as long as the independent variables are not themselves strongly linearly related. A multiple regression takes the form:

y = A + β1X1 + β2X2 + … + βkXk + ε

where k is the number of independent variables (parameters).
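A multiple regression of this form can be estimated by least squares on a design matrix whose first column is all ones (for the constant A). A minimal sketch with two hypothetical predictors:

```python
import numpy as np

# Hypothetical data (invented for illustration): two predictors.
x1 = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
x2 = np.array([2.0, 1.0, 4.0, 3.0, 6.0, 5.0])
y  = np.array([5.1, 5.9, 9.2, 9.8, 13.9, 14.2])

# Design matrix: column of ones for the constant, then each predictor.
X = np.column_stack([np.ones_like(x1), x1, x2])

# Least-squares solution for [A, beta1, beta2].
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
A, beta1, beta2 = coef

y_hat = X @ coef
sse = np.sum((y - y_hat) ** 2)
```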

Multicollinearity Multicollinearity is a condition in which two or more independent variables are highly linearly correlated. It inflates the variance of the estimated coefficients and can make the least-squares computation numerically unstable. Example table of correlations:

        Y       X1      X2
Y     1.000
X1            1.000
X2                    1.000

A correlations table can suggest which independent variables may be significant. Generally, an independent variable that has more than a 0.3 correlation with the dependent variable and less than a 0.7 correlation with any other independent variable can be included as a possible predictor.
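One common diagnostic is to compute the full correlation matrix of the candidate predictors. A sketch with synthetic data, where x2 is deliberately constructed to be nearly collinear with x1 so that the 0.7 rule of thumb flags it:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic predictors: x2 is built from x1, so the two are nearly collinear.
x1 = rng.normal(size=100)
x2 = 0.95 * x1 + rng.normal(scale=0.1, size=100)
x3 = rng.normal(size=100)              # an unrelated predictor

# Rows of the input are treated as variables; corr is a 3x3 matrix.
corr = np.corrcoef([x1, x2, x3])

# Flag the pair of predictors correlated above the 0.7 rule of thumb.
collinear = abs(corr[0, 1]) > 0.7
```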

Nonlinear Regression Nonlinear functions can also be fit as regressions. Common choices include power, logarithmic, exponential, and logistic functions, but any continuous function can be used.
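Many of these nonlinear forms can be fit with the same linear-regression machinery after a transformation. For example, a power function y = a·x^b becomes linear on log-log axes: log y = log a + b·log x. A sketch using data generated from a known power law, so the fitted exponent can be checked:

```python
import numpy as np

# Data generated exactly from the power law y = 3 * x^1.5 (for illustration).
x = np.array([1.0, 2.0, 4.0, 8.0, 16.0])
y = 3.0 * x ** 1.5

# Linear regression on the log-transformed data recovers b and log(a).
b, log_a = np.polyfit(np.log(x), np.log(y), 1)
a = np.exp(log_a)
```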

Regression Output in Excel