Regression


Regression

For the purposes of this class:
–Does Y depend on X?
–Does a change in X cause a change in Y?
–Can Y be predicted from X?

Y = mX + b

(Figure: scatterplot showing the predicted values, the overall mean, and the actual values.)

When analyzing a regression-type data set, the first step is to plot the data:

(Table: X and Y data values.)

The next step is to determine the line that 'best fits' these points. It appears this line would be sloped upward and linear (straight).

The line of best fit is the sample regression of Y on X, and its position is fixed by two results:

1) The regression line passes through the point (X_avg, Y_avg) — here, (55, 138).
2) Its slope is at the rate of "m" units of Y per unit of X, where m = the regression coefficient (the slope in y = mx + b, i.e. rise/run) — here, 1.24.

Since the line passes through (55, 138) with slope 1.24, the Y-intercept is b = 138 – 1.24(55) = 69.8, giving Y = 1.24X + 69.8.
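The two results above can be sketched in code. The X and Y values below are hypothetical stand-ins (not the slide's own data table), chosen so that the least-squares fit reproduces a slope of 1.24 through the mean point (55, 138):

```python
# Hypothetical illustrative data (not the slide's original table), constructed
# so that the least-squares fit has slope 1.24 and passes through (55, 138).
X = [35, 45, 55, 65, 75]
Y = [115.2, 121.6, 138.0, 154.4, 160.8]

def least_squares(x, y):
    """Return (slope m, intercept b) of the least-squares line y = m*x + b."""
    n = len(x)
    x_bar = sum(x) / n
    y_bar = sum(y) / n
    sxx = sum((xi - x_bar) ** 2 for xi in x)                        # spread of X
    sxy = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y))  # co-variation
    m = sxy / sxx            # slope: rise over run
    b = y_bar - m * x_bar    # forces the line through (x_bar, y_bar)
    return m, b

m, b = least_squares(X, Y)
```

Note that `b = y_bar - m * x_bar` is exactly the statement that the fitted line pivots through the overall mean point.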

Testing the Regression Line for Significance

An F-test is used, based on the Model, Error, and Total SOS (sums of squares).
–Very similar to ANOVA.

Basically, we are testing whether the regression line has a significantly different slope than a line formed by using just Y_avg.
–If there is no difference, that means Y does not change as X changes (it stays around the average value).

To begin, we must first find the regression line that has the smallest Error SOS.

Error SOS

The regression line should pass through the overall average (the pivot point) with the slope that gives the smallest Error SOS. Error SOS is the summed squared distance between each point and the predicted line: it gives an index of the variability of the data points around the predicted line.

(Figure: scatterplot of dependent vs. independent values with the fitted line.)

For each X, we can predict Y from the fitted line.

(Table: X, Y_Actual, Y_Pred, and the squared difference for each point.)

Error SOS is calculated as the sum of (Y_Actual – Y_Predicted)². This gives us an index of how scattered the actual observations are around the predicted line. The more scattered the points, the larger the Error SOS will be. This is like analysis of variance, except we are using the predicted line instead of the mean value.
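The Error SOS computation can be sketched directly from its definition. The data below are hypothetical stand-ins (not the slide's own table), fitted by a line with slope 1.24 and intercept 69.8:

```python
# Hypothetical illustrative data (the slide's original table is not reproduced).
X = [35, 45, 55, 65, 75]
Y = [115.2, 121.6, 138.0, 154.4, 160.8]

def error_sos(x, y, m, b):
    """Sum of squared residuals around the fitted line y_hat = m*x + b."""
    return sum((yi - (m * xi + b)) ** 2 for xi, yi in zip(x, y))

# Scattered points inflate the result; points lying exactly on the line give 0.
sse = error_sos(X, Y, m=1.24, b=69.8)
```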

Total SOS

Calculated as the sum of (Y – Y_avg)². Gives us an index of how scattered our data set is around the overall Y average.

(Figure: scatterplot with a horizontal line at the overall Y average; regression line not shown.)

(Table: X, Y_Actual, Y_Average, and the squared difference for each point.)

Total SOS gives us an index of how scattered the data points are around the overall average. This is calculated the same way as for a single treatment in ANOVA.

What happens to Total SOS when all of the points are close to the overall average? What happens when the points form a non-horizontal linear trend?

Model SOS

Calculated as the sum of (Y_Predicted – Y_avg)². Gives us an index of how far away the predicted values are from the overall average value.

(Figure: the distance between each predicted Y and the overall mean.)

Model SOS

Gives us an index of how far away the predicted values are from the overall average value. What happens to Model SOS when all of the predicted values are close to the average value?

(Table: X, Y_Pred, Y_Average, and the squared difference for each point.)

All Together Now!!

(Table: X, Y_Actual, Y_Pred, Y_Avg, with the three squared differences for each point.)

SOS Error = Σ(Y_Actual – Y_Pred)²
SOS Total = Σ(Y_Actual – Y_Avg)²
SOS Model = Σ(Y_Pred – Y_Avg)²
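The three sums of squares, and the identity Total = Model + Error that ties them together for the least-squares line, can be sketched as follows (the data are hypothetical stand-ins, not the slide's own table):

```python
# Hypothetical illustrative data (not the slide's original table).
X = [35, 45, 55, 65, 75]
Y = [115.2, 121.6, 138.0, 154.4, 160.8]

def sos_partition(x, y):
    """Return (sos_model, sos_error, sos_total) for the least-squares line."""
    n = len(x)
    x_bar, y_bar = sum(x) / n, sum(y) / n
    m = (sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y))
         / sum((xi - x_bar) ** 2 for xi in x))
    b = y_bar - m * x_bar
    y_hat = [m * xi + b for xi in x]                    # predicted values
    sos_model = sum((yh - y_bar) ** 2 for yh in y_hat)  # predicted vs. mean
    sos_error = sum((yi - yh) ** 2 for yi, yh in zip(y, y_hat))  # actual vs. predicted
    sos_total = sum((yi - y_bar) ** 2 for yi in y)      # actual vs. mean
    return sos_model, sos_error, sos_total

sos_model, sos_error, sos_total = sos_partition(X, Y)
```

For the least-squares line (and only for that line), `sos_model + sos_error` equals `sos_total`: the total scatter splits cleanly into an explained and an unexplained part.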

Using SOS to Assess the Regression Line

Model SOS gives us an index of how 'different' the predicted values are from the average values.
–Bigger Model SOS = more different.
–Tells us how different a sloped line is from a line made up only of Y_avg.
–Remember, the regression line will pass through the overall average point.

Error SOS gives us an index of how different the predicted values are from the actual values.
–More variability = larger Error SOS = larger distances between predicted and actual values.

Magic of the F-test

The ratio of Model SOS to Error SOS (each divided by its degrees of freedom) gives us an overall index, the F statistic, used to indicate the relative 'difference' between the regression line and a line with slope of zero (all values = Y_avg).
–A large Model SOS and a small Error SOS = a large F statistic. Why does this indicate a significant difference?
–A small Model SOS and a large Error SOS = a small F statistic. Why does this indicate no significant difference?

Based on the sample size (degrees of freedom), each F statistic has an associated P-value.
–P < 0.05 (large F statistic): there is a significant difference between the regression line and the Y_avg line.
–P ≥ 0.05 (small F statistic): there is NO significant difference between the regression line and the Y_avg line.

F = Mean Model SOS / Mean Error SOS

Basically, this is an index that tells us how different the regression line is from Y_avg, relative to the scatter of the data around the predicted values.

(Figure: scatterplot of dependent vs. independent values illustrating the two mean squares.)
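The F statistic can be sketched from the two mean squares. The degrees of freedom used here (1 for the model, n – 2 for error) are the standard ones for simple linear regression; the numeric inputs in the example are hypothetical:

```python
def f_statistic(sos_model, sos_error, n):
    """F = (Model SOS / model df) / (Error SOS / error df), simple regression."""
    ms_model = sos_model / 1        # one predictor -> 1 model degree of freedom
    ms_error = sos_error / (n - 2)  # n points minus 2 fitted parameters (m and b)
    return ms_model / ms_error

# Hypothetical sums of squares: a big Model SOS over a small Error SOS
# yields a large F, pointing toward a significant slope.
f = f_statistic(1537.6, 40.0, 5)
```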

Y = 1.24X + 69.8: slope = 1.24 (rise/run), Y-intercept = 69.8. Use the regression line to predict a specific number or a specific change.

Correlation (r): Another measure of the mutual linear relationship between two variables.

'r' is a pure number without units or dimensions. 'r' is always between –1 and 1. Positive values indicate that y increases when x does, and negative values indicate that y decreases when x increases.
–What does r = 0 mean?

'r' is a measure of the intensity of the association observed between x and y.
–'r' does not predict; it only describes associations between variables.

(Figure: three scatterplots illustrating r > 0, r < 0, and r = 0.)

r is also called Pearson's correlation coefficient.

R-square

If we square r, we get rid of the negative sign (if any) and get an index of how close the data points are to the regression line. Allows us to decide how much confidence we have in making a prediction based on our model. Calculated as Model SOS / Total SOS.
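Pearson's r and r-square can be sketched directly from their definitions. The data below are hypothetical stand-ins (not the slide's own table), chosen so the least-squares fit is the line Y = 1.24X + 69.8:

```python
# Hypothetical illustrative data (not the slide's original table).
X = [35, 45, 55, 65, 75]
Y = [115.2, 121.6, 138.0, 154.4, 160.8]

def pearson_r(x, y):
    """Pearson correlation coefficient: a pure number between -1 and 1."""
    n = len(x)
    x_bar, y_bar = sum(x) / n, sum(y) / n
    sxy = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y))
    sxx = sum((xi - x_bar) ** 2 for xi in x)
    syy = sum((yi - y_bar) ** 2 for yi in y)
    return sxy / (sxx * syy) ** 0.5

r = pearson_r(X, Y)   # positive: Y rises with X
r_squared = r ** 2    # equals Model SOS / Total SOS for the fitted line
```

Squaring r drops the sign, which is why r² reads the same for an upward-sloping and a downward-sloping relationship of equal strength.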

r² = Model SOS / Total SOS

(Figure: scatterplot with the distances contributing to the Model SOS and to the Total SOS marked.)

r² = Model SOS / Total SOS (numerator/denominator): a small numerator and a big denominator give a small R².

(Figure: scatterplot in which the Model SOS is small relative to the Total SOS.)

R-square and Prediction Confidence

Finally…

If we have a significant relationship (based on the P-value), we can use the r-square value to judge how sure we are in making a prediction.