1
AP STATISTICS LESSON 3-3: LEAST-SQUARES REGRESSION
2
Regression Line
A regression line is a straight line that describes how a response variable y changes as an explanatory variable x changes. We often use a regression line to predict the value of y for a given value of x. Regression, unlike correlation, requires that we have an explanatory variable and a response variable. LSRL is the abbreviation for least-squares regression line; the LSRL is a mathematical model.
3
Least-Squares Regression Line
Error = observed y − predicted y. To find the best model, we square these errors and sum them; the least-squares line is the line with the smallest possible sum of squared errors.
4
Least-Squares Regression Line
The least-squares regression line of y on x is the line that makes the sum of the squares of the vertical distances of the data points from the line as small as possible.
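To make the criterion concrete, here is a minimal Python sketch with made-up data and a hypothetical helper name; it computes the sum of squared vertical distances (squared errors) for a candidate line, the quantity the least-squares line makes as small as possible.

```python
# Illustrative, made-up data (not from the lesson).
x = [1, 2, 3, 4, 5]            # explanatory variable
y = [2.1, 3.9, 6.2, 8.1, 9.8]  # response variable

def sum_of_squared_errors(a, b):
    """Sum of (observed y - predicted y)^2 for the candidate line y-hat = a + b*x."""
    return sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))

# Compare two candidate lines; the least-squares line is the one whose
# intercept a and slope b make this sum as small as possible.
print(sum_of_squared_errors(a=0.0, b=2.0))
print(sum_of_squared_errors(a=0.2, b=1.9))
```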
5
Equation of the LSRL
We have data on an explanatory variable x and a response variable y for n individuals. From the data, calculate the means x̄ and ȳ, the standard deviations sx and sy, and their correlation r. The least-squares regression line is the line ŷ = a + bx, with slope b = r(sy/sx) and intercept a = ȳ − b x̄.
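As a sketch of how those quantities combine, the Python snippet below computes the slope and intercept from the means, standard deviations, and correlation; the data and variable names are illustrative, not from the lesson.

```python
from statistics import mean, stdev

x = [1, 2, 3, 4, 5]            # explanatory variable (illustrative data)
y = [2.1, 3.9, 6.2, 8.1, 9.8]  # response variable

x_bar, y_bar = mean(x), mean(y)
s_x, s_y = stdev(x), stdev(y)
n = len(x)

# Correlation r: average (over n - 1) of the standardized products.
r = sum(((xi - x_bar) / s_x) * ((yi - y_bar) / s_y)
        for xi, yi in zip(x, y)) / (n - 1)

# Standard LSRL formulas: slope b = r * (s_y / s_x), intercept a = y_bar - b * x_bar.
b = r * (s_y / s_x)
a = y_bar - b * x_bar
print(f"y-hat = {a:.3f} + {b:.3f} x")
```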
6
What happened to y = mx+b?
y represents the observed (actual) values for y, and ŷ ("y-hat") represents the predicted values for y. We use ŷ in the equation of the regression line to emphasize that the line gives predicted values for any x. When you are solving regression problems, be sure to distinguish between y and ŷ. Hot tip: (x̄, ȳ) is always a point on the regression line!
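A short sketch of that distinction, reusing the same made-up data: it prints an observed y next to the predicted ŷ at the same x, and checks that (x̄, ȳ) falls on the fitted line.

```python
from statistics import mean, stdev

# Illustrative data (same made-up values as above).
x = [1, 2, 3, 4, 5]
y = [2.1, 3.9, 6.2, 8.1, 9.8]   # observed y values

x_bar, y_bar = mean(x), mean(y)
s_x, s_y = stdev(x), stdev(y)
r = sum(((xi - x_bar) / s_x) * ((yi - y_bar) / s_y)
        for xi, yi in zip(x, y)) / (len(x) - 1)
b = r * (s_y / s_x)             # slope
a = y_bar - b * x_bar           # intercept

def y_hat(x_value):
    """Predicted y (y-hat) from the regression line for a given x."""
    return a + b * x_value

# Observed y and predicted y-hat at x = 3 are generally different values.
print("observed y:", y[2], "  predicted y-hat:", y_hat(3))

# The point (x-bar, y-bar) always lies on the LSRL: y-hat(x_bar) equals y_bar.
print(abs(y_hat(x_bar) - y_bar) < 1e-9)   # True
```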