1
Section 5.2: Linear Regression: Fitting a Line to Bivariate Data
2
Dependent variable (response variable): the y variable.
Independent variable (predictor, or explanatory variable): the x variable.
3
The relationship y = a + bx is the equation of a straight line. The value b, called the slope of the line, is the amount by which y increases when x increases by 1 unit. The value a, called the intercept (or sometimes the y-intercept or vertical intercept) of the line, is the height of the line above the value x = 0.
4
Example: plot of the line y = 7 + 3x. The intercept is a = 7; when x increases by 1, y increases by b = 3.
5
Example: plot of the line y = 17 - 4x. The intercept is a = 17; when x increases by 1, y changes by b = -4 (i.e., decreases by 4).
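As a quick check on the two examples above, here is a minimal Python sketch (not part of the original slides) that evaluates each line at a few x values; it simply uses the slopes and intercepts stated on the slides and shows y changing by b for every 1-unit increase in x.

```python
def line(a, b, x):
    """Height of the line y = a + b*x at a given x."""
    return a + b * x

# Example 1: y = 7 + 3x  (a = 7, b = 3)
for x in range(4):
    print(x, line(7, 3, x))    # y increases by 3 each time x increases by 1

# Example 2: y = 17 - 4x  (a = 17, b = -4)
for x in range(4):
    print(x, line(17, -4, x))  # y changes by -4 each time x increases by 1
```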
6
Least-Squares Lines. The most widely used criterion for measuring the goodness of fit of a line y = a + bx to bivariate data (x1, y1), ..., (xn, yn) is the sum of the squared deviations about the line, Σ[yi - (a + bxi)]². The line that gives the best fit to the data is the one that minimizes this sum; it is called the least-squares line or the sample regression line.
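To make the criterion concrete, the sketch below (a hypothetical data set, not from the slides) computes the sum of squared vertical deviations for two candidate lines; the line with the smaller sum fits the data better.

```python
def sum_squared_deviations(a, b, xs, ys):
    """Sum of squared vertical deviations of the points from the line y = a + b*x."""
    return sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))

# Hypothetical bivariate data, used only to illustrate the criterion.
xs = [1, 2, 3, 4, 5]
ys = [2.1, 4.3, 5.9, 8.2, 9.8]

print(sum_squared_deviations(0.0, 2.0, xs, ys))  # deviations about y = 2x
print(sum_squared_deviations(1.0, 1.5, xs, ys))  # deviations about y = 1 + 1.5x
```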
7
Coefficients a and b. The slope of the least-squares line is b = Σ(x - x̄)(y - ȳ) / Σ(x - x̄)². The y intercept is a = ȳ - b·x̄. We write the equation of the least-squares line as ŷ = a + bx, where the ^ above y (read as "y-hat") indicates a prediction of y resulting from the substitution of a particular x value into the equation.
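These formulas translate directly into a few lines of Python. The sketch below (hypothetical data again, since the slide's data are not shown) computes the slope from the deviations about the means, then the intercept, and finally a prediction ŷ at a particular x value.

```python
xs = [1, 2, 3, 4, 5]            # hypothetical x values
ys = [2.1, 4.3, 5.9, 8.2, 9.8]  # hypothetical y values
n = len(xs)

x_bar = sum(xs) / n
y_bar = sum(ys) / n

# Slope: b = sum((x - x_bar)(y - y_bar)) / sum((x - x_bar)^2)
b = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys)) / \
    sum((x - x_bar) ** 2 for x in xs)

# Intercept: a = y_bar - b * x_bar
a = y_bar - b * x_bar

# Prediction (y-hat) for a particular x value
x_new = 3.5
y_hat = a + b * x_new
print(f"least-squares line: y-hat = {a:.3f} + {b:.3f}x; prediction at x = {x_new}: {y_hat:.3f}")
```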
8
Calculating Formula for the Slope of the Least-Squares Line: b = [Σxy - (Σx)(Σy)/n] / [Σx² - (Σx)²/n].
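The calculating (computational) formula uses running sums, which is convenient for hand or calculator work. The sketch below (hypothetical data) confirms it gives the same slope as the deviations-about-the-means formula from the previous slide.

```python
xs = [1, 2, 3, 4, 5]
ys = [2.1, 4.3, 5.9, 8.2, 9.8]
n = len(xs)

# Running sums used by the calculating formula
sum_x = sum(xs)
sum_y = sum(ys)
sum_xy = sum(x * y for x, y in zip(xs, ys))
sum_x2 = sum(x * x for x in xs)

b_calc = (sum_xy - sum_x * sum_y / n) / (sum_x2 - sum_x ** 2 / n)

# Deviations-about-the-means version, for comparison
x_bar, y_bar = sum_x / n, sum_y / n
b_dev = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys)) / \
        sum((x - x_bar) ** 2 for x in xs)

print(b_calc, b_dev)  # both formulas give the same slope
```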
9
Example: Greyhound
10
Calculations
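The Greyhound fare data from the original slides are not reproduced here, so the sketch below uses clearly hypothetical distance/fare pairs simply to show how the calculations can be organized (the running sums feeding the slope formula) and cross-checked with NumPy's degree-1 polyfit, which returns the slope and then the intercept.

```python
import numpy as np

# Hypothetical (distance, fare) pairs; NOT the Greyhound data from the slides.
distance = np.array([100, 150, 250, 300, 450], dtype=float)
fare = np.array([25.0, 32.0, 48.0, 55.0, 78.0])

n = len(distance)
sum_x, sum_y = distance.sum(), fare.sum()
sum_xy, sum_x2 = (distance * fare).sum(), (distance ** 2).sum()

b = (sum_xy - sum_x * sum_y / n) / (sum_x2 - sum_x ** 2 / n)
a = fare.mean() - b * distance.mean()

slope_np, intercept_np = np.polyfit(distance, fare, 1)  # cross-check
print(a, b, intercept_np, slope_np)
```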
11
Classwork
12
Activity: Exploring Correlation and Regression