Chapter 3.2: Least Squares Regression
Interpreting a Regression Line
A regression line requires an explanatory variable and a response variable, and it describes how the predicted value of y changes as x changes.
The line has the form ŷ = a + bx, where b is the slope and a is the y-intercept (the predicted y when x = 0).
Interpreting the slope: y is predicted to increase/decrease on average by b for each additional unit of x.
Interpreting the y-intercept: y is predicted to be a when x = 0.
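A minimal sketch of using a line ŷ = a + bx to make predictions; the values of a and b below are hypothetical placeholders, not from the example in these notes.

```python
# Minimal sketch: using a regression line y-hat = a + b*x to make predictions.
# The values of a and b here are hypothetical placeholders.

def predict(a, b, x):
    """Return the predicted response y-hat for an explanatory value x."""
    return a + b * x

a = 2.0   # hypothetical y-intercept: predicted y when x = 0
b = 0.5   # hypothetical slope: predicted change in y per 1-unit increase in x

print(predict(a, b, 10))  # predicted y when x = 10 -> 7.0
```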
Example of interpreting a regression line: predicted fat gain = 3.505 − 0.00344(NEA change). The slope b = −0.00344 tells us that predicted fat gain goes down by 0.00344 kilograms, on average, for each additional calorie of NEA. The y-intercept a = 3.505 tells us that when the NEA change is 0 calories, the predicted fat gain is 3.505 kilograms.
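A short sketch of the fat-gain line from this example; the equation comes from the notes, while the NEA change of 400 calories is just an assumed input for illustration.

```python
# Sketch of the fat-gain regression line from the example:
# predicted fat gain (kg) = 3.505 - 0.00344 * (NEA change in calories)

def predicted_fat_gain(nea_change_cal):
    return 3.505 - 0.00344 * nea_change_cal

print(predicted_fat_gain(0))    # 3.505 kg: the y-intercept (NEA change = 0)
print(predicted_fat_gain(400))  # 3.505 - 0.00344*400 = 2.129 kg
```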
Extrapolation is the use of a regression line for prediction outside the range of x-values used to obtain the line. Such predictions are often not accurate, so extrapolation should be avoided.
LSRL (least-squares regression line): the line that makes the sum of the squared vertical distances of the data points from the line as small as possible.
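A sketch of the quantity the LSRL minimizes: the sum of squared vertical distances (residuals) between the observed y-values and the line's predictions. The data points and the candidate line below are hypothetical, not the least-squares line itself.

```python
# Sum of squared residuals for a candidate line y-hat = a + b*x.
# The LSRL is the choice of a and b that makes this sum as small as possible.

xs = [1, 2, 3, 4, 5]        # hypothetical explanatory values
ys = [2.1, 2.9, 4.2, 4.8, 6.1]  # hypothetical observed responses

a, b = 1.0, 1.0  # a candidate line (not necessarily the least-squares line)

residuals = [y - (a + b * x) for x, y in zip(xs, ys)]  # vertical distances
sse = sum(r ** 2 for r in residuals)                   # sum of squared residuals
print(residuals, sse)
```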
Three Ways to Find the LSRL
When given the summary statistics, use the formula packet with:
Slope: b = r(s_y / s_x)
Y-intercept: a = ȳ − b·x̄
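A sketch of finding the LSRL from summary statistics using the two formulas above; the correlation, standard deviations, and means below are hypothetical example values.

```python
# Sketch: LSRL from summary statistics using
# slope b = r * (s_y / s_x) and intercept a = y-bar - b * x-bar.
# All summary statistics below are hypothetical example values.

r = 0.8                     # correlation between x and y
s_x, s_y = 2.0, 5.0         # standard deviations of x and y
x_bar, y_bar = 10.0, 30.0   # means of x and y

b = r * (s_y / s_x)     # slope
a = y_bar - b * x_bar   # y-intercept

print(f"y-hat = {a:.2f} + {b:.2f}x")  # y-hat = 10.00 + 2.00x
```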