3.2 Least Squares Regression Line


3.2 Least Squares Regression Line

Regression Line Describes how a response variable changes as an explanatory variable changes. Formula sheet version: ŷ = b0 + b1x, where b1 = r(sy/sx) and b0 = ȳ − b1x̄. Calculator version: ŷ = a + bx.
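Both versions of the formula can be sketched in Python. The summary statistics below (means, standard deviations, correlation) are invented for illustration, not from any dataset in the lesson:

```python
# Least-squares slope and intercept from summary statistics,
# as on the AP formula sheet: b = r * (sy / sx), a = ybar - b * xbar.

def lsrl(xbar, ybar, sx, sy, r):
    """Return (intercept a, slope b) of the least-squares line y-hat = a + b*x."""
    b = r * (sy / sx)
    a = ybar - b * xbar
    return a, b

# Hypothetical summary statistics:
a, b = lsrl(xbar=5.0, ybar=12.0, sx=2.0, sy=4.0, r=0.8)
print(round(a, 2), round(b, 2))  # 4.0 1.6
```

Note that only five summary numbers are needed to write the line; the calculator computes these same quantities from the raw data.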

Slope (formula sheet: b1 = r · sy/sx). Interpretation: for each one-unit increase in the explanatory variable, the predicted response variable changes by the slope.

Y-Intercept (formula sheet: b0 = ȳ − b1x̄). Interpretation: the predicted response variable when the explanatory variable is zero. Mathematically it is always needed; realistically it might not make sense, since a value of zero for the explanatory variable is sometimes meaningless.

Interpret the slope and the y-intercept from the given least squares regression line in the context of the problem. Determine whether the y-intercept is realistic for this problem, and explain. (I will write the equation on the board.)

Extrapolation: using a regression line to predict a response for an explanatory value outside the range of the data used to fit the line. Such predictions are unreliable!
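The idea can be sketched in code: the fitted line is only trustworthy over the range of x-values it was fit on. The data and fitted coefficients below are invented for illustration:

```python
# Flag predictions made outside the observed x-range as extrapolation.

x = [1, 2, 3, 4, 5]          # explanatory values used to fit the line
a, b = 2.0, 0.5              # hypothetical fitted intercept and slope

def predict(x_new):
    yhat = a + b * x_new
    extrapolated = not (min(x) <= x_new <= max(x))
    return yhat, extrapolated

print(predict(3))    # (3.5, False) - inside the data range, reliable
print(predict(50))   # (27.0, True) - extrapolation, unreliable
```

The line happily produces a number for x = 50; nothing in the arithmetic warns you that the prediction is untrustworthy, which is exactly why extrapolation is dangerous.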

Multiple Choice Problems

Let's do p. 160!

3.2 Least-Squares Regression (Residuals)

Residual definition: the difference between an observed value of the response variable and the value predicted by the regression line. residual = observed y − predicted y

Where else have we seen “residuals”? In standard deviation (sx): data point − mean. In z-scores: observed − expected. Note: these differences are just the numerators of those calculations; a residual (observed − predicted) has the same form.

Below is the LSRL for sprint time (seconds) and long jump distance (inches): predicted long jump distance = a + b(sprint time). Find and interpret the residual for John, who had a time of 8.09 seconds and a jump of 151 inches. residual = observed − predicted = 151 − (predicted distance) ≈ 70 inches. John jumped much farther than what was predicted by our least squares regression line: almost 70 inches farther, based on his sprint time.
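The residual computation can be sketched in Python. The intercept and slope below are placeholders chosen only so the residual comes out near 70 inches; they are NOT the actual coefficients from the slide's equation:

```python
# residual = observed - predicted, where predicted = intercept + slope * x.
# Intercept and slope here are hypothetical, not the slide's fitted values.

def residual(observed_y, x, intercept, slope):
    predicted = intercept + slope * x
    return observed_y - predicted

# Hypothetical line: predicted jump = 400 - 40 * (sprint time)
r = residual(observed_y=151, x=8.09, intercept=400, slope=-40.0)
print(round(r, 2))  # 74.6 - a large positive residual
```

A positive residual means the observed jump was above the line, i.e. John outperformed the prediction.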

So why the least-squares regression line? Graph (0,0), (0,2), (2,2), and (2,4) and find the least squares regression line. Then find the residuals. Windows side: find the sum of the squares of the residuals. Door side: find the sum of the absolute values of the residuals. Now, what if I said the regression line was y = x? Windows side: find the sum of the squares of the residuals. Door side: find the sum of the absolute values of the residuals.

Stop notes for today. Homework is p. 193 #43, 45, 47, 53. Activity: "Matching Descriptions to Scatterplots." Homework hint: you will need to be familiar with the formulas on your sheet to write the LSRL.

Residual Plots: a scatterplot of the residuals against the explanatory variable, used to help assess how well the regression line fits the data.

Residual Plots: with Normal probability plots, we want the graph to be linear to support the Normality of our data. With residual plots, we want the residuals to be scattered with no pattern, so our data can be modeled with a linear regression. Remember: correlation does NOT assess linearity, just strength and direction!

What’s a Good Residual Plot? No obvious pattern: the LSRL runs through the middle of the data, with some points above and some below. Relatively small residuals: the data points are close to the LSRL.

Do the following residual plots support or refute a linear model?


How to Graph? Take each data point and determine its residual. Plot the residuals versus the explanatory variable, i.e. plot the points (explanatory value, residual). Put the explanatory variable on the horizontal axis and the residual on the vertical axis, using the same horizontal-axis numbers as your scatterplot.
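Those steps can be sketched as code that builds the residual-plot points by hand. The data values and fitted line below are invented for illustration:

```python
# Build the points of a residual plot: for each data point, compute its
# residual and pair it with the explanatory value.

xs = [1, 2, 3, 4]
ys = [2.1, 3.9, 6.2, 7.8]
a, b = 0.0, 2.0                     # hypothetical fitted line y-hat = 2x

# Each plotted point is (explanatory value, residual).
plot_points = [(x, round(y - (a + b * x), 2)) for x, y in zip(xs, ys)]
print(plot_points)  # [(1, 0.1), (2, -0.1), (3, 0.2), (4, -0.2)]
```

Here the residuals are small and bounce above and below zero with no pattern, which is what a "good" residual plot looks like.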

Calculator Construction: if you have a lot of data, follow the instructions on page 178 to construct your residual plot (you will also need to have completed the Technology Corner on p. 170).

What is Standard Deviation? Roughly the typical distance a data point is from the mean. We have an sx for the x-values and an sy for the y-values. So why not an s for the residuals (the standard deviation of the residuals)?

Standard Deviation of Residuals: gives the approximate size of an “average” or “typical” prediction error from our LSRL. s = √( Σ(residuals)² / (n − 2) ) (formula on page 177). Why divide by n − 2? Because two quantities, the slope and the intercept, were estimated from the data.
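The formula above can be sketched directly. The data and fitted line below are invented for illustration:

```python
# Standard deviation of the residuals: s = sqrt(sum(residual^2) / (n - 2)).
# We divide by n - 2 because two quantities (slope and intercept) were
# estimated from the data.
import math

xs = [1, 2, 3, 4, 5]
ys = [2.0, 4.1, 5.9, 8.2, 9.8]
a, b = 0.0, 2.0                     # hypothetical fitted line y-hat = 2x

residuals = [y - (a + b * x) for x, y in zip(xs, ys)]
s = math.sqrt(sum(e * e for e in residuals) / (len(xs) - 2))
print(round(s, 3))  # 0.183 - the typical prediction error, in y's units
```

So for these made-up data, predictions from the line are typically off by about 0.18 units of y.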