Least Squares Regression Chapter 3.2

Interpreting a Regression Line A regression line requires an explanatory variable x and a response variable y, and describes how y changes as x changes. The line has the form ŷ = a + bx, where b is the slope and a is the y-intercept (the predicted value of y when x = 0). Interpreting the slope: the predicted y increases (or decreases) on average by b for each additional unit of x. Interpreting the y-intercept: the predicted y is a when x = 0.

Example of interpreting a regression line: Fat gain = 3.505 – 0.00344(NEA change). The slope b = –0.00344 tells us that predicted fat gain goes down by 0.00344 kilograms for each additional calorie of NEA change. The y-intercept a = 3.505 tells us that when the NEA change is 0 calories, the predicted fat gain is 3.505 kilograms.
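The fat-gain example can be sketched as a one-line prediction function; the coefficients are the ones from the fitted line above, and the input value 400 is just an illustrative NEA change:

```python
# Predict fat gain (kg) from the change in NEA (non-exercise activity)
# calories, using the fitted line: fat gain = 3.505 - 0.00344 * (NEA change).

def predicted_fat_gain(nea_change):
    """Predicted fat gain in kg for a given NEA change in calories."""
    return 3.505 - 0.00344 * nea_change

# Intercept: predicted fat gain when NEA change is 0 calories.
print(predicted_fat_gain(0))    # 3.505
# Each additional NEA calorie lowers predicted fat gain by 0.00344 kg.
print(predicted_fat_gain(400))  # 3.505 - 0.00344*400 = 2.129
```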

Extrapolation is the use of a regression line to make predictions outside the range of x-values used to fit the line; such predictions are often inaccurate, so you should avoid them. LSRL (least-squares regression line) – the line that makes the sum of the squared vertical distances of the data points from the line as small as possible.

Three Ways to Find the LSRL When given the summary statistics, use the formulas from the formula packet: Slope: b = r(Sy/Sx) y-intercept: a = ȳ – bx̄
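The summary-statistics method amounts to two lines of arithmetic. A minimal sketch, using hypothetical values for r, the standard deviations, and the means (they are not from the slides):

```python
# Find the LSRL from summary statistics alone:
#   slope:       b = r * (Sy / Sx)
#   y-intercept: a = y_bar - b * x_bar
r = 0.8                      # hypothetical correlation
s_x, s_y = 2.0, 5.0          # hypothetical standard deviations of x and y
x_bar, y_bar = 10.0, 30.0    # hypothetical means of x and y

b = r * (s_y / s_x)          # slope
a = y_bar - b * x_bar        # y-intercept

print(f"y-hat = {a} + {b}x")  # → y-hat = 10.0 + 2.0x
```

Note that this method never needs the raw data points; the correlation, the two standard deviations, and the two means fully determine the line.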