
MAT 1000 Mathematics in Today's World

Last Time

Today First: a warning about interpreting correlation. We will also talk about least-squares regression. This is a way to calculate the line that is the “best fit” for the data; in other words, a line that is a good approximation of the scatterplot. Least-squares regression is important because it allows us to make predictions where we don’t have any data. These predictions are based on the pattern the data gives us.

“Correlation is not causation” You may have heard this expression before. What does it mean? Correlation can be evidence of a cause-and-effect relationship between two variables: if such a relationship exists, the variables will tend to have a strong correlation. On the other hand, two variables can have a strong correlation even though there is no cause-and-effect relationship between them.

“Correlation is not causation” Example Ice cream sales are correlated with drowning deaths. Obviously this is not a cause-and-effect relationship. In this case the explanation is that ice cream sales and drowning deaths are both related to the weather: more ice cream is sold in the summer, and more people go swimming in the summer. We call this kind of relationship between ice cream sales and drowning deaths “mutual response” (both variables respond to a third variable, the weather).

“Correlation is not causation” Correlation may not even be due to mutual response. Example (The Pirate Effect) The number of pirates is correlated with global average temperature: over the past few centuries the number of pirates has decreased, and global average temperatures have increased. Is global warming caused by a lack of pirates? No, this is just a coincidence. People call this kind of relationship a “nonsense correlation.” For more nonsense correlations:

Approximating scatterplots Last time we calculated the correlation between the heights and weights of five male adults. Here is that same data as a scatterplot.
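To make the calculation concrete, here is a short Python sketch that computes a correlation for a small height-and-weight dataset. The numbers are hypothetical, chosen only for illustration; the actual five data values from last time are not reproduced in this transcript.

# Hypothetical heights (inches) and weights (pounds) for five adults.
heights = [67, 70, 72, 75, 77]
weights = [155, 160, 175, 190, 210]

n = len(heights)
mean_x = sum(heights) / n
mean_y = sum(weights) / n

# Sample standard deviations.
sx = (sum((x - mean_x) ** 2 for x in heights) / (n - 1)) ** 0.5
sy = (sum((y - mean_y) ** 2 for y in weights) / (n - 1)) ** 0.5

# Correlation r: the average product of the standardized x- and y-values.
r = sum(((x - mean_x) / sx) * ((y - mean_y) / sy)
        for x, y in zip(heights, weights)) / (n - 1)
print("correlation r =", round(r, 3))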

Approximating scatterplots If you had to draw a line by hand that approximated the shape of this scatterplot, you could end up with any number of different lines.

Approximating scatterplots For example, maybe you would draw this line

Approximating scatterplots Or this one

Approximating scatterplots But there is only one “least-squares regression line”:

Review of linear functions
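The details of this review slide are not reproduced in the transcript. For reference, a linear function can be written as

y = a + bx

where b is the slope (the change in y for each one-unit increase in x) and a is the intercept (the value of y when x = 0). To graph the line it is enough to compute two points and connect them.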

Now connect these two points with a line

The least-squares regression line
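The formula for the least-squares regression line is not shown in the transcript. For reference, if r is the correlation, \bar{x} and \bar{y} are the means, and s_x and s_y are the standard deviations of the two variables, then the standard formulas are

\hat{y} = a + bx, \qquad b = r\,\frac{s_y}{s_x}, \qquad a = \bar{y} - b\,\bar{x}

so the slope is the correlation times the ratio of the standard deviations, and the line always passes through the point (\bar{x}, \bar{y}).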

Note that none of the data actually lies on the line. The least-squares regression line is the line for which the total distance from the data points to the line (measured vertically and squared, as described below) is as small as possible. Nevertheless, the line need not (and usually does not) pass through any of the data values.
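More precisely, “as small as possible” means the sum of the squared vertical distances is minimized: among all lines y = a + bx, the least-squares regression line is the one that minimizes

\sum_{i=1}^{n} \left( y_i - (a + b x_i) \right)^2

which is where the name “least squares” comes from.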

Predictions The most important application of least-squares regression lines is making predictions. If a scatterplot has a linear form, this suggests an underlying pattern. Mathematically, that pattern is summarized by the least-squares regression line. We can then make predictions based on the pattern we see in the data we’ve collected.

Predictions

One danger in using least-squares regression for predictions is extrapolation. Within the range of our data, the least-squares regression line should give reasonable predictions. But, if we plug in numbers too far outside that range, the predictions may no longer be reasonable. In our original height and weight data, the heights range from 67 inches to 77 inches. We can be confident that our least-squares regression line gives reasonable predictions for any height in this range.
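A short Python sketch of this idea, again with hypothetical height and weight values (the actual data are not reproduced in this transcript):

import numpy as np

# Hypothetical heights (inches) and weights (pounds) for five adults.
heights = np.array([67, 70, 72, 75, 77])
weights = np.array([155, 160, 175, 190, 210])

# np.polyfit with degree 1 returns the least-squares slope and intercept.
slope, intercept = np.polyfit(heights, weights, 1)

def predict(height):
    return intercept + slope * height

print(predict(72))   # inside the 67-77 inch range: a reasonable prediction
print(predict(36))   # far outside the range: extrapolation, not trustworthy

With these made-up numbers the second prediction even comes out negative, an impossible weight: the fitted line knows nothing about heights far outside the range of the data it was fit to.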

Predictions