©2011 Brooks/Cole, Cengage Learning. Elementary Statistics: Looking at the Big Picture

Lecture 12 (more of Chapter 5, Section 3): Relationships between Two Quantitative Variables; Regression
- Equation of Regression Line; Residuals
- Effect of Explanatory/Response Roles
- Unusual Observations
- Sample vs. Population
- Time Series; Additional Variables

L12.2 Looking Back: Review
The 4 Stages of Statistics:
1. Data Production (discussed in Lectures 1-4)
2. Displaying and Summarizing
   - Single variables: one categorical, one quantitative (Lectures 5-8)
   - Relationships between two variables: categorical and quantitative (Lecture 9); two categorical (Lecture 10); two quantitative (this lecture)
3. Probability
4. Statistical Inference

L12.3 Review: Relationship between Two Quantitative Variables
- Display with a scatterplot.
- Summarize the form (linear or curved), direction (positive or negative), and strength (strong, moderate, weak).
- If the form is linear, the correlation r tells direction and strength. Also, the equation of the least squares regression line lets us predict a response for any explanatory value x.

L12.4 Least Squares Regression Line
Summarize a linear relationship between explanatory (x) and response (y) values with the line that minimizes the sum of squared prediction errors (called residuals).
- Slope: the predicted change in the response y for every unit increase in the explanatory value x.
- Intercept: where the best-fitting line crosses the y-axis (the predicted response for x = 0).
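The least squares slope and intercept can be computed directly from the data. A minimal sketch, using made-up (age, price-in-$1000s) numbers in the spirit of the used-car example, not the lecture's actual data set:

```python
import numpy as np

# Illustrative data (hypothetical): age in years, price in $1000s.
x = np.array([1.0, 3.0, 5.0, 7.0, 9.0])
y = np.array([13.0, 11.5, 8.0, 6.5, 3.0])

# Closed-form least squares slope and intercept.
b = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
a = y.mean() - b * x.mean()

# np.polyfit with degree 1 minimizes the same sum of squared residuals.
b_check, a_check = np.polyfit(x, y, 1)
print(round(b, 3), round(a, 3))
```

For these numbers the fitted line is price = 14.65 - 1.25(age), i.e., each extra year of age lowers the predicted price by $1,250.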

L12.5 Example: Least Squares Regression Line
- Background: A car buyer used software to regress price on age for 14 used Grand Ams.
- Question: What do the slope (-1,288) and intercept (14,690) tell us?
- Response: Slope: for each additional year of age, the predicted price drops by $1,288. Intercept: the best-fitting line crosses the y-axis at y = $14,690.
Practice: 5.70f p.203


L12.7 Example: Extrapolation
- Background: A car buyer used software to regress price on age for 14 used Grand Ams.
- Question: Should we predict a new Grand Am to cost $14,690 - $1,288(0) = $14,690?
- Response: No; that would be extrapolation. The line was constructed from used cars (age > 0).
Practice: 5.54c p.197


L12.9 Definition
Extrapolation: using the regression line to predict responses for explanatory values outside the range of those used to construct the line.
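The used-car example above can illustrate the definition. Using the lecture's fitted line, price = 14,690 - 1,288(age), predicting at x = 0 is an extrapolation because the data covered used cars only:

```python
# Fitted line from the lecture's used Grand Am example.
# The data came from used cars (age > 0), so x = 0 lies outside
# the range used to build the line: that prediction is an extrapolation.
def predict_price(age):
    return 14690 - 1288 * age

print(predict_price(4))   # within the data range: 9,538
print(predict_price(0))   # extrapolation: 14,690, not a trustworthy estimate
```

The arithmetic is fine in both cases; what fails at x = 0 is the assumption that the linear pattern extends beyond the observed ages.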

L12.10 Example: More Extrapolation
- Background: A regression of 17 male students' weights (lbs.) on heights (inches) yields a least squares equation.
- Question: What weight does the line predict for a 20-inch-long infant?
- Response: Plugging x = 20 into the equation gives a predicted weight of -264 pounds, an impossible value: 20 inches is far outside the range of heights used to fit the line.
Practice: 5.36e p.193


L12.12 Expressions for Slope and Intercept
Consider the slope and intercept of the least squares regression line.
- Slope: b = r(s_y / s_x), so if x increases by one standard deviation, we predict y to increase by r standard deviations.
  - |r| close to 1: y responds closely to x.
  - |r| close to 0: y hardly responds to x.

L12.13 Expressions for Slope and Intercept (continued)
- Slope: b = r(s_y / s_x), so if x increases by one standard deviation, we predict y to increase by r standard deviations.
- Intercept: a = ȳ - b·x̄, so when x = x̄ the predicted response is ŷ = ȳ: the line passes through the point of averages (x̄, ȳ).
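Both identities can be checked numerically. A sketch with small illustrative data (not the lecture's): the slope from b = r(s_y/s_x) matches the least squares slope, and the fitted line passes through the point of averages.

```python
import numpy as np

# Illustrative data (hypothetical).
x = np.array([2.0, 4.0, 6.0, 8.0])
y = np.array([1.0, 4.0, 5.0, 8.0])

r = np.corrcoef(x, y)[0, 1]
sx, sy = x.std(ddof=1), y.std(ddof=1)
b = r * sy / sx                      # slope from the identity
a = y.mean() - b * x.mean()          # intercept from the identity

b_ls, a_ls = np.polyfit(x, y, 1)     # direct least squares fit
print(round(b, 3), round(a, 3))
```

Plugging x̄ into the fitted line returns ȳ exactly, which is the "point of averages" property.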

L12.14 Example: Individual Summaries on Scatterplot
- Background: The car buyer plotted price vs. age for 14 used Grand Ams [(4, 13,000), (8, 4,000), etc.].
- Question: What are reasonable guesses for the means and standard deviations of age and price?
- Response: Age has approximately mean 5 yrs, sd 2 yrs; price has approximately mean $8,000, sd $2,000.
Practice: 5.50a-d p.196


L12.18 Definitions
- Residual: the error in using the regression line to predict y given x. It equals the vertical distance observed minus predicted, which can be written y - ŷ.
- s: denotes the typical residual size, calculated as s = sqrt( Σ(y - ŷ)² / (n - 2) ).
Note: s just "averages out" the residuals.
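The residuals and their typical size s can be computed in a few lines. A sketch on illustrative data (not the lecture's 14 cars), using the n - 2 denominator from the definition above:

```python
import numpy as np

# Illustrative data (hypothetical), nearly but not exactly linear.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.0, 4.1, 5.9, 8.2, 9.8])

b, a = np.polyfit(x, y, 1)           # least squares slope and intercept
residuals = y - (a + b * x)          # observed minus predicted
s = np.sqrt(np.sum(residuals ** 2) / (len(x) - 2))
print(round(s, 3))
```

A useful sanity check: for a least squares fit with an intercept, the residuals always sum to zero, so s (not the plain average of residuals) is what measures typical error size.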

L12.19 Example: Considering Residuals
- Background: The car buyer regressed price on age for 14 used Grand Ams [(4, 13,000), (8, 4,000), etc.].
- Question: What does s = 2,175 tell us?
- Response: The regression line's predictions are not perfect:
  - x = 4: predicted ŷ = 14,686 - 1,290(4) = 9,526; actual y = 13,000; prediction error = 13,000 - 9,526 = +3,474.
  - x = 8: predicted ŷ = 14,686 - 1,290(8) = 4,366; actual y = 4,000; prediction error = 4,000 - 4,366 = -366; etc.
The typical size of the 14 prediction errors is s = 2,175 (dollars).
Practice: 5.56a p.197


L12.21 Example: Considering Residuals (continued)
The typical size of the 14 prediction errors is s = 2,175 (dollars): some points' vertical distances from the line are more, some less; 2,175 is the typical distance.

L12.22 Example: Residuals and Their Typical Size s
- Background: For a sample of schools, we regressed average Math SAT on average Verbal SAT (left) and average Math SAT on the percentage of teachers with advanced degrees (right).
- Question: How are s = 7.08 (left) and s = 26.2 (right) consistent with the values of the correlation r?
- Response: On the left, |r| is close to 1; the relation is strong and the typical error size is small (only 7.08).
A Closer Look: If output reports R-sq, take its square root (+ or - depending on the sign of the slope) to find r.
Practice: 5.70i-k p.203


L12.24 Example: Residuals and Their Typical Size s (continued)
- Response: On the right, |r| is close to 0; the relation is weak and the typical error size is large (26.2). A smaller s means better predictions.
Looking Back: r based on averages is overstated; the strength of the relationship for individual students would be less.
Practice: 5.70i-k p.203


L12.26 Example: Typical Residual Size s Close to s_y or 0
- Background: Scatterplots show two relationships: price per kilogram vs. price per pound for groceries (left), and students' final exam scores vs. the order (number) in which the exams were handed in (right).
- Questions: Which has s = 0? Which has s close to s_y, the standard deviation of the responses?
- Responses: The plot on the left has s = 0: no prediction errors. The plot on the right has s close to s_y: regressing on x doesn't help, and the regression line is approximately horizontal, essentially the same as the line at the average y-value.
Practice: 5.58 p.198


L12.28 Example: Typical Residual Size s Close to s_y
- Background: Football season scores. Software regressed the Steelers' scores on their opponents' scores; the output reports the regression equation, the typical residual size s, and descriptive statistics for the Steelers' scores (N, mean, median, TrMean, StDev = 9.66, SE Mean, minimum, maximum, Q1, Q3).
- Question: Since s and the standard deviation of the Steelers' scores (9.66) are very close, do we expect |r| close to 0 or to 1?
- Response: r must be close to zero (it is in fact 0.04).
Practice: 5.59 p.198
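The Steelers example can be mimicked with simulated data. When r is near 0 the fitted line is nearly flat, so predicting with the line is hardly better than predicting ȳ, and s comes out close to the standard deviation of y. The scores below are simulated, not the actual season data:

```python
import numpy as np

# Simulated, unrelated scores (hypothetical stand-in for the football data).
rng = np.random.default_rng(0)
x = rng.uniform(0, 35, 200)          # opponents' scores
y = rng.uniform(0, 35, 200)          # team's scores, independent of x

b, a = np.polyfit(x, y, 1)
resid = y - (a + b * x)
s = np.sqrt(np.sum(resid ** 2) / (len(x) - 2))
sy = y.std(ddof=1)
print(round(s, 2), round(sy, 2))     # nearly equal when r is near 0
```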


L12.30 Explanatory/Response Roles in Regression
Our choice of roles, explanatory or response, does not affect the value of the correlation r, but it does affect the regression line.
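This is easy to verify numerically. A sketch with illustrative points (not the lecture's four points): rewriting the x-on-y fit in y-on-x form gives a different slope, and the two slopes multiply to r², so they agree only when |r| = 1.

```python
import numpy as np

# Illustrative data (hypothetical).
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.0, 2.5, 2.0, 4.5, 5.0])

b_yx, a_yx = np.polyfit(x, y, 1)     # regress y on x
b_xy, a_xy = np.polyfit(y, x, 1)     # regress x on y (roles switched)

# Rewriting x = a_xy + b_xy * y as a line in the y-vs-x plot gives
# slope 1 / b_xy, which differs from b_yx unless |r| = 1.
slope_equiv = 1.0 / b_xy
print(round(b_yx, 3), round(slope_equiv, 3))
```

The identity b_yx · b_xy = r² is a compact way to see why: when |r| < 1 the product is below 1, so the two lines cannot coincide.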

L12.31 Example: Regression Line When Roles Are Switched
- Background: Compare the regression of y on x (left) and the regression of x on y (right) for the same 4 points.
- Question: Do we get the same line regressing y on x as we do regressing x on y?
- Response: The lines are very different: regressing y on x gives a slight negative slope; regressing x on y gives a steep negative slope. Context is needed; consider the variables and their roles before regressing.
Practice: 5.60b p.198


L12.33 Definitions
- Outlier (in regression): a point with an unusually large residual.
- Influential observation: a point with a high degree of influence on the regression line.
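A quick way to see influence is to refit the line with a suspect point omitted. A sketch on made-up data (not the airline data used below): the last point is extreme in the x direction and off the trend, so dropping it changes the slope dramatically.

```python
import numpy as np

# Illustrative data (hypothetical): four points on a perfect line,
# plus one point far to the right that does not follow the trend.
x = np.array([1.0, 2.0, 3.0, 4.0, 20.0])
y = np.array([1.0, 2.0, 3.0, 4.0, 1.0])

b_all, _ = np.polyfit(x, y, 1)       # fit with the extreme point included
b_omit, _ = np.polyfit(x[:-1], y[:-1], 1)   # fit with it omitted
print(round(b_all, 2), round(b_omit, 2))    # very different slopes
```

This matches the rule of thumb in the slides: influential observations tend to be extreme in the horizontal direction, and omitting one changes the line a lot.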

L12.34 Example: Outliers and Influential Observations
- Background: Exploring the relationship between airlines' orders for new planes and fleet size (r = +0.69).
- Question: Are Southwest and JetBlue outliers or influential observations?
- Response: Southwest is very influential (omit it and the slope changes a lot), but it is not an outlier. JetBlue is an outlier (large residual; omit it and r increases to +0.97).
Practice: 5.70d p.203


L12.36 Example: Outliers and Influential Observations (continued)
- Background: The same plane orders vs. fleet size data (r = +0.69).
- Question: How does Minitab classify JetBlue and Southwest?
- Response: JetBlue is an outlier (marked "R" in Minitab); Southwest is very influential (marked "X" in Minitab). Influential observations tend to be extreme in the horizontal direction.
Practice: 5.70g p.203


L12.38 Definitions
- Slope β: how much the response y changes in general (for the entire population) for every unit increase in the explanatory variable x.
- Intercept α: where the line that best fits all explanatory/response points (for the entire population) crosses the y-axis.
Looking Back: Greek letters often refer to population parameters.

L12.39 Line for Sample vs. Population
- Sample: the line best fitting the sampled points; the predicted response is ŷ = a + bx.
- Population: the line best fitting all points in the population from which the given points were sampled; the mean response is μ_y = α + βx.
A larger sample helps provide more evidence of a relationship between two quantitative variables in the general population.

L12.40 Example: Role of Sample Size
- Background: The relationship between the ages of students' mothers and fathers; both scatterplots have r = +0.78, but the sample size is over 400 (left) or just 5 (right).
- Question: Which plot provides more evidence of a strong positive relationship in the population?
- Response: The plot on the left (larger n). The configuration on the right could believably have occurred by chance.
Practice: 5.64 p.200


L12.42 Time Series
If the explanatory variable is time, plot one response for each time value and "connect the dots" to look for a general trend over time, as well as peaks and troughs.

L12.43 Example: Time Series
- Background: A time series plot shows average daily births each month in the year 2000 in the U.S.
- Question: Where do you see a peak or a trough?
- Response: A peak in August/September, 9 months after December; a trough in April, 9 months after July.
Practice: 5.66 p.201


L12.45 Example: Time Series (continued)
- Background: The same time series plot of average daily births in the U.S.
- Questions: How can we explain why there are fewer conceptions in the U.S. in July and more in December, but more conceptions in Europe in summer and fewer in winter?
- Response: Difficult to explain.
A Closer Look: Statistical methods can't always explain "why," but at least they help us understand "what" is going on.
Practice: 5.66 p.201


L12.47 Additional Variables in Regression
- Confounding variable: combining two groups that differ with respect to a variable related to both the explanatory and response variables can affect the nature of their relationship.
- Multiple regression: more advanced treatments consider the impact of not just one but two or more quantitative explanatory variables on a quantitative response.
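A minimal multiple regression sketch, extending the used-car idea to two predictors. All numbers here are made up for illustration; the lecture's data used age alone.

```python
import numpy as np

# Hypothetical data: age (years), miles driven (thousands), price ($1000s).
age   = np.array([2.0, 4.0, 5.0, 6.0, 8.0])
miles = np.array([20.0, 45.0, 60.0, 70.0, 95.0])
price = np.array([13.0, 10.0, 8.5, 7.0, 4.0])

# Design matrix with an intercept column; least squares fit of
# price = intercept + b_age * age + b_miles * miles.
X = np.column_stack([np.ones_like(age), age, miles])
coef, *_ = np.linalg.lstsq(X, price, rcond=None)
intercept, b_age, b_miles = coef
pred = X @ coef
```

Each coefficient is interpreted as the predicted change in price per unit increase in that predictor, holding the other predictor fixed, which is the key difference from two separate simple regressions.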

L12.48 Example: Additional Variables
- Background: A regression of phone time (minutes spent on the phone the day before) on weight shows a negative relationship.
- Questions: Do heavy people talk on the phone less? Do light people talk more?
- Response: Gender is a confounding variable; regressing separately for males and females shows no relationship.
Practice: 5.113a p.219


L12.50 Example: Multiple Regression
- Background: We used a car's age to predict its price.
- Question: What additional quantitative variable would help predict a car's price?
- Response: Miles driven (among other possibilities).
Practice: 5.69b-d p.201


L12.52 Lecture Summary (Regression)
- Equation of the regression line: interpreting slope and intercept; extrapolation
- Residuals: typical size is s
- The line is affected by explanatory/response roles
- Outliers and influential observations
- Line for sample vs. population; role of sample size
- Time series
- Additional variables