Stat 512 – Lecture 16 Two Quantitative Variables (Ch. 9)

Last Time: With one quantitative response and one qualitative explanatory variable, we can use one-way ANOVA to compare the population/true treatment means. This procedure extends easily to any number of qualitative explanatory variables. EV 1: Adoptive SES, H0: μ_AdoptHigh = μ_AdoptLow. EV 2: Biological SES, H0: μ_BioHigh = μ_BioLow. This is two-way ANOVA (the General Linear Model).

Two-way ANOVA: The "effect" of the SES of the adoptive parents is statistically significant (p-value = .010) even after adjusting for the stronger "effect" of the SES of the biological parents (p-value = .001).
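A minimal sketch of how an analysis like this could be run in Python, assuming the adoption data are in a data frame with hypothetical columns iq, adopt_ses, and bio_ses (none of these names come from the slides):

```python
# Sketch only: two-way ANOVA (main effects) with statsmodels.
# The file name and column names are assumptions for illustration.
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

df = pd.read_csv("adoption_study.csv")  # hypothetical data set, one row per child

# Additive model: response explained by adoptive SES and biological SES
model = smf.ols("iq ~ C(adopt_ses) + C(bio_ses)", data=df).fit()
print(anova_lm(model, typ=2))  # p-value for each factor, adjusted for the other
```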

Last Time: When we have multiple explanatory variables ("factors"), we can also consider the interaction between these variables.

Last Time: When we have "paired" or "dependent" samples, the blocking variable can be incorporated into the model as well. Example: chip melting times, with H0: μ_B = μ_MC = μ_SS and Ha: at least one μ differs.
Completely randomized design: 37 subjects randomly assigned to melt a single chip type (butterscotch, milk chocolate, or semi-sweet; n1 = 11, n2 = 11, n3 = 17), then melting times are compared across groups.
Randomized block design: 13 subjects each melt all three chip types in a randomized order (b-mc-ss, b-ss-mc, mc-b-ss, ss-b-mc, ss-mc-b, mc-ss-b), and melting times are compared within subjects.

Last Time: "Repeated measures" analyses are just like taking the differences first in "paired samples." If you want to compare results within blocks or within subjects (instead of across subjects, ignoring the pairing), include that variable in the ANOVA.
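To illustrate that equivalence, here is a small sketch with made-up numbers showing that a paired ("repeated measures") t-test gives the same result as a one-sample t-test on the differences:

```python
# Sketch with made-up data: paired analysis vs. analyzing the differences first.
import numpy as np
from scipy import stats

before = np.array([12.1, 9.8, 11.4, 10.2, 13.0])
after = np.array([11.0, 9.1, 10.9, 9.5, 12.2])

t_paired = stats.ttest_rel(before, after)         # paired ("repeated measures") test
t_diff = stats.ttest_1samp(before - after, 0.0)   # test the differences against 0

print(t_paired.statistic, t_diff.statistic)  # identical t statistics
```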

Practice Problem: With random assignment to distinct groups (milked by machine or by hand), we will consider the samples independent. With any correspondence or relationship between units, we will consider them dependent: litter mates, "split plots," both calculators used on the same sample problem.

Example 12.6: Positive and Negative Influences on Children (p. 463). "Children are exposed to many influences in their daily lives. What kind of influence does each of the following have on children? 1. Movies, 2. Programs on network television, 3. Rock music." Responses: -2 = very negative, -1 = negative, 0 = neutral, 1 = positive, 2 = very positive. Research question: Are the population mean responses identical for the three influences? H0: μ_M = μ_TV = μ_R vs. Ha: at least one μ differs.

Example 12.6: Positive and Negative Influences on Children. [Data table: one rating per subject for each influence (Movies, TV, Rock).]

Example 12.6: Positive and Negative Influences on Children. While different people do tend to give significantly different ratings (p-value = .003), once we adjust for that, we do not have convincing evidence of an "influence effect" (p-value = .101).
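A hedged sketch of how this repeated-measures analysis could be reproduced in Python, assuming the ratings have been arranged in long format with hypothetical columns subject, influence, and rating:

```python
# Sketch only: repeated-measures ANOVA for the influence ratings.
# The file name and long-format column names are assumptions for illustration.
import pandas as pd
from statsmodels.stats.anova import AnovaRM

ratings = pd.read_csv("influences_long.csv")  # one row per subject-influence pair

# F test for an "influence effect," adjusting for subject-to-subject differences
res = AnovaRM(ratings, depvar="rating", subject="subject", within=["influence"]).fit()
print(res)
```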

Example 1: Airline Costs. What is the best prediction of cost? The sample mean. Is there another explanatory variable we could use? Distance.

Describing the association between two quantitative variables: a moderate, positive, linear relationship.

Describing the association between two quantitative variables: Which is stronger, r = .444 or r = -.265? (Strength is judged by the magnitude of r, not its sign.)

Describing the association between two quantitative variables: a moderate, positive, linear relationship, r = .439.
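For reference, the correlation coefficient r can be computed directly; a small sketch with made-up paired data:

```python
# Sketch: computing r for two quantitative variables (made-up values).
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])   # explanatory variable
y = np.array([2.1, 2.9, 3.8, 5.2, 5.9])   # response variable

r = np.corrcoef(x, y)[0, 1]
print(round(r, 3))  # sign gives direction, magnitude gives strength
```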

Modeling the relationship: How do we decide on the "best" line? Residual = observed - predicted.

Example 2: Height vs. Foot Length (Least Squares Regression applet). The "least squares line" is the line that minimizes the sum of the squared residuals: we are trying to minimize the "prediction errors," and using squared residuals means there is a unique line that does this.
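A minimal sketch of that computation, using the usual least-squares formulas (the foot-length and height values below are made up, not the class data):

```python
# Sketch: least-squares slope and intercept, plus the residuals they minimize.
import numpy as np

foot = np.array([24.0, 25.5, 26.0, 27.5, 28.0, 29.5])    # cm (made-up data)
height = np.array([64.0, 66.0, 67.0, 69.0, 70.0, 72.0])  # inches (made-up data)

x_bar, y_bar = foot.mean(), height.mean()
b = np.sum((foot - x_bar) * (height - y_bar)) / np.sum((foot - x_bar) ** 2)  # slope
a = y_bar - b * x_bar                                                        # intercept

residuals = height - (a + b * foot)  # observed - predicted
print(b, a, np.sum(residuals ** 2))  # no other line gives a smaller sum of squared residuals
```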

Example 2: Height vs. Foot Length. Interpretation of slope: for each additional cm of foot length, we predict the person to be an additional [slope value] inches taller. Interpretation of intercept: a person with a 0 cm foot would be predicted to be [intercept value] inches tall! Not always meaningful in every context!

"Resistance": Remove or change a point and see whether the line changes dramatically. The least squares line is not resistant to extreme observations, especially those that are extreme in the explanatory variable (often a stronger determinant of influence than the size of the residual).
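One quick way to see this non-resistance, sketched with the same made-up data as above: refit the line after adding a single point that is extreme in the explanatory variable and compare slopes.

```python
# Sketch: one point extreme in x can change the least-squares line dramatically.
import numpy as np

x = np.array([24.0, 25.5, 26.0, 27.5, 28.0, 29.5])
y = np.array([64.0, 66.0, 67.0, 69.0, 70.0, 72.0])
slope_before, intercept_before = np.polyfit(x, y, 1)

# Add an influential point: very large foot length, unremarkable height
x_new = np.append(x, 45.0)
y_new = np.append(y, 68.0)
slope_after, intercept_after = np.polyfit(x_new, y_new, 1)

print(slope_before, slope_after)  # the slope shifts noticeably toward the new point
```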

R²: If we predict everyone to have the same height, there is a lot of "unexplained" variation (SSE = [value]). If we take the explanatory variable into account, there is much less unexplained variation (SSE = 235).
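The comparison behind R² can be written as R² = 1 - SSE/SST; a sketch (again with made-up data, not the class SSE values quoted above):

```python
# Sketch: R^2 as the proportional reduction in "unexplained" variation.
import numpy as np

x = np.array([24.0, 25.5, 26.0, 27.5, 28.0, 29.5])
y = np.array([64.0, 66.0, 67.0, 69.0, 70.0, 72.0])
slope, intercept = np.polyfit(x, y, 1)

sst = np.sum((y - y.mean()) ** 2)                 # SSE if everyone gets the same prediction (the mean)
sse = np.sum((y - (intercept + slope * x)) ** 2)  # SSE using the explanatory variable

print(1 - sse / sst)  # fraction of variation in y explained by the regression on x
```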

Example 1: Airline Costs. Each flight has a "set-up" cost of $151, and each additional mile of travel is associated with a predicted increase in cost of about 7 cents. 19.3% of the variability in airfare is explained by this regression on distance (still lots of unexplained variability). We might investigate further why the cost for ACK was so much higher than expected.
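As a small worked example under those rounded estimates (the $151 intercept and the roughly 7-cents-per-mile slope quoted above; the 500-mile distance is hypothetical):

```python
# Sketch: prediction from the fitted airfare equation, using the rounded slide estimates.
intercept = 151.0  # "set-up" cost in dollars
slope = 0.07       # about 7 cents per additional mile

distance = 500  # hypothetical flight distance in miles
print(intercept + slope * distance)  # predicted fare of about $186
```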

For Thursday: PP 14 in Blackboard by 3 pm; finish up HW 7; continue reading in Ch. 9. By next Tuesday: another project report, narrowed in on 2 "research questions" and the statistical methods you think will answer them…