12.1 Inference for Linear Regression (4.5.2018)


Inference for Linear Regression. Today we will apply inference procedures to linear regression. We have been using inference procedures (confidence intervals and hypothesis tests) for the past several chapters, but linear regression was covered back in Chapter 3, so it has been a while.

Linear Regression Refresher. The idea behind linear regression is to estimate a line of best fit between two variables, an independent variable and a dependent variable. The slope tells us how many units the dependent variable changes when the independent variable changes by one unit.

Linear Regression Refresher: Old Faithful. We regress the time before the next eruption on the duration of an eruption. The slope is 10.36 and the y-intercept is 33.97, so ŷ = 33.97 + 10.36x, i.e. predicted interval = 33.97 + 10.36(duration).
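
As an aside, a least-squares fit like this can be computed in Python with scipy.stats.linregress. This is only a sketch: the (duration, interval) pairs below are made up for illustration and are not the actual Old Faithful data behind the slide's numbers.

    import numpy as np
    from scipy import stats

    # Hypothetical (duration, interval) pairs in minutes -- not the real Old Faithful data.
    duration = np.array([1.8, 2.3, 3.3, 3.9, 4.5, 4.7])
    interval = np.array([54, 57, 68, 74, 81, 83])

    fit = stats.linregress(duration, interval)
    print(f"predicted interval = {fit.intercept:.2f} + {fit.slope:.2f} * duration")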

Inference. When we do a linear regression using a sample of data, we are really ESTIMATING the true population values of the slope and y-intercept. We don't know the population values, but the estimates from the regression are unbiased estimators of them; that does not mean they are exactly correct. So when we run a regression on sample data, we get a regression line ŷ = a + bx, where a is an unbiased estimator of the true population y-intercept (sometimes called α) and b is an unbiased estimator of the true population slope (sometimes called β).

Sample vs. Population. Sample regression equation: ŷ = a + bx. Population regression equation: y = α + βx. a estimates α, and b estimates β.

Sampling Distribution. If we want a sampling distribution for the slope, we already have our unbiased estimate of its mean: whatever our estimated slope is. But we also need the standard deviation of the sampling distribution, and because we don't know it, we estimate it. That estimate is called a standard error.

Standard Error of the Slope. The standard error of the slope is SE_b = s / (s_x · √(n − 1)), where s is the standard deviation of the residuals and s_x is the standard deviation of the x-values. The good news is that we rarely need to use this formula by hand: when we perform a regression, SE_b is included in the computer output (although not when you do it on your calculator).
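
For the curious, here is a small Python sketch that computes SE_b directly from that formula and checks it against the value scipy reports; the x and y arrays are the hypothetical ones from the earlier sketch.

    import numpy as np
    from scipy import stats

    x = np.array([1.8, 2.3, 3.3, 3.9, 4.5, 4.7])   # hypothetical data from before
    y = np.array([54, 57, 68, 74, 81, 83])
    n = len(x)

    fit = stats.linregress(x, y)
    residuals = y - (fit.intercept + fit.slope * x)
    s = np.sqrt(np.sum(residuals**2) / (n - 2))        # std. deviation of the residuals
    se_b = s / (np.std(x, ddof=1) * np.sqrt(n - 1))    # standard error of the slope

    print(se_b, fit.stderr)   # the two values agree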

Standard Error of the Slope. SE_b = 0.0002018.

Confidence Interval for the Slope. The interval has the familiar form estimate ± (critical value)(standard error): b ± t*·SE_b, where t* comes from the t distribution with n − 2 degrees of freedom.

Example

Example. 90% confidence interval: -0.0034415 ± (1.761)(0.0007414) = (-0.004747, -0.002136). Interpretation: we are 90% confident that one additional calorie of non-exercise activity corresponds to a decrease in fat gain of between 0.002136 kg and 0.004747 kg.
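
A quick check of this interval in Python (a sketch, not part of the slides): the critical value t* = 1.761 corresponds to 14 degrees of freedom, i.e. n = 16 observations in this study.

    from scipy import stats

    b, se_b, df = -0.0034415, 0.0007414, 14
    t_star = stats.t.ppf(0.95, df)                 # ~1.761 for a 90% interval
    lower, upper = b - t_star * se_b, b + t_star * se_b
    print(f"({lower:.6f}, {upper:.6f})")           # approximately (-0.004747, -0.002136)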

You try. The following regression uses the number of Peruvian anchovies caught (millions of metric tons) per year and the price (US $) of fish meal in that year. Calculate a 95% confidence interval for the true slope of the regression line.

Predictor   Coef      SE Coef   T       P
Constant    452.12    36.82     12.28   0.000
Catch       -29.402   5.091     -5.78   0.000

S = 71.6866   R-Sq = 73.5%   n = 14

Answers. -29.402 ± (2.179)(5.091) = -29.402 ± 11.093 = (-40.495, -18.309).
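
The same check in Python for the anchovy data (a sketch): with n = 14, the slope has n − 2 = 12 degrees of freedom, so t* = 2.179 for 95% confidence.

    from scipy import stats

    b, se_b, n = -29.402, 5.091, 14
    t_star = stats.t.ppf(0.975, n - 2)             # ~2.179
    print(b - t_star * se_b, b + t_star * se_b)    # roughly (-40.49, -18.31)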

Example. Hypotheses: H0: β = 0 versus Ha: β > 0. Check conditions; for now let's assume they are met (we'll talk about this in a minute). Test statistic: t = (1.4929 - 0) / 0.4870 = 3.07. P-value: tcdf(3.07, BIG, 36) = 0.002.
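
The same one-sided test can be run in Python instead of on a calculator (a sketch using the slide's numbers); stats.t.sf gives the upper-tail area that tcdf(3.07, BIG, 36) computes.

    from scipy import stats

    b, se_b, df = 1.4929, 0.4870, 36
    t_stat = (b - 0) / se_b                        # ~3.07
    p_value = stats.t.sf(t_stat, df)               # upper tail, Ha: beta > 0; ~0.002
    print(t_stat, p_value)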

Look back at the regression output. P-value = 0.002. The output gives us the test statistic, and it gives us a (wrong) p-value. Why is the p-value wrong? Because the output's p-value is for the two-sided alternative β ≠ 0; our alternative is one-sided, so the correct p-value is half the value the output reports.

Back to the Anchovies Does the number of fish caught affect the price? What is the test statistic? What is the p-value? What is our interpretation?

Back to the Anchovies. What is the test statistic? t = -5.78. What is the p-value? Very small. What is our interpretation? We reject the null hypothesis that the true slope is zero and conclude instead that the true slope (the effect of catch on price) is different from zero.
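
To put a number on "very small" (a sketch, using the output's t statistic and df = 12): the two-sided p-value reported by the software is twice the upper-tail area beyond |t| = 5.78.

    from scipy import stats

    t_stat, df = -5.78, 12
    p_two_sided = 2 * stats.t.sf(abs(t_stat), df)  # two-sided p-value
    print(p_two_sided)                             # well below 0.001, consistent with P = 0.000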

Anchovies Again (last example). Now let's test a hypothesis different from zero: let's test whether the slope is below -20. H0: β = -20 versus Ha: β < -20. Test statistic: t = (-29.402 - (-20)) / 5.091 = -1.847. P-value: 0.0448.
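
And the same calculation in Python for this last test (a sketch): the t statistic is measured from the hypothesized value -20 rather than 0, and the p-value is a lower-tail area with df = 12.

    from scipy import stats

    b, se_b, beta_0, df = -29.402, 5.091, -20, 12
    t_stat = (b - beta_0) / se_b                   # ~ -1.847
    p_value = stats.t.cdf(t_stat, df)              # lower tail, Ha: beta < -20; ~0.045
    print(t_stat, p_value)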