Lesson 14 - 1 Testing the Significance of the Least Squares Regression Model.

Presentation transcript:

Lesson 14 - 1 Testing the Significance of the Least Squares Regression Model

Objectives
Understand the requirements of the least-squares regression model
Compute the standard error of the estimate
Verify that the residuals are normally distributed
Conduct inference on the slope and intercept
Construct a confidence interval about the slope of the least-squares regression model

Vocabulary
Bivariate normal distribution – one variable is normally distributed given any value of the other variable, and the second variable is normally distributed given any value of the first variable
Jointly normally distributed – same as bivariate normal distribution
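For intuition, here is a short Python sketch (added for illustration, not from the lesson) that draws points from a bivariate normal distribution; the mean vector, covariance matrix, and sample size are hypothetical choices.

```python
import numpy as np

# Sketch: sample from a bivariate (jointly) normal distribution.
# The mean vector and covariance matrix below are illustrative.
rng = np.random.default_rng(0)
mean = [0.0, 0.0]
cov = [[1.0, 0.8],
       [0.8, 1.0]]            # correlation of 0.8 between the two variables

xy = rng.multivariate_normal(mean, cov, size=1000)
x, y = xy[:, 0], xy[:, 1]     # each variable is normal given any value of the other
```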

Least-Squares Regression Model
y_i = β_0 + β_1 x_i + ε_i
where
y_i is the value of the response variable for the i-th individual
β_0 and β_1 are the parameters to be estimated based on the sample data
x_i is the value of the explanatory variable for the i-th individual
ε_i is a random error term with mean 0 and variance σ²_εi = σ²; the error terms are independent and normally distributed
i = 1, …, n, where n is the sample size (number of ordered pairs in the data set)
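A minimal Python sketch (added for illustration, not part of the lesson) that simulates data satisfying this model; the parameter values β_0 = 2, β_1 = 0.5, σ = 1 and the sample size n = 30 are hypothetical.

```python
import numpy as np

# Simulate data from y_i = beta_0 + beta_1 * x_i + eps_i,
# where the eps_i are independent N(0, sigma^2) errors.
rng = np.random.default_rng(0)

beta_0, beta_1, sigma = 2.0, 0.5, 1.0    # hypothetical true parameters
n = 30                                   # sample size (number of ordered pairs)

x = rng.uniform(0, 10, size=n)           # explanatory variable
eps = rng.normal(0, sigma, size=n)       # error terms: mean 0, constant variance
y = beta_0 + beta_1 * x + eps            # response variable
```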

Requirements for Inferences
The mean of the responses depends linearly on the explanatory variable
–Verify linearity with a scatter plot (as in Chapter 4)
The response variables are normally distributed with the same standard deviation
–We plot the residuals against the values of the explanatory variable
–If the residuals are spread evenly about a horizontal line drawn at 0, then the requirement of constant variance is satisfied
–If the residuals increasingly spread outward (or decreasingly contract inward) about that line at 0, then the requirement of constant variance may not be satisfied
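One way to produce the residual plot described above, sketched in Python on illustrative simulated data; NumPy and Matplotlib are assumed to be available.

```python
import numpy as np
import matplotlib.pyplot as plt

# Fit the least-squares line, then plot residuals against x
# to check for even spread about a horizontal line at 0.
rng = np.random.default_rng(1)
x = rng.uniform(0, 10, size=30)                   # hypothetical explanatory data
y = 2.0 + 0.5 * x + rng.normal(0, 1.0, size=30)   # hypothetical response data

b1, b0 = np.polyfit(x, y, 1)                      # least-squares slope, intercept
residuals = y - (b0 + b1 * x)                     # observed minus predicted

plt.scatter(x, residuals)
plt.axhline(0)                                    # horizontal line at 0
plt.xlabel("explanatory variable x")
plt.ylabel("residual")
plt.title("Even spread about 0 supports constant variance")
plt.show()
```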

Hypothesis Tests
Only after the requirements are checked can we proceed with inferences on the slope, β_1, and the intercept, β_0
Tests:
Two-tailed:   H_0: β_1 = 0   vs   H_1: β_1 ≠ 0
Left-tailed:  H_0: β_1 = 0   vs   H_1: β_1 < 0
Right-tailed: H_0: β_1 = 0   vs   H_1: β_1 > 0
Note: these procedures are considered robust (in fact, for large samples (n > 30), inferential procedures regarding b_1 can be used with significant departures from normality)

Test Statistic
t_0 = (b_1 – β_1) / ( s_e / √( Σ(x_i – x̄)² ) ) = b_1 / s_b1   (under H_0: β_1 = 0)
Note: degrees of freedom = n – 2
Critical values: t_α/2 for a two-tailed test (≠ 0); t_α for a one-tailed test (> 0 or < 0)

Standard Error of the Estimate
s_e = √( Σ(y_i – ŷ_i)² / (n – 2) ) = √( Σ residuals² / (n – 2) )
Note: we divide by n – 2 because we have estimated two parameters, β_0 and β_1
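The following Python sketch (illustrative only) applies the two formulas above, computing s_e, the standard error of the slope s_b1, and the test statistic t_0 on a small hypothetical data set.

```python
import numpy as np

# Compute s_e from the squared residuals, then s_b1 and t_0.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])   # hypothetical data
y = np.array([2.1, 2.9, 3.8, 5.2, 5.9, 7.1])
n = len(x)

b1, b0 = np.polyfit(x, y, 1)                      # least-squares estimates
residuals = y - (b0 + b1 * x)

s_e = np.sqrt(np.sum(residuals**2) / (n - 2))     # standard error of the estimate
s_b1 = s_e / np.sqrt(np.sum((x - x.mean())**2))   # standard error of the slope
t_0 = b1 / s_b1                                   # test statistic under H0: beta_1 = 0

print(f"b1 = {b1:.4f}, s_e = {s_e:.4f}, t_0 = {t_0:.4f}  (df = {n - 2})")
```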

Conclusions from Test
Rejecting the null hypothesis means that, for
The two-tailed alternative hypothesis, H_1: β_1 ≠ 0
–The slope is significantly different from 0
–There is a significant linear relationship between the variables x and y
The left-tailed alternative hypothesis, H_1: β_1 < 0
–The slope is significantly less than 0
–There is a significant negative linear relationship between the variables x and y
The right-tailed alternative hypothesis, H_1: β_1 > 0
–The slope is significantly greater than 0
–There is a significant positive linear relationship between the variables x and y

Hypothesis Testing on β_1
Steps for Testing a Claim Regarding the Slope β_1
0. Check that the test is feasible (requirements)
1. Determine the null and alternative hypotheses (and the type of test: two-tailed, left-tailed, or right-tailed)
2. Select a level of significance α based on the seriousness of making a Type I error
3. Calculate the test statistic
4. Determine the p-value or critical value using the level of significance (hence the critical or rejection regions)
5. Compare the critical value with the test statistic (also known as the decision rule)
6. State the conclusion
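A sketch of the full two-tailed test on hypothetical data, using SciPy's t distribution for the p-value and critical value; the data set and the significance level α = 0.05 are illustrative choices, not from the lesson.

```python
import numpy as np
from scipy import stats

# Two-tailed test of H0: beta_1 = 0 vs H1: beta_1 != 0, following the steps above.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])   # hypothetical data
y = np.array([2.1, 2.9, 3.8, 5.2, 5.9, 7.1])
n, alpha = len(x), 0.05

b1, b0 = np.polyfit(x, y, 1)
residuals = y - (b0 + b1 * x)
s_e = np.sqrt(np.sum(residuals**2) / (n - 2))
s_b1 = s_e / np.sqrt(np.sum((x - x.mean())**2))
t_0 = b1 / s_b1

df = n - 2
p_value = 2 * stats.t.sf(abs(t_0), df)     # two-tailed p-value
t_crit = stats.t.ppf(1 - alpha / 2, df)    # critical value t_{alpha/2}

if p_value < alpha:                        # equivalently: abs(t_0) > t_crit
    print(f"Reject H0: significant linear relationship (t0 = {t_0:.3f}, p = {p_value:.4f})")
else:
    print(f"Do not reject H0 (t0 = {t_0:.3f}, p = {p_value:.4f})")
```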

Example

Confidence Intervals for β_1
Confidence intervals are of the form: point estimate ± margin of error
Lower bound = b_1 – t_α/2 · s_e / √( Σ(x_i – x̄)² ) = b_1 – t_α/2 · s_b1
Upper bound = b_1 + t_α/2 · s_e / √( Σ(x_i – x̄)² ) = b_1 + t_α/2 · s_b1
Note: t_α/2 has n – 2 degrees of freedom
Pre-conditions:
1) data randomly obtained
2) residuals normally distributed
3) constant error variance
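A sketch of the interval bounds above on the same hypothetical data, using a 95% confidence level (α = 0.05) for illustration.

```python
import numpy as np
from scipy import stats

# Confidence interval for beta_1: b1 ± t_{alpha/2} * s_b1, df = n - 2.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])   # hypothetical data
y = np.array([2.1, 2.9, 3.8, 5.2, 5.9, 7.1])
n, alpha = len(x), 0.05

b1, b0 = np.polyfit(x, y, 1)
residuals = y - (b0 + b1 * x)
s_e = np.sqrt(np.sum(residuals**2) / (n - 2))
s_b1 = s_e / np.sqrt(np.sum((x - x.mean())**2))

t_crit = stats.t.ppf(1 - alpha / 2, n - 2)     # t_{alpha/2} with n - 2 df
lower = b1 - t_crit * s_b1
upper = b1 + t_crit * s_b1
print(f"{100 * (1 - alpha):.0f}% CI for beta_1: ({lower:.4f}, {upper:.4f})")
```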

Using TI
Enter the explanatory variable in L1 and the response variable in L2
Press STAT, highlight TESTS, and select E:LinRegTTest
Be sure Xlist is L1 and Ylist is L2; make sure that Freq is set to 1
Set the direction of the alternative hypothesis
Highlight Calculate and press ENTER
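For readers working in software rather than on a TI calculator, a rough counterpart (not the lesson's procedure) is scipy.stats.linregress, which reports the slope, intercept, the two-sided p-value for H_0: slope = 0, and the standard error of the slope; the data below are hypothetical.

```python
from scipy import stats

# Rough software counterpart to LinRegTTest.
x = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]   # hypothetical data
y = [2.1, 2.9, 3.8, 5.2, 5.9, 7.1]

result = stats.linregress(x, y)
print(f"b1 = {result.slope:.4f}, b0 = {result.intercept:.4f}")
print(f"two-sided p-value for H0: slope = 0: {result.pvalue:.4f}")
print(f"standard error of the slope: {result.stderr:.4f}")
```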

Summary and Homework
Summary
–Confidence intervals and prediction intervals quantify the accuracy of predicted values from least-squares regression lines
–Confidence intervals for a mean response measure the accuracy of the mean response of all the individuals in a population
–Prediction intervals for an individual response measure the accuracy of a single individual's predicted value
Homework
–pg ; 1, 2, 3, 4, 7, 12, 13, 18