Lesson 14 - R Chapter 14 Review

Objectives
- Summarize the chapter
- Define the vocabulary used
- Complete all objectives
- Successfully answer any of the review exercises
- Use the technology to compute statistical data in the chapter

Problem 1
To perform inference in a least-squares model, the distribution of the response variable y, for a specific explanatory variable x, must
1) Have a standard deviation of 1
2) Have a normal distribution
3) Have a mean equal to its standard deviation
4) All of the above

Problem 2
A hypothesis test for the slope, in a least-squares regression model, can be performed using
1) The mean, compared to the median distribution
2) A sum of squares, compared to a chi-square distribution
3) A t-score, compared to the t-distribution
4) The ratio with the intercept, compared to the F distribution
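The t-test for the slope (option 3) can be sketched in a few lines. This is a minimal illustration with made-up data, not an exercise from the chapter: it computes the least-squares slope b1, its standard error, and the t-statistic for H0: β1 = 0, which is compared to a t-distribution with n - 2 degrees of freedom.

```python
import math

# Hypothetical (x, y) data -- illustrative values only
x = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
y = [2.1, 3.9, 6.2, 7.8, 10.1, 12.2]

n = len(x)
xbar = sum(x) / n
ybar = sum(y) / n

# Least-squares estimates of the slope (b1) and intercept (b0)
sxx = sum((xi - xbar) ** 2 for xi in x)
sxy = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
b1 = sxy / sxx
b0 = ybar - b1 * xbar

# Residual standard error s; estimating b0 and b1 costs 2 degrees of freedom
sse = sum((yi - (b0 + b1 * xi)) ** 2 for xi, yi in zip(x, y))
s = math.sqrt(sse / (n - 2))

# t-statistic for H0: beta1 = 0, compared to the t-distribution with n - 2 df
se_b1 = s / math.sqrt(sxx)
t = b1 / se_b1
print(f"b1 = {b1:.3f}, SE = {se_b1:.3f}, t = {t:.2f}, df = {n - 2}")
```

A large |t| relative to the t-distribution with n - 2 df gives a small p-value and leads to rejecting H0: β1 = 0.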

Problem 3
A researcher has found an appropriate least-squares regression model. To predict the mean value of the response variable y, for a specific explanatory variable x, she should
1) Perform a hypothesis test on the intercept
2) Calculate a prediction interval for an individual response
3) Perform a hypothesis test on the slope
4) Calculate a confidence interval for the mean response

Problem 4
For a least-squares regression model, as a researcher collects more and more points (larger values of n), the confidence interval for a mean response
1) Should, in general, become narrower
2) Should, in general, become more normal
3) Should, in general, become more symmetric
4) Should, in general, become closer to the prediction interval for an individual response
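Why option 1 is correct can be seen from the standard-error factors in the two interval formulas. The sketch below, under a simplified hypothetical design where Sxx grows in proportion to n, compares the factor sqrt(1/n + (x0 - x̄)²/Sxx) for the confidence interval against sqrt(1 + 1/n + (x0 - x̄)²/Sxx) for the prediction interval:

```python
import math

def interval_factors(n, x0=1.0, xbar=0.0, sxx_per_point=1.0):
    # Hypothetical design: Sxx grows proportionally with the number of points
    sxx = sxx_per_point * n
    h = 1.0 / n + (x0 - xbar) ** 2 / sxx
    ci = math.sqrt(h)        # factor for the CI for a mean response
    pi = math.sqrt(1.0 + h)  # factor for the PI for an individual response
    return ci, pi

for n in (10, 100, 1000):
    ci, pi = interval_factors(n)
    print(f"n = {n:5d}   CI factor = {ci:.3f}   PI factor = {pi:.3f}")
```

The CI factor shrinks toward 0 as n grows, so the confidence interval narrows; the PI factor only falls toward 1, so the prediction interval stays wide (the two do not converge, which is why option 4 is wrong).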

Problem 5
A multiple regression model differs from a simple linear regression model in that the multiple regression model
1) Does not require linear relationships
2) Cannot be analyzed using inferential techniques
3) Involves more than one explanatory variable
4) Involves more than one response variable

Problem 6
In developing the “best” multiple regression model
1) None of the explanatory variables should ever be dropped to reach the final model
2) If a specific explanatory variable has a large correlation with the response variable, it should be included in the final model
3) Each explanatory variable in the final model should have a coefficient that is significantly different from 0
4) Only explanatory variables with positive correlations should be included

Things to Remember
In the least-squares regression model:
- R² gives us the percentage of the variation in the response variable explained by the model
- the mean of the response variable depends linearly on the explanatory variable
- the residuals must be normally distributed with constant error variance
- the errors εᵢ are normally distributed with mean 0 and variance σ²
- we are estimating two values (b₀ and b₁) and therefore lose 2 degrees of freedom in the t-statistic
- the prediction interval for an individual response will be wider than the confidence interval for a mean response
- the inference procedures are robust to mild departures from these conditions
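The last two interval points above can be made concrete. This is a minimal sketch with hypothetical data (not from the chapter); the only external fact used is that t* = 2.776 is the 95% critical value of the t-distribution with n - 2 = 4 degrees of freedom. It computes both intervals at the same x0 to show the prediction interval is wider:

```python
import math

# Hypothetical data -- illustrative values only
x = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
y = [2.1, 3.9, 6.2, 7.8, 10.1, 12.2]
n = len(x)
xbar, ybar = sum(x) / n, sum(y) / n
sxx = sum((xi - xbar) ** 2 for xi in x)
b1 = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) / sxx
b0 = ybar - b1 * xbar
sse = sum((yi - (b0 + b1 * xi)) ** 2 for xi, yi in zip(x, y))
s = math.sqrt(sse / (n - 2))  # estimating b0 and b1 costs 2 df

x0 = 3.5
tstar = 2.776  # 95% critical value, t-distribution with 4 df
yhat = b0 + b1 * x0
h = 1.0 / n + (x0 - xbar) ** 2 / sxx
ci_half = tstar * s * math.sqrt(h)        # confidence interval (mean response)
pi_half = tstar * s * math.sqrt(1.0 + h)  # prediction interval (individual)
print(f"CI: {yhat:.2f} +/- {ci_half:.3f}")
print(f"PI: {yhat:.2f} +/- {pi_half:.3f}")  # always wider than the CI
```

The extra "1 +" under the square root in the prediction interval accounts for the variability of a single new observation around its mean, which is why it is always the wider of the two.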

Things to Remember
In multiple regression models:
- the adjusted R² gives us the percentage of the variation in the response variable explained by the model; R² is adjusted based on the sample size and the number of explanatory variables in the model
- multicollinearity may be a problem if a high linear correlation exists between explanatory variables (rule of thumb: if |correlation| > 0.7, multicollinearity is possible)
- the procedure used in our book for multiple regression modeling is called backward stepwise regression (rule: remove the explanatory variable with the highest p-value, rerun the model, and check the adjusted R² values)
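The adjustment to R² mentioned above is the standard formula R²adj = 1 - (1 - R²)(n - 1)/(n - k - 1), where n is the sample size and k the number of explanatory variables. A small sketch with made-up R² values shows why backward stepwise regression checks adjusted R² rather than R²: adding a weak explanatory variable can raise R² slightly yet lower adjusted R².

```python
def adjusted_r2(r2, n, k):
    # R^2 adjusted for sample size n and number of explanatory variables k
    return 1.0 - (1.0 - r2) * (n - 1) / (n - k - 1)

# Hypothetical model comparison: the third variable raised R^2 from
# 0.800 to 0.803, but adjusted R^2 went down -- so it should be dropped
print(f"two variables:   {adjusted_r2(0.800, 30, 2):.4f}")
print(f"three variables: {adjusted_r2(0.803, 30, 3):.4f}")
```

Unlike R², which can never decrease when a variable is added, adjusted R² penalizes each extra explanatory variable, so it can only improve when the new variable explains enough additional variation to justify the lost degree of freedom.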

Summary and Homework
Summary
- In a least-squares regression model, we can test whether the slope and the intercept differ significantly from 0
- We can compute confidence and prediction intervals to describe predicted values of the response variable y
- We can include multiple explanatory variables x to form a multiple regression model
Homework
- pg 784 – 787; 3, 5, 7