G89.2229 Lect 5W: Regression assumptions; Inferences about regression; Predicting Day 29 Anxiety; Analysis of sets of variables: partitioning the sums of squares


1 G89.2229 Lect 5W
Regression assumptions
Inferences about regression
Predicting Day 29 Anxiety
Analysis of sets of variables: partitioning the sums of squares
Predicting Day 29 Anger with two Day 28 support measures, after adjusting for Day 28 Mood
G89.2229 Multiple Regression, Week 5 (Wednesday)

2 G89.2229 Lect 5W
Usual OLS Regression Assumptions
Needed for unbiased estimates
»Model is properly specified: Linear model? Selection characteristics included as IVs? Reliable IVs?
Needed for efficient estimates
»Independent observations
»Homoscedastic residuals
Needed for inference
»Independent residuals
»Homoscedastic residuals
»Normally distributed residuals

3 G89.2229 Lect 5W
Inferences about regression
Suppose we believe that reaction time, Y, is inversely (linearly) related to the amount of cereal subjects eat.
»Y = A + BX + e, where B < 0
We collect data from 20 students on the grams (×10) of cereal they ate, and we measure their reaction time in identifying ambiguous stimuli.
Suppose we obtain estimates Â = 453 and B̂ = -0.6.
Question: Is there really evidence that X and Y are related?
»Can we reject H0: B = 0? (In this case H0 is credible!)
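As an illustration, here is a minimal Python sketch of this setup. The cereal-study data are not available, so the sample below is simulated; n = 20 and the true parameter values are invented to mimic the slide's estimates.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20
x = rng.uniform(5, 30, n)                      # grams (x10) of cereal eaten (simulated)
y = 453 - 0.6 * x + rng.normal(0, 15, n)       # reaction time with random error

# OLS estimates for Y = A + B*X + e
X = np.column_stack([np.ones(n), x])
(A_hat, B_hat), *_ = np.linalg.lstsq(X, y, rcond=None)
print(f"A-hat = {A_hat:.1f}, B-hat = {B_hat:.3f}")
```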

4 G89.2229 Lect 5W
B̂ estimates are random variables
Even if B is truly zero, it is unlikely that B̂ will be exactly zero.
»The least squares criterion guarantees that B̂ will fit even chance association between Y and X.
»Especially for small samples, chance associations can be striking.
Example of chance results (see the sketch below):
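One way to see how striking chance results can be is to simulate many small samples in which the true B is exactly zero and look at the spread of the fitted slopes. This sketch is not from the lecture; the sample size and number of replications are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(1)
n, reps = 20, 5000
slopes = np.empty(reps)
for i in range(reps):
    x = rng.normal(size=n)
    y = rng.normal(size=n)                 # Y unrelated to X: true B = 0
    slopes[i] = np.polyfit(x, y, 1)[0]     # fitted slope B-hat

# Even with B = 0, least squares fits the chance association in each sample
print("SD of B-hat across samples:", slopes.std().round(3))
print("Largest |B-hat| observed:", np.abs(slopes).max().round(3))
```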

5 G89.2229 Lect 5W
The compelling nature of random patterns
Formal statistical inference methods tell us how often to expect such striking patterns by chance alone.
Two approaches:
»Wald test (ratio of B̂ to its standard error, sd_B̂)
»ANOVA test
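Both approaches can be run on one simulated sample; in simple regression they agree exactly, since t² = F. The use of statsmodels here is a convenience, not the software used in the course.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 20
x = rng.normal(size=n)
y = 1.0 + 0.0 * x + rng.normal(size=n)     # true slope is zero

fit = sm.OLS(y, sm.add_constant(x)).fit()

# Wald test: ratio of B-hat to its standard error
t = fit.params[1] / fit.bse[1]
# ANOVA test: MSR/MSE for the whole (one-predictor) equation
print(f"t = {t:.3f}, t^2 = {t**2:.3f}, F = {fit.fvalue:.3f}")   # t^2 equals F
print(f"p-values: Wald {fit.pvalues[1]:.3f}, ANOVA {fit.f_pvalue:.3f}")
```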

6 G89.2229 Lect 5W
Multiple Regression Inference: Single variables
Y = B0 + B1X1 + B2X2 + ... + BqXq + e
Formal question: What can be said about an individual coefficient, Bq, in the context of the full model (i.e., "adjusting for X1, X2, ..., Xq-1")?
»Test the null hypothesis, H0: Bq = 0
»Compute a 95% CI, (Lq, Uq), around B̂q
»How much variance in Y does Xq account for, given that some variance is already fitted by X1, X2, ..., Xq-1?
Example from CCWA: Does gender add to the prediction of salary when experience and productivity are included in the model?
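A sketch of the CCWA-style question follows. The salary data and coefficient values are fabricated placeholders, simulated only to show the mechanics of testing one coefficient in the context of the full model.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 200
experience = rng.uniform(0, 20, n)
productivity = rng.normal(50, 10, n)
gender = rng.integers(0, 2, n)             # 0/1 indicator (hypothetical coding)
salary = (30 + 1.5 * experience + 0.8 * productivity
          + 2.0 * gender + rng.normal(0, 5, n))

X = sm.add_constant(np.column_stack([experience, productivity, gender]))
fit = sm.OLS(salary, X).fit()

# Coefficient for gender, adjusting for experience and productivity
b, (lo, hi), p = fit.params[3], fit.conf_int()[3], fit.pvalues[3]
print(f"B_gender = {b:.2f}, 95% CI ({lo:.2f}, {hi:.2f}), p = {p:.4f}")
```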

7 G89.2229 Lect 5W
Example: Predicting depressed mood on day 29
In the bar exam study, let's revisit the prediction of depression on day 29 as a function of depression and anxiety on day 28. What can we say about:
»The relation of anxiety28 to depression29 when depression28 is adjusted?
»The residual distribution?
»Homoscedasticity?
»Adequacy of the linear model?
»Alternative scaling of depression?
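The bar-exam data themselves are not reproduced here, so this sketch simulates stand-in variables (dep28, anx28, dep29 are assumed names) and runs the kinds of checks the slide lists: the adjusted coefficient, residual normality, and homoscedasticity.

```python
import numpy as np
import statsmodels.api as sm
from scipy.stats import shapiro
from statsmodels.stats.diagnostic import het_breuschpagan

rng = np.random.default_rng(4)
n = 120
dep28 = rng.normal(2.0, 0.8, n)
anx28 = 0.5 * dep28 + rng.normal(1.5, 0.7, n)
dep29 = 0.3 + 0.6 * dep28 + 0.2 * anx28 + rng.normal(0, 0.5, n)

X = sm.add_constant(np.column_stack([dep28, anx28]))
fit = sm.OLS(dep29, X).fit()

# Is anxiety28 related to depression29 once depression28 is adjusted?
print(f"B_anx28 = {fit.params[2]:.3f}, p = {fit.pvalues[2]:.4f}")
# Normality of residuals (Shapiro-Wilk) and homoscedasticity (Breusch-Pagan)
print("Shapiro-Wilk p:", round(shapiro(fit.resid).pvalue, 3))
print("Breusch-Pagan p:", round(het_breuschpagan(fit.resid, X)[1], 3))
```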

8 G89.2229 Lect 5W
Multiple Regression Inference: Fit of the whole equation
Example: Suppose that the outcome is the productivity of workgroups in a corporation, and the X's are characteristics of the work setting, such as space per employee, ambient noise level, distance to restrooms, etc.
Y = B0 + B1X1 + B2X2 + ... + BqXq + e
What can be said about the whole set of variables (i.e., X1, X2, ..., Xq) in relation to Y?
»Test the null hypothesis, H0: B1 = B2 = ... = Bq = 0
»Alternative formulation: H0: R² = 0
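The equivalence of the two formulations can be checked directly: the overall F can be written in terms of R². The workgroup data below are invented for illustration.

```python
import numpy as np
import statsmodels.api as sm
from scipy.stats import f as f_dist

rng = np.random.default_rng(5)
n, q = 60, 3
X = rng.normal(size=(n, q))                # e.g., space, noise, distance (simulated)
y = 10 + X @ np.array([0.5, -0.3, 0.0]) + rng.normal(0, 1, n)

fit = sm.OLS(y, sm.add_constant(X)).fit()

# H0: B1 = ... = Bq = 0, equivalently H0: R^2 = 0
R2 = fit.rsquared
F = (R2 / q) / ((1 - R2) / (n - q - 1))
p = f_dist.sf(F, q, n - q - 1)
print(f"F = {F:.3f} (statsmodels: {fit.fvalue:.3f}), p = {p:.4f}")
```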

9 G89.2229 Lect 5W
Decomposition of Regression and Residual Variance
Step 1: Estimate the regression coefficients using OLS and compute the predicted (fitted) values of Y (Ŷ).
Step 2: Estimate the Regression Sum of Squares, SSR = Σ(Ŷ − Ȳ)², and MSR = SSR/df.
Step 3: Estimate the Residual Sum of Squares, SSE = Σe², and MSE = SSE/df.
Under H0, MSR/MSE is distributed as a central F on (q, n−q−1) df.

Source       df       SS            MS
Regression   q        Σ(Ŷ − Ȳ)²    SSR/q
Residual     n−q−1    Σe²           SSE/(n−q−1)
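These three steps can be carried out by hand to reproduce the table above. A sketch with simulated data and q = 2 predictors:

```python
import numpy as np
from scipy.stats import f as f_dist

rng = np.random.default_rng(6)
n, q = 50, 2
X = np.column_stack([np.ones(n), rng.normal(size=(n, q))])
y = X @ np.array([1.0, 0.4, 0.0]) + rng.normal(0, 1, n)

# Step 1: OLS coefficients and fitted values Y-hat
b, *_ = np.linalg.lstsq(X, y, rcond=None)
y_hat = X @ b

# Step 2: Regression SS = sum of (Y-hat - Y-bar)^2, MSR = SSR/q
SSR = np.sum((y_hat - y.mean()) ** 2)
MSR = SSR / q
# Step 3: Residual SS = sum of e^2, MSE = SSE/(n - q - 1)
SSE = np.sum((y - y_hat) ** 2)
MSE = SSE / (n - q - 1)

F = MSR / MSE                              # ~ F(q, n-q-1) under H0
print(f"F = {F:.3f}, p = {f_dist.sf(F, q, n - q - 1):.4f}")
```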

10 G89.2229 Lect 5W
Test of Incremental R² due to Xq
Hierarchical Regression:
Fit the reference model with X1, X2, ..., Xq-1
»Determine its Regression Sum of Squares
»This determines the R² of the reference model
Fit the expanded model with Xq added to the reference model
»Determine the increase in the Regression Sum of Squares (SSq), on 1 df for the single predictor Xq
»This determines the R² increment, the "semipartial squared correlation"
Determine the Sum of Squares and Mean Square for the residual from the expanded model
»MSE is the mean square for residual, on (n−q−1) degrees of freedom
Under the null hypothesis H0: Bq = 0, MSq is simply fitted random variation, and MSq/MSE ~ F[1, (n−q−1)]
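A sketch of the hierarchical test, with simulated data standing in for the reference predictors and the added Xq:

```python
import numpy as np
import statsmodels.api as sm
from scipy.stats import f as f_dist

rng = np.random.default_rng(7)
n = 80
X_ref = rng.normal(size=(n, 2))            # reference predictors X1, X2
x_q = rng.normal(size=n)                   # predictor being tested
y = 0.5 * X_ref[:, 0] + 0.3 * x_q + rng.normal(0, 1, n)

ref = sm.OLS(y, sm.add_constant(X_ref)).fit()
exp = sm.OLS(y, sm.add_constant(np.column_stack([X_ref, x_q]))).fit()

# Increment in Regression SS on 1 df; MSE comes from the expanded model
SS_q = ref.ssr - exp.ssr                   # drop in residual SS = gain in SSR
MSE = exp.ssr / exp.df_resid               # df = n - q - 1
F = SS_q / MSE
print(f"R^2 increment (semipartial squared) = {exp.rsquared - ref.rsquared:.4f}")
print(f"F(1, {int(exp.df_resid)}) = {F:.3f}, p = {f_dist.sf(F, 1, exp.df_resid):.4f}")
# statsmodels computes the same test directly:
print(exp.compare_f_test(ref))             # (F, p, df difference)
```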

11 G89.2229 Lect 5W
Example: Predicting Anger on Day 29 with Day 28 Measures
Does Anger on day 28 improve the fit of Anger on day 29 after four other moods have been included in the model?
Do two emotional support variables on day 28 improve the fit of Anger29 after five moods have been included?
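The same machinery handles the second question, where a set of two predictors is added and the increment is tested on 2 numerator df. The variables below are simulated stand-ins for the five day-28 moods and the two support measures, not the actual study data.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(8)
n = 100
moods28 = rng.normal(size=(n, 5))          # five day-28 mood measures (simulated)
support28 = rng.normal(size=(n, 2))        # two day-28 emotional support measures
anger29 = (moods28 @ rng.normal(0, 0.3, 5)
           + 0.25 * support28[:, 0] + rng.normal(0, 1, n))

ref = sm.OLS(anger29, sm.add_constant(moods28)).fit()
exp = sm.OLS(anger29, sm.add_constant(np.column_stack([moods28, support28]))).fit()

# F test of the 2-df increment due to the support set
F, p, df_diff = exp.compare_f_test(ref)
print(f"F({int(df_diff)}, {int(exp.df_resid)}) = {F:.3f}, p = {p:.4f}")
```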

12 G89.2229 Lect 5W
Numerical Results