Lecture 10: Review of Lectures 8 and 9; Examples

Review of Lectures 8 and 9
Three Basic Concepts:
Full Model: contains ALL coefficients of interest
Reduced Model: contains PART of the coefficients of interest
Nested Models: one model is a SUBMODEL of the other
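To make the nesting idea concrete (an illustrative example of my own, not from the slides): if the Full Model is Y = β0 + β1 X1 + β2 X2 + β3 X3 + ε, then Y = β0 + β1 X1 + ε is a Reduced Model obtained by setting β2 = β3 = 0, and the two are nested because the reduced form is a special case of the full form. A model with a different predictor, say Y = β0 + β4 X4 + ε, is not nested in the Full Model, so the comparison test below does not apply to that pair.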

Review of Lectures 8 and 9
Steps for Model Comparison:
H0: The Reduced Model (RM) is adequate  vs  H1: The Full Model (FM) is adequate
Step 1: Fit the FM and get SSE (in the ANOVA table), df (in the ANOVA table), and R_sq (under the Coefficient Table).
Step 2: Fit the RM and get SSE, df, and R_sq.
Step 3: Compute the F-statistic:
    F = [ (SSE(RM) - SSE(FM)) / (df(RM) - df(FM)) ] / [ SSE(FM) / df(FM) ]
Step 4: Conclusion: Reject H0 if F > F(r, df(SSE, FM), alpha), where r = df(RM) - df(FM);
        otherwise, H0 cannot be rejected.
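A minimal sketch of these four steps in Python (assuming the statsmodels and scipy packages are available; the function name compare_models and its arguments are mine, not from the lecture):

```python
import statsmodels.api as sm
from scipy import stats

def compare_models(y, X_full, X_reduced, alpha=0.05):
    """F-test of H0: the Reduced Model is adequate vs H1: the Full Model is adequate."""
    fm = sm.OLS(y, sm.add_constant(X_full)).fit()        # Step 1: fit the FM
    rm = sm.OLS(y, sm.add_constant(X_reduced)).fit()     # Step 2: fit the RM

    r = rm.df_resid - fm.df_resid                        # number of constraints tested
    F = ((rm.ssr - fm.ssr) / r) / (fm.ssr / fm.df_resid) # Step 3: F-statistic
    F_crit = stats.f.ppf(1 - alpha, r, fm.df_resid)
    return F, F > F_crit                                 # Step 4: reject H0 if F > F_crit
```

The same comparison can also be obtained by passing the two fitted results to statsmodels.stats.anova.anova_lm.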

Review of Lectures 8 and 9
Special Case: ANOVA Table (Analysis of Variance)

Source       Sum of Squares   df       Mean Square         F-test        P-value
Regression   SSR              p        MSR = SSR/p         F = MSR/MSE
Residuals    SSE              n-p-1    MSE = SSE/(n-p-1)
Total        SST              n-1
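As a small self-check (a sketch of my own; it assumes SSR, SSE, n, and the number of predictors p are already known from a fitted model), the derived entries follow directly from the definitions above:

```python
from scipy import stats

def anova_table(SSR, SSE, n, p):
    """Return MSR, MSE, the overall F-statistic, and its p-value."""
    MSR = SSR / p                       # regression mean square
    MSE = SSE / (n - p - 1)             # residual mean square
    F = MSR / MSE
    p_value = stats.f.sf(F, p, n - p - 1)
    return MSR, MSE, F, p_value
```

This overall F is exactly the test of Case 1 on the next slide: all non-intercept coefficients equal to zero.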

Review of Lectures 8 and 9
Common Cases for Model Testing:
Case 1: ALL NON-intercept coefficients are zero
Case 2: SOME of the coefficients are zero
Case 3: SOME of the coefficients are EQUAL to each other
Case 4: Other specified CONSTRAINTS on the coefficients
All of these cases can be tested using the F-test.
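For example, Case 3 reduces to a nested-model comparison by building the constraint into the design matrix: to test H0: β1 = β2, replace the separate columns X1 and X2 by their sum in the Reduced Model. A hedged sketch with simulated data (the data and variable names are mine, purely for illustration):

```python
import numpy as np
import statsmodels.api as sm
from scipy import stats

# Simulated data in which beta1 = beta2 actually holds (illustrative only).
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))                            # columns: x1, x2, x3
y = 2 + 1.5 * X[:, 0] + 1.5 * X[:, 1] - 0.5 * X[:, 2] + rng.normal(size=50)

full = sm.OLS(y, sm.add_constant(X)).fit()              # y ~ x1 + x2 + x3
X_eq = np.column_stack([X[:, 0] + X[:, 1], X[:, 2]])    # imposes beta1 = beta2
reduced = sm.OLS(y, sm.add_constant(X_eq)).fit()        # y ~ (x1 + x2) + x3

r = reduced.df_resid - full.df_resid
F = ((reduced.ssr - full.ssr) / r) / (full.ssr / full.df_resid)
print(F, stats.f.sf(F, r, full.df_resid))               # a large p-value is expected here
```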

Examples

Problem 3.5 (page 76, textbook)
Table 3.11 shows the regression output, with some numbers erased, when a simple regression model relating a response variable Y to a predictor variable X1 is fitted based on 20 observations. Complete the 13 missing numbers, then compute Var(Y) and Var(X1).

ANOVA Table
Source       Sum of Squares   df   Mean Square   F-test
Regression   1848.76
Residual
Total

Coefficient Table
Variable   Coefficients   s.e.    T-test   P-value
Constant   -23.4325       12.74            .0824
X1         .1528                  8.32     <.0001

n =        R^2 =        Ra^2 =        S =        df =
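One way to recover the erased entries (a sketch of the reasoning, not the textbook's solution; it uses only the numbers shown above plus standard simple-linear-regression identities):

```python
import numpy as np

n = 20
SSR = 1848.76                    # regression sum of squares (given)
b1, t_x1 = 0.1528, 8.32          # slope and its t-statistic (given)
b0, se_b0 = -23.4325, 12.74      # intercept and its standard error (given)

df_reg, df_res = 1, n - 2        # one predictor, n - 2 residual df
MSR = SSR / df_reg
F = t_x1 ** 2                    # in simple regression the overall F equals t(slope)^2
MSE = MSR / F                    # because F = MSR / MSE
SSE = MSE * df_res
SST = SSR + SSE

R2 = SSR / SST
Ra2 = 1 - (1 - R2) * (n - 1) / df_res
S = np.sqrt(MSE)

se_b1 = b1 / t_x1                # s.e. of the slope
t_b0 = b0 / se_b0                # t-statistic of the intercept

var_Y = SST / (n - 1)            # sample variance of Y
Sxx = (S / se_b1) ** 2           # from se(b1) = S / sqrt(Sxx)
var_X1 = Sxx / (n - 1)           # sample variance of X1
```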


Examples

Problem 3.12 (page 78, textbook)
Table 3.14 shows the regression output of a MLR model relating the beginning salaries in dollars of employees in a given company to the following predictor variables:
Sex (X1): an indicator variable (man = 1, woman = 0)
Education (X2): years of schooling at the time of hire
Experience (X3): number of months of previous work experience
Months (X4): number of months with the company
In (a)-(b) below, specify the null and alternative hypotheses, the test used, and your conclusion, using a 5% level of significance.

Examples

Table 3.14
ANOVA Table
Source       Sum of Squares   df   Mean Square   F-test
Regression   23665352         4    5916338       22.98
Residual     22657938         88   257477
Total        46323290         92

Coefficient Table
Variable     Coefficients   s.e.    T-test   P-value
Constant     3526.4         327.7   10.76    .000
Sex          722.5          117.8   6.13
Education    90.02          24.69   3.65
Experience   1.2690         .5877   2.16     .034
Months       23.406         5.201   4.50

n = 93    R^2 = .515    Ra^2 = .489    S = 507.4    df = 88

Examples

(a) Conduct the F-test for the overall fit of the regression (F(4, 88, .05) < 2.53).
Test H0: β1 = β2 = β3 = β4 = 0  vs  H1: at least one βj ≠ 0
Statistic: F = 22.98, df = (4, 88)
Conclusion: Reject H0; the overall fit is significant.

(b) Is there a positive linear relationship between Salary and Experience, after accounting for the effect of the variables Sex, Education, and Months?
Test H0: β3 = 0  vs  H1: β3 > 0
Statistic: T = 2.16, P-value = .034/2 = .017
Conclusion: Reject H0; the positive relationship is significant at the 5% significance level.
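A quick numerical check of both conclusions (a sketch; the critical value and one-sided p-value are computed with scipy from the entries in Table 3.14):

```python
from scipy import stats

# (a) Overall F-test
F, df1, df2 = 22.98, 4, 88
F_crit = stats.f.ppf(0.95, df1, df2)          # roughly 2.48, consistent with the slide's bound of 2.53
print(F > F_crit, stats.f.sf(F, df1, df2))    # True -> reject H0

# (b) One-sided t-test for Experience (H1: beta3 > 0)
t, df_resid = 2.16, 88
p_one_sided = stats.t.sf(t, df_resid)         # roughly .017, i.e. half of the two-sided .034
print(p_one_sided < 0.05)                     # True -> reject H0
```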

Examples

(c) What salary would you forecast for a man with 12 years of Education, 10 months of Experience, and 15 months with the company?

(d) What salary would you forecast, on average, for a man with 12 years of Education, 10 months of Experience, and 15 months with the company?
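The point forecast is the same for (c) and (d); the two parts differ only in the interval that should accompany it (a prediction interval for an individual in (c), a confidence interval for the mean response in (d)). A sketch of the point forecast from the Table 3.14 coefficients:

```python
# Coefficients from Table 3.14: intercept, Sex, Education, Experience, Months
b0, b_sex, b_edu, b_exp, b_mon = 3526.4, 722.5, 90.02, 1.2690, 23.406

# Man (Sex = 1) with 12 years of Education, 10 months of Experience, 15 Months
y_hat = b0 + b_sex * 1 + b_edu * 12 + b_exp * 10 + b_mon * 15
print(round(y_hat, 2))     # about 5692.92 dollars
```

For part (e) on the next slide, the same calculation applies with Sex = 0.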

Examples

(e) What salary would you forecast, on average, for a woman with 12 years of Education, 10 months of Experience, and 15 months with the company?

Problem 3.13 (page 79, textbook)
Consider the regression model that generated the output in Table 3.14 to be a Full Model. Now consider the Reduced Model in which Salary is regressed on Education only. The ANOVA table obtained when fitting this model is shown in Table 3.15. Conduct a single test to compare the Full and Reduced Models. What conclusion can be drawn from the result of the test? (Use a 5% significance level.)

Examples

Table 3.15
ANOVA Table
Source       Sum of Squares   df   Mean Square   F-test
Regression   7862535          1                  18.60
Residual     38460756         91   422646
Total        46323291         92

Test H0: the Reduced Model is adequate (β1 = β3 = β4 = 0)  vs  H1: the Full Model is adequate
Statistic: SSE(R) = 38460756, df(R) = 91; SSE(F) = 22657938, df(F) = 88
F = [(38460756 - 22657938) / (91 - 88)] / [22657938 / 88] = 20.46, df = (3, 88)
Conclusion: Reject H0, since F = 20.46 > F(3, 88, .05); the Reduced Model is not adequate, so the Full Model should be retained.
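The same comparison checked numerically (a sketch plugging the sums of squares from Tables 3.14 and 3.15 into the model-comparison F-statistic from the review slides):

```python
from scipy import stats

SSE_R, df_R = 38460756, 91   # Reduced Model: Salary on Education only (Table 3.15)
SSE_F, df_F = 22657938, 88   # Full Model: Sex, Education, Experience, Months (Table 3.14)

r = df_R - df_F
F = ((SSE_R - SSE_F) / r) / (SSE_F / df_F)
F_crit = stats.f.ppf(0.95, r, df_F)
print(F, F_crit, F > F_crit)   # F is about 20.5, far above the critical value -> reject H0
```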