ST3131, Lecture 12: More Examples for SLR and More Examples for MLR

More Examples for SLR

Example 1 (3.10, page 77) One may wonder whether people of similar heights tend to marry each other. For this purpose, a sample of newly married couples was selected. Let X = height of the husband and Y = height of the wife. The data can be downloaded from the class website. Consider the model Wife = beta0 + beta1 Husband + error.
(a) Test whether the intercept and the slope are each zero.
(b) Is it true that, in general, a taller husband tends to have a taller wife? Can you verify this statistically?
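A minimal sketch of how this fit and the coefficient tests could be reproduced in Python with statsmodels; the file name P049.txt and the column names Husband and Wife are assumptions taken from the Minitab output on the next slide:

```python
# Sketch: fit Wife = beta0 + beta1 Husband and test each coefficient.
import pandas as pd
import statsmodels.formula.api as smf

couples = pd.read_csv("P049.txt", sep=r"\s+")      # assumed whitespace-delimited file

fit = smf.ols("Wife ~ Husband", data=couples).fit()
print(fit.summary())     # reports t-tests and p-values for the intercept and the slope
```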

Results for: P049.txt

Regression Analysis: Wife versus Husband

The regression equation is
Wife = 41.9 + 0.700 Husband

Predictor      Coef       SE Coef      T       P
Constant      41.93        10.66      3.93   0.000
Husband      0.69965      0.06106    11.46   0.000

S = 5.928    R-Sq = 58.3%    R-Sq(adj) = 57.8%

Analysis of Variance
Source            DF      SS        MS        F        P
Regression         1    4613.7    4613.7    131.29   0.000
Residual Error    94    3303.3      35.1
Total             95    7917.0

Descriptive Statistics: Wife, Husband

Variable     N     Mean     Median    TrMean    StDev    SE Mean
Wife        96    163.90    164.50    164.10     9.13      0.93
Husband     96    174.32    175.50    174.49     9.96      1.02

Variable    Minimum    Maximum      Q1        Q3
Wife         141.00     181.00    158.00    170.75
Husband      152.00     192.00    166.25    182.75

(c) Suppose that Tom is 165 cm tall. What is the estimated height of his future wife? Give a 95% prediction interval for her height.
(d) Suppose that Bill is 270 cm tall. How tall would his future wife be expected to be? Give a 95% confidence interval for the expected height of his wife.
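A hedged sketch of how (c) and (d) could be computed with statsmodels; it refits the model above (same assumed file and column names) and uses get_prediction for both interval types:

```python
import pandas as pd
import statsmodels.formula.api as smf

couples = pd.read_csv("P049.txt", sep=r"\s+")      # assumed whitespace-delimited file
fit = smf.ols("Wife ~ Husband", data=couples).fit()

new = pd.DataFrame({"Husband": [165, 270]})
frame = fit.get_prediction(new).summary_frame(alpha=0.05)

# (c) 95% prediction interval for an individual wife's height (Tom, 165 cm)
print(frame.loc[0, ["mean", "obs_ci_lower", "obs_ci_upper"]])

# (d) 95% confidence interval for the mean wife height (Bill, 270 cm);
# 270 cm is far outside the observed husband range (152-192 cm), so this
# is an extrapolation and should be treated with caution.
print(frame.loc[1, ["mean", "mean_ci_lower", "mean_ci_upper"]])
```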

More Examples for MLR

Example 1 (3.3, page 75) Table 3.10 shows the scores in the final examination F and the scores in two preliminary examinations P1 and P2 for 22 students in a statistics course. The data can be found on the class website.
(a) Fit each of the following models to the data:
Model 1: F = beta0 + beta1 P1 + error
Model 2: F = beta0 + beta1 P2 + error
Model 3: F = beta0 + beta1 P1 + beta2 P2 + error
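A minimal sketch of fitting the three models in Python with statsmodels; the file name P076.txt and the column names F, P1 and P2 are assumptions taken from the Minitab output on the following slides:

```python
import pandas as pd
import statsmodels.formula.api as smf

exams = pd.read_csv("P076.txt", sep=r"\s+")        # assumed whitespace-delimited file

models = {
    "F ~ P1":      smf.ols("F ~ P1", data=exams).fit(),
    "F ~ P2":      smf.ols("F ~ P2", data=exams).fit(),
    "F ~ P1 + P2": smf.ols("F ~ P1 + P2", data=exams).fit(),
}

for name, m in models.items():
    print(name, " R-Sq =", round(m.rsquared, 3), " R-Sq(adj) =", round(m.rsquared_adj, 3))
```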

Results for: P076.txt

Regression Analysis: F versus P1

The regression equation is
F = -22.3 + 1.26 P1

Predictor      Coef      SE Coef      T       P
Constant     -22.34       11.56     -1.93   0.068
P1           1.2605       0.1399     9.01   0.000

S = 5.081    R-Sq = 80.2%    R-Sq(adj) = 79.2%

Analysis of Variance
Source            DF      SS        MS       F       P
Regression         1    2094.7    2094.7   81.14   0.000
Residual Error    20     516.3      25.8
Total             21    2611.1

Regression Analysis: F versus P2

The regression equation is
F = -1.85 + 1.00 P2

Predictor      Coef       SE Coef      T       P
Constant     -1.854        7.562     -0.25   0.809
P2           1.00427      0.09059    11.09   0.000

S = 4.275    R-Sq = 86.0%    R-Sq(adj) = 85.3%

Analysis of Variance
Source            DF      SS        MS        F        P
Regression         1    2245.6    2245.6    122.89   0.000
Residual Error    20     365.5      18.3
Total             21    2611.1

Regression Analysis: F versus P1, P2

The regression equation is
F = -14.5 + 0.488 P1 + 0.672 P2

Predictor      Coef      SE Coef      T       P
Constant    -14.501       9.236     -1.57   0.133
P1           0.4883       0.2330     2.10   0.050
P2           0.6720       0.1793     3.75   0.001

S = 3.953    R-Sq = 88.6%    R-Sq(adj) = 87.4%

Analysis of Variance
Source            DF      SS        MS       F       P
Regression         2    2314.3    1157.1   74.07   0.000
Residual Error    19     296.8      15.6
Total             21    2611.1
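Beyond comparing the R-Sq values of the three outputs, a partial F-test between nested models shows whether each preliminary exam adds information once the other is included. A minimal sketch using statsmodels' anova_lm (same assumed file and column names as above):

```python
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

exams = pd.read_csv("P076.txt", sep=r"\s+")        # assumed whitespace-delimited file
full = smf.ols("F ~ P1 + P2", data=exams).fit()

# Does P2 add to a model that already contains P1, and vice versa?
print(anova_lm(smf.ols("F ~ P1", data=exams).fit(), full))
print(anova_lm(smf.ols("F ~ P2", data=exams).fit(), full))
```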

(c) Which variable individually, P1 or P2, is a better predictor of F?
(d) Which of the three models would you use to predict the final examination score for a student who scored 78 and 85 on the first and second preliminary exams, respectively? What is your prediction in this case?
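For reference, plugging P1 = 78 and P2 = 85 into the fitted full model from the output above gives F-hat = -14.501 + 0.4883(78) + 0.6720(85) ≈ 80.7; whether the full model is in fact the best of the three choices is what part (d) asks you to justify.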

Example 2 (3.11, page 79) To decide whether a company is discriminating against women, the following data were collected from the company's records:

Salary = annual salary in thousands of dollars
Qualification = an index of employee qualification
Sex = 1 for a male employee, 0 for a female employee

Two linear models were fit to the data, with the following regression outputs:

Model 1: Salary = Constant + Beta1 Qualification + Beta2 Sex + error

Variable         Coefficient     s.e.      t-test    p-value
Constant         20009.5         .8244     24271     <.0001
Qualification    .935253         .0500     18.7
Sex              .224337         .4681     .479      .6329

Model 2: Qualification = Constant + Beta1 Sex + Beta2 Salary + error

Variable         Coefficient     s.e.      t-test    p-value
Constant         -16744.4        896.4     -18.7     <.0001
Sex              .850979         .4349     1.96      .0532
Salary           .836991         .0448     18.7

Suppose that the usual regression assumptions hold.
(a) Are men paid more than equally qualified women?
(b) Are men less qualified than equally paid women?
(c) Do you detect any inconsistency in the above results? Explain.
(d) Which model would you advocate if you were the defense lawyer? Explain.

Example 3 (3.15, page 80) Consider the two models:
(a) Develop an F-test for testing the above hypotheses.
(b) Let p = 1 (SLR) and construct a data set Y and X1 such that H0 is not rejected at the 5% significance level.
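As a reminder of the general framework that part (a) relies on: when a reduced model (RM, the model under H0) is nested in a full model (FM), the F-statistic for comparing them is

F = [(SSE(RM) - SSE(FM)) / (df_RM - df_FM)] / [SSE(FM) / df_FM],

where SSE and df denote the residual sum of squares and residual degrees of freedom of each model; under H0 this statistic follows an F distribution with (df_RM - df_FM, df_FM) degrees of freedom.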

(c) What does the null hypothesis indicate in this case?
(d) Compute the appropriate value of R-square that relates the above two models.
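A sketch of one way to tackle part (b), assuming (as in the usual reduced-versus-full comparison) that H0 states there is no linear relationship between Y and X1; if the exercise's H0 instead concerns a different coefficient, the same simulate-and-test idea applies with the corresponding p-value. The data set below is only an illustration: Y is generated independently of X1, so the overall F-test is usually not significant at the 5% level (if a particular seed happens to give a significant result, another seed can be used).

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)                        # fixed seed for reproducibility
sim = pd.DataFrame({"X1": np.arange(1.0, 21.0)})      # 20 arbitrary X1 values
sim["Y"] = 50 + rng.normal(scale=5, size=len(sim))    # Y does not depend on X1

fit = smf.ols("Y ~ X1", data=sim).fit()
print(fit.f_pvalue)         # overall F-test p-value; expected to exceed 0.05
print(fit.pvalues["X1"])    # equivalent slope t-test when p = 1
```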