Homework #17        Score ____________ / 10        Name ______________

Text Exercise 5.20

(a) You should realize that there is only one qualitative independent variable to identify: type of shade (man-made, tree, none).

(b) Do this by first defining the appropriate dummy variable(s), and then writing a regression model.

X₁ = 1 for man-made shade, 0 otherwise
X₂ = 1 for tree shade, 0 otherwise

Y = β₀ + β₁X₁ + β₂X₂ + ε   or   E(Y) = β₀ + β₁X₁ + β₂X₂

Check some answers before submitting Homework #17.

(c) Do this by writing a statement interpreting each parameter in the model of part (b).

β₀ = the mean of Y for no shade
β₁ = the amount that the mean milk production for man-made shade exceeds the mean milk production for no shade
β₂ = the amount that the mean milk production for tree shade exceeds the mean milk production for no shade
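As a side note for readers who want to check the coding by hand, here is a minimal Python sketch (not part of the course materials, which use SPSS) showing how the three shade categories map to the two dummy variables defined in part (b); the column names shade, X1, and X2 are just illustrative labels.

```python
import pandas as pd

# Illustrative coding check: the three shade categories and the two
# 0/1 dummy variables defined in part (b).  "none" is the baseline
# level, so it gets X1 = 0 and X2 = 0.
shade = pd.Series(["man-made", "tree", "none"], name="shade")

dummies = pd.DataFrame({
    "shade": shade,
    "X1": (shade == "man-made").astype(int),  # 1 for man-made shade, 0 otherwise
    "X2": (shade == "tree").astype(int),      # 1 for tree shade, 0 otherwise
})
print(dummies)
# In E(Y) = b0 + b1*X1 + b2*X2, b0 is the no-shade mean and b1, b2 are the
# man-made and tree differences from that baseline, as interpreted in (c).
```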

Additional HW Exercise 5.2

The mean length of fish is being studied for North Lake, Blue Lake, and Harvey Lake. A 0.05 significance level is chosen for a hypothesis test to see if there is any evidence that mean length of fish is not the same for the three lakes. Fish are randomly selected from each lake, and the lengths in inches are recorded as follows:

North    Blue    Harvey

(a) After defining the appropriate dummy variable(s), write a regression model for the prediction of Y = "length of fish" from "lake".

X₁ = 1 for North Lake, 0 otherwise
X₂ = 1 for Blue Lake, 0 otherwise

Y = β₀ + β₁X₁ + β₂X₂ + ε   or   E(Y) = β₀ + β₁X₁ + β₂X₂

(b) With the model defined in part (a), write the formula for the population mean length of fish for each lake.

mean for North = β₀ + β₁
mean for Blue = β₀ + β₂
mean for Harvey = β₀

(c) Do each of the following calculations:

ȳ_N = ( ) / 5 = 16
ȳ_B = ( ) / 5 = 14
ȳ_H = ( ) / 5 = 12
ȳ = ( … ) / 15 = 14

SSR = (5)(16 – 14)² + (5)(14 – 14)² + (5)(12 – 14)² = 40
SST = (13 – 14)² + (17 – 14)² + … + (13 – 14)² + (11 – 14)² = 88
SSE = 88 – 40 = 48
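A short script can verify the sums of squares above; this is a sketch in Python, purely for checking, and it uses only the group means, group size, and SST already reported, since the raw lengths sit in the exercise's data table.

```python
# Summary quantities reported in part (c); the raw lengths come from the
# data table in the exercise, so only these summaries are used here.
means = {"North": 16, "Blue": 14, "Harvey": 12}   # sample mean length per lake
n = 5                                             # fish sampled per lake
grand_mean = 14                                   # mean of all 15 lengths
sst = 88                                          # total sum of squares

# SSR = sum over lakes of n * (group mean - grand mean)^2
ssr = sum(n * (m - grand_mean) ** 2 for m in means.values())
sse = sst - ssr                                   # error sum of squares

print(ssr, sse)                                   # 40 48
```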

Additional HW Exercise 5.2 - continued

(d) Complete the one-way ANOVA table displayed.

Source    df    SS    MS    f       P-value
Lake       2    40    20    5.00    0.025 < P < 0.05
Error     12    48     4
Total     14    88

(e) Summarize the results (Step 4) of the f test to see if there is sufficient evidence that mean length of fish is not the same for the three lakes at the 0.05 level.

Since f2,12 = 5.00 and f2,12;0.05 = 3.89, we have sufficient evidence to reject H₀. We conclude that mean length of fish is not the same for the three lakes (0.025 < P < 0.05).
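If scipy is available, the f statistic, critical value, and P-value bracket in parts (d) and (e) can be reproduced as follows (an illustration, not something the exercise asks for):

```python
from scipy import stats

# Mean squares from the ANOVA table in part (d)
msr = 40 / 2                          # MS for Lake
mse = 48 / 12                         # MS for Error
f_stat = msr / mse                    # 5.00

f_crit = stats.f.ppf(0.95, 2, 12)     # f2,12;0.05 ≈ 3.89
p_value = stats.f.sf(f_stat, 2, 12)   # ≈ 0.026, so 0.025 < P < 0.05

print(round(f_stat, 2), round(f_crit, 2), round(p_value, 3))
```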

(f) Based on the results in part (e), decide whether or not a multiple comparison method is necessary. If no, then explain why not; if yes, then use Tukey's HSD multiple comparison method.

Since we have rejected H₀, we need a multiple comparison method to identify for which lakes mean length of fish is significantly different.

k = 3    s² = 4    v = 12    q0.05(3,12) = 3.77

North vs. Blue:    q0.05(k,v) √[(s²/2)(1/nN + 1/nB)] = 3.77 √[(4/2)(1/5 + 1/5)] = 3.4
North vs. Harvey:  q0.05(k,v) √[(s²/2)(1/nN + 1/nH)] = 3.77 √[(4/2)(1/5 + 1/5)] = 3.4
Blue vs. Harvey:   q0.05(k,v) √[(s²/2)(1/nB + 1/nH)] = 3.77 √[(4/2)(1/5 + 1/5)] = 3.4

ȳ_N = 16    ȳ_B = 14    ȳ_H = 12
ȳ_N – ȳ_B = 2    ȳ_N – ȳ_H = 4    ȳ_B – ȳ_H = 2

With α = 0.05, we conclude that mean length of fish is larger for North Lake than for Harvey Lake.
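The HSD cutoff of 3.4 can likewise be checked in Python; the sketch below assumes scipy 1.7 or later for scipy.stats.studentized_range.

```python
import math
from scipy.stats import studentized_range

k, v = 3, 12                  # number of lakes, error degrees of freedom
mse, n = 4, 5                 # error mean square and common sample size

q = studentized_range.ppf(0.95, k, v)               # q0.05(3,12) ≈ 3.77
hsd = q * math.sqrt((mse / 2) * (1 / n + 1 / n))    # ≈ 3.4

# Pairwise differences in sample means; only North vs. Harvey exceeds the HSD.
diffs = {"N-B": 16 - 14, "N-H": 16 - 12, "B-H": 14 - 12}
print(round(hsd, 2), {pair: d > hsd for pair, d in diffs.items()})
```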

Additional HW Exercise 5.3

A study is being conducted to see if the drying time for a brand of outdoor paint can be predicted from temperature in °F and wind velocity in miles per hour. Data from random observations have been stored in the SPSS data file drypaint.

(a) Write a first-order model for the prediction of drying time from temperature and wind velocity.

drtm = β₀ + β₁(tmp) + β₂(wnd) + ε

(b) Use the Analyze > Regression > Linear options in SPSS to obtain SPSS output displaying the ANOVA table and the coefficients in the least squares prediction equation for the first-order model in part (a). To have the mean and standard deviation displayed for the dependent and independent variables, click on the Statistics button, and select the Descriptives option. Title the output to identify the homework exercise (Additional HW Exercise 5.3 - part (b)), your name, today's date, and the course number (Math 214). Use the File > Print Preview options to see if any editing is needed before printing the output. Attach the printed copy to this assignment before submission.
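For readers without SPSS, an equivalent first-order fit can be sketched in Python with statsmodels; this assumes the drypaint data has been exported to a CSV file named drypaint.csv with columns drtm, tmp, and wnd (the file name and export step are an assumption, not part of the assignment).

```python
import pandas as pd
import statsmodels.formula.api as smf

# Assumed export of the SPSS file "drypaint" to CSV, keeping the variable
# names used in the exercise: drtm, tmp, wnd.
paint = pd.read_csv("drypaint.csv")

# First-order model from part (a): drtm = b0 + b1*tmp + b2*wnd + error
first_order = smf.ols("drtm ~ tmp + wnd", data=paint).fit()

# summary() shows the ANOVA f statistic, its P-value, and the least squares
# coefficients, i.e. the pieces of the SPSS output requested in part (b).
print(first_order.summary())
```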

(c) Summarize the results (Step 4) of the f test to see if there is sufficient evidence that the prediction of drying time from temperature and wind velocity is significant at the 0.05 level.

Since f2,19 = ______ and f2,19;0.05 = 3.52, we have sufficient evidence to reject H₀. We conclude that the prediction of drying time from temperature and wind velocity is significant (P < 0.01) OR (P < 0.001).

Additional HW Exercise 5.3 - continued

(d) In order to see if we can improve prediction, the complete second-order model is now considered. Write a complete second-order model for the prediction of drying time with independent variables temperature and wind velocity.

drtm = β₀ + β₁(tmp) + β₂(wnd) + β₁₂(tmp)(wnd) + β₁₁(tmp)² + β₂₂(wnd)² + ε

(e) Use SPSS to create three new variables: one named tmp_wnd equal to the product of temperature and wind velocity, one named tmp2 equal to the square of temperature, and one named wnd2 equal to the square of wind velocity. Use the Analyze > Regression > Linear options in SPSS to obtain SPSS output displaying the ANOVA table and the coefficients in the least squares prediction equation for the complete second-order model in part (d). Since you only need the ANOVA table and the coefficients in the least squares prediction equation, you may choose to delete all other sections of the SPSS output. Title the output to identify the homework exercise (Additional HW Exercise 5.3 - part (e)), your name, today's date, and the course number (Math 214). Use the File > Print Preview options to see if any editing is needed before printing the output. Attach the printed copy to this assignment before submission.
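The same steps can be sketched outside SPSS; assuming the drypaint.csv export described after part (b), the three new variables and the complete second-order fit would look like this:

```python
import pandas as pd
import statsmodels.formula.api as smf

paint = pd.read_csv("drypaint.csv")        # assumed CSV export, as before

# The three new variables named as in part (e): tmp_wnd, tmp2, wnd2.
paint["tmp_wnd"] = paint["tmp"] * paint["wnd"]
paint["tmp2"] = paint["tmp"] ** 2
paint["wnd2"] = paint["wnd"] ** 2

# Complete second-order model from part (d).
second_order = smf.ols("drtm ~ tmp + wnd + tmp_wnd + tmp2 + wnd2",
                       data=paint).fit()
print(second_order.summary())              # ANOVA table and coefficients
```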

(f) Summarize the results (Step 4) of the f test to see if there is sufficient evidence that the prediction of drying time using the complete second-order model with independent variables temperature and wind velocity is significant at the 0.05 level.

Since f5,16 = ______ and f5,16;0.05 = 2.85, we have sufficient evidence to reject H₀. We conclude that the prediction of drying time using the complete second-order model with independent variables temperature and wind velocity is significant (P < 0.01) OR (P < 0.001).

Additional HW Exercise 5.3 - continued

(g) Calculate the partial f statistic for the hypothesis test to see if there is sufficient evidence that adding temperature times wind velocity, temperature squared, and wind velocity squared, after temperature and wind velocity, is significant at the 0.05 level.

SSR(tmp, wnd, tmp_wnd, tmp2, wnd2) =
SSR(tmp, wnd) =
SSR(tmp_wnd, tmp2, wnd2 | tmp, wnd) = SSR(tmp, wnd, tmp_wnd, tmp2, wnd2) – SSR(tmp, wnd) =
MSR(tmp_wnd, tmp2, wnd2 | tmp, wnd) = SSR(tmp_wnd, tmp2, wnd2 | tmp, wnd) / (5 – 2) =
MSE(tmp, wnd, tmp_wnd, tmp2, wnd2) =
Numerator degrees of freedom for f statistic = 5 – 2 = 3
Denominator degrees of freedom for f statistic = 16
f3,16 = MSR(tmp_wnd, tmp2, wnd2 | tmp, wnd) / MSE(tmp, wnd, tmp_wnd, tmp2, wnd2) =
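Assuming the same drypaint.csv export, the partial f statistic in part (g) can be sketched in statsmodels either from the sums of squares, as the exercise does, or with a nested-model ANOVA:

```python
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

paint = pd.read_csv("drypaint.csv")                  # assumed CSV export
paint["tmp_wnd"] = paint["tmp"] * paint["wnd"]
paint["tmp2"] = paint["tmp"] ** 2
paint["wnd2"] = paint["wnd"] ** 2

reduced = smf.ols("drtm ~ tmp + wnd", data=paint).fit()
full = smf.ols("drtm ~ tmp + wnd + tmp_wnd + tmp2 + wnd2", data=paint).fit()

# Partial f by hand: the gain in regression SS from the 3 added terms,
# divided by 3, over MSE of the full model (denominator df = 16 here).
extra_ssr = reduced.ssr - full.ssr        # drop in residual SS = SSR gain
partial_f = (extra_ssr / 3) / full.mse_resid

# The same nested-model comparison done by statsmodels directly.
print(partial_f)
print(anova_lm(reduced, full))
```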