Exam 4 – Optional Times for Final


Exam 4 – Optional Times for Final
Two options for completing Exam 4:
Thursday (12/4/14) – the regularly scheduled time
Tuesday (12/9/14) – the optional later time
You must sign up today (12/2) to take Exam 4 at the later date, December 9th. Only need to take one exam – these are two optional times. No need to sign up if you are taking it at the regular time (December 4th).

Today (12/2/14) – use this as your study guide:
Multiple Regression Review
Exam 4 Review
Teacher Evaluations

MGMT 276: Statistical Inference in Management, Fall 2014 (Green sheets)

A note on doodling – Reminder: Talking or whispering to your neighbor can be a problem for us; please consider writing short notes.

Exam 4 – Optional Times for Final
Two options for completing Exam 4:
Thursday (12/4/14) – the regularly scheduled time
Tuesday (12/9/14) – the optional later time
Must sign up today (12/2) to take Exam 4 on the Tuesday date. Only need to take one exam – these are two optional times.

Schedule of readings – before our next exam (December 4th):
Lind (10 – 12); Chapter 13: Linear Regression and Correlation; Chapter 14: Multiple Regression; Chapter 15: Chi-Square
Plous (2, 3, & 4); Chapter 17: Social Influences; Chapter 18: Group Judgments and Decisions
Study Guide online (shorter and longer version)

Homework: No more homework!!
Please click in: My last name starts with a letter somewhere between
A. A – D    B. E – L    C. M – R    D. S – Z

Multiple regression equations
Can use variables to predict:
behavior of the stock market
probability of an accident
amount of pollution in a particular well
quality of a wine for a particular year
which candidates will make the best workers

Can use variables to predict which candidates will make the best workers. We measured current workers – the best workers tend to have the highest "success scores" (success scores range from 1 – 1,000). We want to predict which applicants will have the highest success score. We have found that these variables predict success: Age (X1), Niceness (X2), and Harshness (X3). Niceness and Harshness are both 10-point scales (Niceness: 10 = really nice; Harshness: 10 = really harsh). According to your research, age has only a small effect on success, while workers' attitude has a big effect: the best workers have high "niceness" scores and low "harshness" scores. Your results are summarized by this regression formula:
Y’ = b1X1 + b2X2 + b3X3 + a
Success score = (1)(Age) + (20)(Nice) + (-75)(Harsh) + 700

Y’ = b1X1 + b2X2 + b3X3 + a
According to your research, age has only a small effect on success, while workers' attitude has a big effect: the best workers have high "niceness" scores and low "harshness" scores. Your results are summarized by this regression formula:
Success score = (1)(Age) + (20)(Nice) + (-75)(Harsh) + 700

Y’ = b1X1 + b2X2 + b3X3 + a
Success score = (1)(Age) + (20)(Nice) + (-75)(Harsh) + 700
Y’ is the dependent variable – "Success score" is our dependent variable.
X1, X2, and X3 are the independent variables – "Age", "Niceness", and "Harshness" are our independent variables.
Each "b" is called a regression coefficient; each "b" shows the change in Y’ for each unit change in its own X, holding the other independent variables constant.
a is the Y-intercept.
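To make the pieces of this equation concrete, here is a minimal Python sketch (not from the original slides; the function name is an illustrative assumption) that evaluates the success-score prediction line:

```python
# Success-score prediction line from the slides:
# Y' = (1)(Age) + (20)(Nice) + (-75)(Harsh) + 700
def predict_success(age, nice, harsh):
    """Return the predicted success score (1-1,000 scale) for one applicant."""
    b1, b2, b3 = 1, 20, -75   # regression coefficients for Age, Nice, Harsh
    a = 700                   # Y-intercept
    return b1 * age + b2 * nice + b3 * harsh + a
```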

The Multiple Regression Equation – Interpreting the Regression Coefficients
Y’ = b1X1 + b2X2 + b3X3 + a
Success score = (1)(Age) + (20)(Nice) + (-75)(Harsh) + 700
b1 = the regression coefficient for age (X1), which is 1. The coefficient is positive and suggests a positive correlation between age and success: as age increases, the success score increases. The numeric value of the regression coefficient provides more information: if age increases by 1 year and we hold the other two independent variables constant, we can predict a 1-point increase in the success score.

b2 = the regression coefficient for niceness (X2), which is 20. The coefficient is positive and suggests a positive correlation between niceness and success: as niceness increases, the success score increases. The numeric value of the regression coefficient provides more information: if the "niceness score" increases by one and we hold the other two independent variables constant, we can predict a 20-point increase in the success score.

b3 = the regression coefficient for harshness (X3), which is -75. The coefficient is negative and suggests a negative correlation between harshness and success: as harshness increases, the success score decreases. The numeric value of the regression coefficient provides more information: if the "harshness score" increases by one and we hold the other two independent variables constant, we can predict a 75-point decrease in the success score.
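A quick way to see what "holding the other independent variables constant" means is to change one predictor by a single unit and watch the prediction move by exactly that predictor's coefficient. This sketch reuses the hypothetical predict_success function defined above; the applicant values are arbitrary.

```python
# Changing one predictor by 1 unit, with the others fixed, shifts the
# prediction by exactly that predictor's regression coefficient.
base = predict_success(age=40, nice=5, harsh=5)

print(predict_success(age=41, nice=5, harsh=5) - base)  # +1  (b1, Age)
print(predict_success(age=40, nice=6, harsh=5) - base)  # +20 (b2, Nice)
print(predict_success(age=40, nice=5, harsh=6) - base)  # -75 (b3, Harsh)
```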

Here comes Victoria; her scores are as follows: Age = 30, Niceness = 8, Harshness = 2. What would we predict her "success index" to be?
Prediction line: Y’ = b1X1 + b2X2 + b3X3 + a
Y’ = (1)(Age) + (20)(Nice) + (-75)(Harsh) + 700
Y’ = (1)(30) + (20)(8) + (-75)(2) + 700
Y’ = 30 + 160 - 150 + 700 = 740
We predict Victoria will have a Success Index of 740.

Here comes Victor; his scores are as follows: Age = 35, Niceness = 2, Harshness = 8. What would we predict his "success index" to be?
Prediction line: Y’ = b1X1 + b2X2 + b3X3 + a
Y’ = (1)(Age) + (20)(Nice) + (-75)(Harsh) + 700
Y’ = (1)(35) + (20)(2) + (-75)(8) + 700
Y’ = 35 + 40 - 600 + 700 = 175
We predict Victor will have a Success Index of 175 (recall that we predicted Victoria would have a Success Index of 740).
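As a check on the arithmetic, the same hypothetical predict_success function reproduces both predictions from the slides:

```python
# Victoria: Age = 30, Niceness = 8, Harshness = 2
print(predict_success(30, 8, 2))   # 30 + 160 - 150 + 700 = 740

# Victor: Age = 35, Niceness = 2, Harshness = 8
print(predict_success(35, 2, 8))   # 35 + 40 - 600 + 700 = 175
```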

We predict Victoria will have a Success Index of 740, and we predict Victor will have a Success Index of 175. We can use these variables to predict which candidates will make the best workers. Who will we hire?

Multiple Linear Regression – Example: Can we predict heating cost?
Three variables are thought to relate to heating costs: (1) the mean daily outside temperature, (2) the number of inches of insulation in the attic, and (3) the age in years of the furnace. To investigate, Salisbury's research department selected a random sample of 20 recently sold homes and determined the cost to heat each home last January.
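The slides go straight from this sample of 20 homes to the fitted coefficients. For readers who want to see how such coefficients are estimated, here is a hedged sketch using ordinary least squares in Python; the real sample values are not in the transcript, so the data below are simulated stand-ins for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20                              # same sample size as the example

# Synthetic stand-ins for the three predictors (NOT the real sample data):
temp = rng.uniform(10, 60, n)       # mean daily outside temperature (degrees)
insul = rng.uniform(2, 12, n)       # inches of attic insulation
age = rng.uniform(1, 15, n)         # age of furnace (years)

# Simulate heating costs from an assumed model plus noise, just so the
# least-squares fit below has something to recover.
cost = 400 - 4.5 * temp - 15 * insul + 6 * age + rng.normal(0, 20, n)

# Fit cost on the three predictors plus an intercept by least squares.
X = np.column_stack([np.ones(n), temp, insul, age])
b, *_ = np.linalg.lstsq(X, cost, rcond=None)
print(dict(zip(["intercept", "b_temp", "b_insul", "b_age"], b.round(3))))
```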

The Multiple Regression Equation – Interpreting the Regression Coefficients
b1 = the regression coefficient for mean outside temperature (X1), which is -4.583. The coefficient is negative and shows a negative correlation between heating cost and temperature: as the outside temperature increases, the cost to heat the home decreases. The numeric value of the regression coefficient provides more information: if we increase temperature by 1 degree and hold the other two independent variables constant, we can estimate a decrease of $4.583 in monthly heating cost.

b2 = the regression coefficient for mean attic insulation (X2), which is -14.83. The coefficient is negative and shows a negative correlation between heating cost and insulation: the more insulation in the attic, the less the cost to heat the home, so the negative sign for this coefficient is logical. For each additional inch of insulation, we expect the cost to heat the home to decline by $14.83 per month, regardless of the outside temperature or the age of the furnace.

b3 = the regression coefficient for the age of the furnace (X3), which is +6.10. The coefficient is positive and shows a positive correlation between heating cost and the age of the furnace: as the age of the furnace goes up, the cost to heat the home increases. Specifically, for each additional year older the furnace is, we expect the cost to increase by $6.10 per month.

Applying the Model for Estimation: What is the estimated heating cost for a home if the mean outside temperature is 30 degrees, there are 5 inches of insulation in the attic, and the furnace is 10 years old?

Multiple regression equations
Prediction line: Y’ = b1X1 + b2X2 + b3X3 + a
Very often we want to select students or employees who have the highest probability of success in our school or company. Andy is an administrator at a paralegal program and he wants to predict the Grade Point Average (GPA) for the incoming class. He thinks these independent variables will be helpful in predicting GPA: High School GPA (X1), SAT – Verbal (X2), and SAT – Mathematical (X3). Andy completes a multiple regression analysis and comes up with this regression equation:
Y’ = 1.2X1 + .00163X2 - .00194X3 - .411
Y’ = 1.2 (HS GPA) + .00163 (SAT Verbal) - .00194 (SAT Math) - .411

Here comes Victoria; her scores are as follows: High School GPA = 3.81, SAT Verbal = 500, SAT Mathematical = 600. What would we predict her GPA to be in the paralegal program?
Prediction line: Y’ = b1X1 + b2X2 + b3X3 + a
Y’ = 1.2 (3.81) + .00163 (500) - .00194 (600) - .411
Y’ = 4.572 + .815 - 1.164 - .411 = 3.812
We predict Victoria will have a GPA of 3.812.
Now predict Victor's GPA; his scores are as follows: High School GPA = 2.63, SAT Verbal = 469, SAT Mathematical = 440.
Y’ = 1.2 (2.63) + .00163 (469) - .00194 (440) - .411 = 2.66
We predict Victor will have a GPA of 2.66.
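A small Python sketch of this prediction line, using the coefficients as reconstructed above (treat them as approximate), applied to Victoria's and Victor's scores:

```python
def predict_gpa(hs_gpa, sat_verbal, sat_math):
    """Predicted paralegal-program GPA (coefficients reconstructed from the slides)."""
    return 1.2 * hs_gpa + 0.00163 * sat_verbal - 0.00194 * sat_math - 0.411

print(round(predict_gpa(3.81, 500, 600), 3))   # 3.812  (Victoria)
print(round(predict_gpa(2.63, 469, 440), 2))   # 2.66   (Victor)
```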

Correlation review (clicker questions): heating cost versus average temperature, heating cost versus insulation, and heating cost versus age of furnace, each reported with its correlation r(18).

Prediction line for heating cost, using the coefficients interpreted above (the intercept, 427.194, is taken from the textbook example these slides follow):
Y’ = 427.194 - 4.583X1 - 14.831X2 + 6.101X3
(X1 = mean outside temperature, X2 = inches of attic insulation, X3 = age of furnace)

For a home where the mean outside temperature is 30 degrees, there are 5 inches of insulation, and the furnace is 10 years old:
Y’ = 427.194 - 4.583(30) - 14.831(5) + 6.101(10)
Y’ = 427.194 - 137.49 - 74.155 + 61.01 = 276.56
The estimated heating cost is about $276.56 per month.
Now calculate the predicted heating cost using a new value for the age of the furnace, or use the regression coefficient for the furnace ($6.10) to estimate the change.

Calculate the predicted heating cost using the new value for the age of the furnace, or use the regression coefficient for the furnace ($6.10) to estimate the change:
Age 10: Y’ = 427.194 - 4.583(30) - 14.831(5) + 6.101(10) = 276.56, about $276.56
Age 11: Y’ = 427.194 - 4.583(30) - 14.831(5) + 6.101(11) = 282.66, about $282.66
These two homes differ by only one year of furnace age, yet the predicted heating cost changed by $282.66 - $276.56 = $6.10.
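The same comparison in code, using the equation as reconstructed above (again, the intercept is an assumption carried over from the textbook example): a one-year difference in furnace age changes the prediction by exactly the $6.10 coefficient.

```python
def predict_heating_cost(temp, insulation, furnace_age):
    """Predicted monthly heating cost in dollars (reconstructed equation)."""
    return 427.194 - 4.583 * temp - 14.831 * insulation + 6.101 * furnace_age

ten = predict_heating_cost(30, 5, 10)
eleven = predict_heating_cost(30, 5, 11)
print(round(ten, 2), round(eleven, 2), round(eleven - ten, 2))  # 276.56 282.66 6.1
```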

Correlation review (clicker questions): High School GPA versus program GPA, r(7) = 0.50; SAT (Verbal) versus GPA and SAT (Mathematical) versus GPA, each reported with its own r(7).

Clicker review of these three predictors – answers revealed step by step: No, Yes, No; High School GPA is highlighted.

Prediction line: Y’ = 1.2X1 + .00163X2 - .00194X3 - .411
What GPA would we predict for an applicant with SAT scores of 460 and 430 and a High School GPA of 2.8?
What GPA would we predict for an applicant with the same SAT scores (460 and 430) and a High School GPA of 3.8?

Yes – use the regression coefficient for the HS GPA (1.2) to estimate the change: the high school GPAs differ by 3.8 - 2.8 = 1.0, so the predicted GPAs differ by (1.2)(1.0) = 1.2.
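The same shortcut in code, reusing the hypothetical predict_gpa function from the earlier sketch: with the SAT scores held fixed, raising the high school GPA by 1.0 raises the prediction by the 1.2 coefficient.

```python
# SAT values are held fixed; which one is verbal vs. mathematical
# does not affect the difference between the two predictions.
low = predict_gpa(2.8, 460, 430)
high = predict_gpa(3.8, 460, 430)
print(round(high - low, 2))   # 1.2
```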

Today we will be reviewing for the test using clicker questions. Please note these will not appear on the class website.

Just one quick favor… please take just a minute to fill these out (the course evaluations).

I'll be sitting outside. Thank you for a wonderful semester, and good luck with your studies! See you at the final exam.