Regression, Part 3 of 3: Multiple Regression. Topics: Overview, Examples, Hypothesis Tests, MR ANOVA Table, Interpretation, Indicator Variables, Assumptions, Homework.


Regression, Part 3 of 3: Multiple Regression. Topics: Overview, Examples, Hypothesis Tests, MR ANOVA Table, Interpretation, Indicator Variables, Assumptions, Homework. Up next: Model Building and MR Interaction.

Multiple Regression Overview. Multiple regression is the study of how several independent variables are related or associated with one dependent variable. For example applications, review the chapter problem scenarios. The independent variables may be interval/continuous or categorical/dummy/indicator, and both types may be included in the MR model simultaneously. The fitted model is used to predict or estimate the value of the dependent variable from the independent variables.
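To make the idea concrete, here is a minimal sketch of fitting a multiple regression by ordinary least squares. It solves the normal equations (X'X)b = X'y directly with Gaussian elimination; in practice you would use NCSS, Excel, or a statistics library. The data values are made up for illustration.

```python
# Minimal OLS sketch: fit y = b0 + b1*x1 + b2*x2 by solving the
# normal equations (X'X) b = X'y. Illustration only; the data are
# hypothetical, not from the textbook examples.

def ols(X, y):
    """X is a list of predictor rows WITHOUT the intercept column;
    a column of 1s is prepended here. Returns [b0, b1, ..., bk]."""
    rows = [[1.0] + list(r) for r in X]      # prepend intercept column
    p = len(rows[0])
    # Build X'X and X'y
    XtX = [[sum(r[i] * r[j] for r in rows) for j in range(p)] for i in range(p)]
    Xty = [sum(r[i] * yi for r, yi in zip(rows, y)) for i in range(p)]
    # Gauss-Jordan elimination with partial pivoting on [X'X | X'y]
    A = [XtX[i] + [Xty[i]] for i in range(p)]
    for c in range(p):
        piv = max(range(c, p), key=lambda r: abs(A[r][c]))
        A[c], A[piv] = A[piv], A[c]
        for r in range(p):
            if r != c:
                f = A[r][c] / A[c][c]
                A[r] = [a - f * b for a, b in zip(A[r], A[c])]
    return [A[i][p] / A[i][i] for i in range(p)]

# Two interval predictors, six observations (illustrative values).
X = [(1, 2), (2, 1), (3, 4), (4, 3), (5, 6), (6, 5)]
y = [3.1, 3.9, 7.2, 7.8, 11.1, 11.9]
b = ols(X, y)
print(b)  # [b0, b1, b2]
```

The returned coefficients are the least-squares estimates; plugging a new (x1, x2) pair into b0 + b1*x1 + b2*x2 gives a point prediction.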

Example Application. From Keller, Statistics for Management & Economics, 8e: a multiple regression model with four (4) continuous/interval independent variables.

Interpreting the Regression Coefficients Testing the Regression Coefficients

ANOVA Table for Basic Multiple Regression

Source      df       SS    MS                    F
Regression  k        SSR   MSR = SSR/k           F* = MSR/MSE
Error       n-k-1    SSE   MSE = SSE/(n-k-1)     Fcrit = F(alpha; k, n-k-1)
Total       n-1      SST
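The table's arithmetic can be sketched directly: given the observed y values and the fitted values from an estimated model with k predictors, the sums of squares and the F statistic follow mechanically. The numbers below are illustrative, not from the Keller example.

```python
# Sketch of the MR ANOVA computations for k predictors and n rows.
# y is observed; yhat is assumed to come from the fitted MR model.

y    = [4.0, 6.0, 7.0, 9.0, 12.0, 13.0]        # illustrative data
yhat = [4.5, 5.5, 7.5, 9.0, 11.5, 13.0]        # fitted values (assumed)
n, k = len(y), 2                                # k = number of predictors

ybar = sum(y) / n
SST = sum((yi - ybar) ** 2 for yi in y)                 # total SS
SSE = sum((yi - fi) ** 2 for yi, fi in zip(y, yhat))    # error SS
SSR = SST - SSE                                         # regression SS

MSR = SSR / k            # regression mean square
MSE = SSE / (n - k - 1)  # error mean square
F   = MSR / MSE          # compare to Fcrit = F(alpha; k, n-k-1)
print(SSR, SSE, SST, F)
```

Reject the overall null hypothesis (all slopes zero) when F* exceeds the critical F value with k and n-k-1 degrees of freedom.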

Indicator/Dummy/Categorical Independent Variables in Multiple Regression. Categorical variables represent data that can be organized into groups; recall that categorical variables may be either nominal or ordinal. With categorical data we can count (in whole numbers) the number of observations that fall in each group or category. Examples: ethnicity, gender, class grade, classification, rank, yes/no responses, bond rating, number of automobiles a family owns, income group (low/medium/high), political affiliation, age bracket. For each categorical family, we include the number of categories minus one indicator variables in the MR model.
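The "categories minus one" rule can be sketched as follows: a three-level program variable (BA, BSc, BBA) becomes 3 - 1 = 2 indicator columns, with one level (here BA, chosen arbitrarily) serving as the base category that gets all zeros.

```python
# Sketch: encode a categorical variable with c levels as c - 1
# indicator (0/1) columns. The base category is coded as all zeros.

def dummy_encode(values, base):
    levels = sorted(set(values) - {base})   # the non-base categories
    return [[1 if v == lvl else 0 for lvl in levels] for v in values]

programs = ["BA", "BSc", "BBA", "BA", "BBA"]
print(dummy_encode(programs, base="BA"))
# BA rows -> [0, 0]; each other level gets its own indicator column
```

Including all c columns plus an intercept would make the model's columns linearly dependent (the dummy-variable trap), which is why one level is always left out as the base.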

From Keller, Statistics for Management & Economics, 8e. Example: Many university students obtain summer jobs. A business analyst wants to determine whether students in different degree programs earn different amounts. A random sample of 5 students in the BA, BSc, and BBA programs provided earnings data (in thousands) from their summer jobs. Can the analyst infer that students in different degree programs earn significantly different amounts in their summer jobs? Compare the group means.

Interpreting the Results

Exercise 14.49.

Interpreting the indicator coefficients (M1 = traditional training, M2 = CD-ROM training; web-based is the base group). The M2 coefficient is not significant, so it is not appropriate to interpret it. For M1: for a given proficiency score, a trainee taught by the traditional method has an estimated mean end-of-training score 22.3 points below someone trained by the web-based method. Plugging M1 = 0 and M2 = 0 into the estimated MR equation gives the model for the base group (web-trained): it estimates final score for a given proficiency score for someone web-trained. Plugging M1 = 1 and M2 = 0 gives the model for the traditional-trained group: its final-score estimates are 22.3 points lower than the base (web) at any proficiency score. Plugging M1 = 0 and M2 = 1 gives the model for the CD-ROM-trained group: its final-score estimates differ from the base by the M2 coefficient; however, since that coefficient was not significant, it is not appropriate to interpret or use this third model.
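The plug-in logic above can be sketched numerically. The M1 coefficient of -22.3 matches the value discussed in the slide; the intercept, proficiency slope, and M2 coefficient below are hypothetical stand-ins, since the slide does not report them.

```python
# Sketch of recovering per-group equations from an estimated model
#   score = b0 + b1*profic + b2*M1 + b3*M2
# b2 = -22.3 matches the coefficient discussed above; b0, b1, b3
# are made-up illustration values.

b0, b1, b2, b3 = 40.0, 0.8, -22.3, 5.0   # illustrative estimates

def score(profic, M1, M2):
    return b0 + b1 * profic + b2 * M1 + b3 * M2

p = 70
web         = score(p, 0, 0)   # base group: M1 = M2 = 0
traditional = score(p, 1, 0)   # intercept shifts by b2 = -22.3
cdrom       = score(p, 0, 1)   # intercept shifts by b3 (not significant here)
print(web, traditional, cdrom)
```

At any fixed proficiency score, each dummy simply shifts the intercept: the traditional estimate is always exactly b2 below the web estimate.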

To obtain a 95% confidence interval for a point estimate:
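The interval itself is just point estimate plus or minus a t multiple of the standard error. The sketch below assumes the point estimate and its standard error have already been obtained from the software (NCSS reports the interval directly); the values are illustrative.

```python
# Sketch of a 95% CI for a point estimate: yhat +/- t * se.
# yhat and se are assumed given by the regression software; the
# critical t value (df = n - k - 1) comes from a t table.

yhat   = 73.7     # illustrative point estimate from the model
se     = 2.4      # illustrative standard error of the estimate
t_crit = 2.056    # t(0.025) with df = 26, from a t table

lower, upper = yhat - t_crit * se, yhat + t_crit * se
print(lower, upper)
```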

Assumptions. Certain assumptions must be met, or nearly met, for the method of least squares and its results to be valid: e ~ iid N(0, σ²). Remember LINE:
L – Linearity
I – Independence of errors
N – Normality
E – Equal variance

Linearity. Check linearity only for the interval/continuous variable(s).

Error Variable Normality. Run normality tests, plot a histogram of the residuals/errors, and plot the residuals vs. row number. With 30 rows of data we can obtain Y-hat and Y for each row, and from these obtain the errors (residuals).

Constant/Equal Error Variance. Plot the residuals vs. y-hat. At every level of the predicted score, many errors should be near 0, some errors positive (a bit above 0), some errors negative (a bit below 0), and just a few errors strongly negative or strongly positive. This distribution or spread of errors should be about the same at all levels of y-hat, i.e., across all rows.
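A residuals-vs-y-hat plot is the usual diagnostic, but the idea can be sketched without a plot: split the residuals into low-fitted and high-fitted halves and compare the spread in each. The residuals below are illustrative, constructed to show roughly equal spread.

```python
# Rough sketch of an equal-variance check: compare residual spread
# in the low vs. high halves of the fitted values. A ratio far from
# 1 hints at heteroscedasticity. (Illustrative data; a residual plot
# is the standard diagnostic.)
import statistics

yhat  = [2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0, 9.0]
resid = [0.2, -0.3, 0.25, -0.2, 0.3, -0.25, 0.2, -0.3]

pairs = sorted(zip(yhat, resid))
half  = len(pairs) // 2
low   = [r for _, r in pairs[:half]]
high  = [r for _, r in pairs[half:]]
ratio = statistics.stdev(high) / statistics.stdev(low)
print(ratio)   # near 1 suggests roughly equal spread
```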

Independent Errors. Use the Durbin-Watson test (page 494+) and plot the residuals vs. row order. The errors should be independent: neither positively autocorrelated (successive residuals tending to keep the same sign) nor negatively autocorrelated (successive residuals tending to flip sign). Also plot the errors vs. each independent variable; one plot may look okay while another does not. The accuracy of the model should be the same at any proficiency level; in a "not okay" plot, higher proficiency scores produce greater positive errors, so the model gets increasingly worse.
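The Durbin-Watson statistic mentioned above is simple to compute from the residual series: it is the sum of squared successive differences divided by the sum of squared residuals. Values near 2 suggest independence; near 0, positive autocorrelation; near 4, negative autocorrelation. The residuals below are illustrative.

```python
# Sketch of the Durbin-Watson statistic on a residual series.
# DW ~ 2: independent; DW -> 0: positive autocorrelation;
# DW -> 4: negative autocorrelation.

def durbin_watson(e):
    num = sum((e[t] - e[t - 1]) ** 2 for t in range(1, len(e)))
    return num / sum(r ** 2 for r in e)

residuals = [0.5, -0.3, 0.2, -0.4, 0.1, -0.2]   # illustrative residuals
print(durbin_watson(residuals))
```

Because these illustrative residuals alternate in sign, the statistic comes out above 2, leaning toward negative autocorrelation.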

Multicollinearity. See Model Building, page 584+.
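As a quick screen for multicollinearity before the fuller treatment in the Model Building chapter, you can inspect pairwise correlations among the independent variables: |r| close to 1 between two predictors signals collinearity (variance inflation factors are the more complete diagnostic). The data below are illustrative, with x2 deliberately constructed as nearly a multiple of x1.

```python
# Sketch of a quick multicollinearity screen: Pearson correlation
# between two predictors. Illustrative data; x2 ~ 2 * x1, so the
# two predictors are nearly collinear.
from math import sqrt

def pearson_r(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / sqrt(sxx * syy)

x1 = [1, 2, 3, 4, 5]
x2 = [2.1, 3.9, 6.2, 8.0, 9.9]   # nearly a multiple of x1: collinear
print(pearson_r(x1, x2))         # very close to 1
```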

Homework (#5): Multiple Regression
a.
b.
c. Use the MR model to predict the cost for a restaurant with a summated rating of 60 that is located in the city. Use NCSS to obtain the CI.
d. Perform a basic MR assumption coherence/violation analysis.
e.
f. Interpret the regression equation and regression coefficients.