Multiple Regression Example


Multiple Regression Example A hospital administrator wished to study the relation between patient satisfaction (Y) and the patient’s age (X1), severity of illness (X2), and anxiety level (X3). The administrator randomly selected 23 patients and collected the following data, where larger values of Y, X2, and X3 are associated, respectively, with more satisfaction, greater severity of illness, and more anxiety. The data are of the form (X1, X2, X3, Y).
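A first-order model for this setup can be fit by least squares. Below is a minimal sketch; the numbers are simulated stand-ins for illustration, not the actual patient satisfaction data from the example.

```python
import numpy as np

# Sketch of fitting Y = b0 + b1*X1 + b2*X2 + b3*X3 + error by least squares.
# All data values here are simulated (assumed), not the study's measurements.
rng = np.random.default_rng(0)
n = 23
X1 = rng.uniform(25, 75, n)      # age
X2 = rng.uniform(40, 60, n)      # severity of illness
X3 = rng.uniform(1.8, 2.9, n)    # anxiety level
Y = 150 - 1.1 * X1 - 0.4 * X2 - 13 * X3 + rng.normal(0, 8, n)

X = np.column_stack([np.ones(n), X1, X2, X3])   # design matrix with intercept
beta, *_ = np.linalg.lstsq(X, Y, rcond=None)

resid = Y - X @ beta
r2 = 1 - resid @ resid / np.sum((Y - Y.mean()) ** 2)
print("coefficients:", np.round(beta, 3))
print("R^2:", round(float(r2), 3))
```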

Backward Elimination
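One common variant of backward elimination can be sketched as follows: start with all predictors, compute the partial F statistic for dropping each one, and remove the predictor with the smallest partial F if it falls below a threshold F_out; otherwise stop. The data and the F_out value below are assumptions for illustration, not taken from the slides.

```python
import numpy as np

# Backward-elimination sketch on simulated data: only the first two of four
# candidate predictors truly affect y, so the procedure should retain them.
rng = np.random.default_rng(1)
n = 40
Xp = rng.normal(size=(n, 4))                 # 4 candidate predictors
y = 2 + 3 * Xp[:, 0] + 1.5 * Xp[:, 1] + rng.normal(0, 1, n)

def sse(X, y):
    b, *_ = np.linalg.lstsq(X, y, rcond=None)
    r = y - X @ b
    return r @ r

def backward_eliminate(Xp, y, f_out=4.0):
    keep = list(range(Xp.shape[1]))
    while keep:
        X = np.column_stack([np.ones(len(y)), Xp[:, keep]])
        sse_full, df_err = sse(X, y), len(y) - X.shape[1]
        # partial F for removing each predictor still in the model
        fstats = []
        for j in range(len(keep)):
            cols = keep[:j] + keep[j + 1:]
            Xr = np.column_stack([np.ones(len(y)), Xp[:, cols]])
            fstats.append((sse(Xr, y) - sse_full) / (sse_full / df_err))
        worst = int(np.argmin(fstats))
        if fstats[worst] < f_out:
            keep.pop(worst)              # cheapest predictor to drop
        else:
            break                        # everything remaining is significant
    return keep

kept = backward_eliminate(Xp, y)
print("retained predictors:", kept)
```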

Forward Selection
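Forward selection works in the opposite direction: start from the intercept-only model and, at each step, add the candidate predictor with the largest partial F, stopping when no candidate exceeds a threshold F_in. Again, the data and F_in below are assumed for illustration.

```python
import numpy as np

# Forward-selection sketch on simulated data: only X3 and X4 (indices 2, 3)
# truly affect y, so the procedure should enter them first and then stop.
rng = np.random.default_rng(2)
n = 40
Xp = rng.normal(size=(n, 4))
y = 1 + 2.5 * Xp[:, 2] - 2.0 * Xp[:, 3] + rng.normal(0, 1, n)

def sse(X, y):
    b, *_ = np.linalg.lstsq(X, y, rcond=None)
    r = y - X @ b
    return r @ r

def forward_select(Xp, y, f_in=4.0):
    def design(cols):                     # intercept plus the chosen columns
        X0 = np.ones((len(y), 1))
        return np.hstack([X0, Xp[:, cols]]) if cols else X0

    chosen, remaining = [], list(range(Xp.shape[1]))
    while remaining:
        sse_cur = sse(design(chosen), y)
        fstats = []
        for c in remaining:
            Xc = design(chosen + [c])
            sse_new = sse(Xc, y)
            fstats.append((sse_cur - sse_new) / (sse_new / (len(y) - Xc.shape[1])))
        best = int(np.argmax(fstats))
        if fstats[best] >= f_in:
            chosen.append(remaining.pop(best))
        else:
            break                         # no remaining candidate clears F_in
    return chosen

chosen = forward_select(Xp, y)
print("entered predictors (in order):", chosen)
```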

Reduced Sets of βj’s
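A reduced model fixes some of the βj's at zero; comparing it with the full model is the general linear test based on the extra sum of squares. A sketch on simulated data (an assumption, not the slides' data):

```python
import numpy as np

# General linear test for a reduced set of beta_j's:
# H0: beta2 = beta3 = 0, tested with
#   F = ((SSE_reduced - SSE_full) / q) / (SSE_full / (n - p)),
# compared against an F(q, n - p) critical value.
rng = np.random.default_rng(3)
n = 30
X1, X2, X3 = rng.normal(size=(3, n))
y = 5 + 2 * X1 + rng.normal(0, 1, n)         # X2 and X3 truly have no effect

def sse(X, y):
    b, *_ = np.linalg.lstsq(X, y, rcond=None)
    r = y - X @ b
    return r @ r

full = np.column_stack([np.ones(n), X1, X2, X3])
reduced = np.column_stack([np.ones(n), X1])  # beta2 = beta3 = 0 imposed
q = full.shape[1] - reduced.shape[1]         # number of betas set to zero
F = ((sse(reduced, y) - sse(full, y)) / q) / (sse(full, y) / (n - full.shape[1]))
print("F =", round(float(F), 3))             # compare with F(2, 26) critical value
```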

All “Possible” Models; X1, X2 Only
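With only X1 and X2 as candidates, the "all possible" regressions approach simply fits every subset, including the intercept-only model, and compares criteria such as R². A sketch with simulated, correlated predictors (assumed, not the example's data):

```python
import numpy as np
from itertools import combinations

# Fit every subset of {X1, X2} and report R^2 for each model.
rng = np.random.default_rng(4)
n = 25
X1 = rng.normal(size=n)
X2 = 0.7 * X1 + rng.normal(0, 0.5, n)        # X1 and X2 correlated
y = 3 + 2 * X1 + 1 * X2 + rng.normal(0, 1, n)
preds = {"X1": X1, "X2": X2}

def r2(cols):
    X = np.ones((n, 1))                       # intercept-only design
    if cols:
        X = np.hstack([X, np.column_stack([preds[c] for c in cols])])
    b, *_ = np.linalg.lstsq(X, y, rcond=None)
    r = y - X @ b
    return 1 - r @ r / np.sum((y - y.mean()) ** 2)

results = {cols: round(float(r2(list(cols))), 3)
           for k in range(3) for cols in combinations(preds, k)}
print(results)   # R^2 for (), (X1,), (X2,), (X1, X2)
```

R² can only increase as predictors are added, which is why criteria that penalize model size (adjusted R², Cp) are usually compared alongside it.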

Multicollinearity Example The following data are a portion of those from a study of the relation of the amount of body fat (Y) to the predictor variables (X1) triceps skinfold thickness, (X2) thigh circumference, and (X3) midarm circumference, based on a sample of 20 healthy females.

The least squares regression coefficients for X1 and X2 under various models are given in the table.

Hence, the regression coefficient of one variable depends on which other variables are included in the model and which are not. Therefore, a regression coefficient does not reflect any inherent effect of a particular predictor variable on the response variable; it reflects only a partial effect, given whatever other variables are in the model.
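A minimal simulation of this point: when predictors are highly correlated, the fitted coefficient of X1 changes markedly depending on whether X2 is in the model. The data below mimic two nearly collinear body measurements but are simulated, not the study's measurements.

```python
import numpy as np

# Two nearly collinear predictors: X2 is almost a linear function of X1.
rng = np.random.default_rng(5)
n = 20
X1 = rng.normal(25, 5, n)                    # e.g. triceps skinfold (simulated)
X2 = 2 * X1 + rng.normal(0, 1.5, n)          # nearly collinear with X1
y = 10 + 0.5 * X1 + 0.3 * X2 + rng.normal(0, 2, n)

def coef_on_x1(X):
    b, *_ = np.linalg.lstsq(X, y, rcond=None)
    return float(b[1])                       # X1's slope (column after intercept)

# X1's coefficient with and without X2 in the model
b1_alone = coef_on_x1(np.column_stack([np.ones(n), X1]))
b1_with_x2 = coef_on_x1(np.column_stack([np.ones(n), X1, X2]))
print("b1, model with X1 only:  ", round(b1_alone, 3))
print("b1, model with X1 and X2:", round(b1_with_x2, 3))
```

With X2 excluded, X1's slope absorbs part of X2's effect (here roughly 0.5 + 0.3·2 ≈ 1.1); with X2 included, the partial slope of X1 is estimated with much larger variance because of the near-collinearity.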