Chapter 9 Analyzing Data Multiple Variables

Basic Directions
Review page 180 for basic directions on how to proceed with your analysis. That page provides statistical decision steps based on the level of measurement of your independent and dependent variables.

Elaboration ‘Models’
Once an association has been found to be statistically significant, consider controlling for variables that would serve as plausible alternative explanations. Run chi-square or other comparable tests within each category of the control variable.

Partialling
When one control variable is introduced, the result is called first-order partialling; add a second control for second-order partialling, and so on. The original bivariate relationship is called the zero-order relationship. Partialling is good for checking whether a pattern replicates across subgroups. In Minitab, you can use Stat > Tables and enter the multiple variables of interest. Don't use too many controls; keep the analysis clean.
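The zero-order versus first-order logic can be sketched in Python (using SciPy rather than the Minitab menus named above). The table counts below are hypothetical, constructed so that an apparently strong zero-order association vanishes within each level of the control variable Z:

```python
import numpy as np
from scipy.stats import chi2_contingency

# Zero-order table: X (rows) by Y (columns), ignoring the control variable Z
zero_order = np.array([[68, 32],
                       [32, 68]])

# First-order partial tables: the same X-by-Y table within each level of Z.
# Their cell counts sum to the zero-order table above.
partials = {
    "Z = 0": np.array([[64, 16],
                       [16,  4]]),
    "Z = 1": np.array([[ 4, 16],
                       [16, 64]]),
}

chi2_zero, p_zero, _, _ = chi2_contingency(zero_order, correction=False)
print(f"zero-order: chi-square = {chi2_zero:.2f}, p = {p_zero:.4f}")

for label, table in partials.items():
    chi2_part, p_part, _, _ = chi2_contingency(table, correction=False)
    print(f"{label}: chi-square = {chi2_part:.2f}, p = {p_part:.4f}")
```

The zero-order chi-square is large, while both first-order chi-squares are zero: the X-Y pattern does not replicate once Z is held constant.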

Spurious Relationships
If you introduce a third variable (a control) and the relationship that existed in the bivariate setting becomes non-significant, or even just noticeably weaker, then the original relationship is spurious. Consider the ice cream and murder example: both rise in hot weather, so temperature, not ice cream, explains the association.

Specification
Specification occurs when the control variable causes only some of the categories of the test variable to become non-significant or weakened. It is called specification because it specifies the conditions under which the original relationship holds.

Suppressing Relationships
If there is no relationship, or only a very weak one, introduce a control variable to see whether the weak relationship persists. It could be that one of the variables involved is a suppressor variable: a variable that was keeping the original relationship weak. Within this structure you can also identify intervening variables, which transmit the effect of the independent variable to the dependent variable.

Partial Correlations
When a correlation exists between two variables, X and Y, the correlation may be explained by a third variable that is correlated with both X and Y. A partial correlation controls for the effect of that third variable when examining the correlation between X and Y. If the correlation between X and Y is reduced once the third variable is controlled, the third variable accounts for at least part of the original association.
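A minimal NumPy sketch of this idea, using simulated data in which a hypothetical third variable Z drives both X and Y; the first-order partial correlation is computed with the standard formula:

```python
import numpy as np

rng = np.random.default_rng(42)

# Simulated data: the X-Y association is driven entirely by Z
n = 500
z = rng.normal(size=n)
x = z + 0.5 * rng.normal(size=n)
y = z + 0.5 * rng.normal(size=n)

r_xy = np.corrcoef(x, y)[0, 1]
r_xz = np.corrcoef(x, z)[0, 1]
r_yz = np.corrcoef(y, z)[0, 1]

# First-order partial correlation of X and Y, controlling for Z
r_xy_z = (r_xy - r_xz * r_yz) / np.sqrt((1 - r_xz**2) * (1 - r_yz**2))

print(f"zero-order r_xy = {r_xy:.3f}")   # sizeable: X and Y look related
print(f"partial r_xy.z  = {r_xy_z:.3f}")  # near zero once Z is controlled
```

The drop from the zero-order to the partial correlation is the signature of a relationship explained by the third variable.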

Two-Way ANOVA
ANOVA can be used for factorial designs: those that employ more than one IV (or factor). The factorial design is very popular in the social sciences. Its big advantage over single-variable designs is that it provides unique and relevant information about how variables interact, or combine, in the effect they have on the DV. A two-way factorial design tells us about two main effects and the interaction.

Two-Way ANOVA
The effects:
Treatment effect: a difference in population means.
Main effect: a difference in population means for a factor, collapsed over the levels of all other factors in the design.
Interaction: occurs when the effect of one factor is not the same at all levels of another factor.
In Minitab, select: Stat > ANOVA > Two-Way ANOVA
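The sums of squares behind these effects can be sketched in NumPy for a balanced two-way design (Minitab's menu does this for you). The 2x2 data below are hypothetical, chosen so the interaction is exactly zero:

```python
import numpy as np

# Balanced 2x2 factorial: data[i, j] holds the n = 2 observations for
# level i of factor A and level j of factor B (hypothetical scores)
data = np.array([[[1.0, 3.0], [5.0, 7.0]],
                 [[3.0, 5.0], [7.0, 9.0]]])

a_levels, b_levels, n = data.shape
grand = data.mean()
cell = data.mean(axis=2)          # cell means
a_mean = data.mean(axis=(1, 2))   # marginal means for factor A
b_mean = data.mean(axis=(0, 2))   # marginal means for factor B

# Sums of squares for the two main effects and the interaction
ss_a = n * b_levels * np.sum((a_mean - grand) ** 2)
ss_b = n * a_levels * np.sum((b_mean - grand) ** 2)
ss_ab = n * np.sum((cell - a_mean[:, None] - b_mean[None, :] + grand) ** 2)
ss_within = np.sum((data - cell[:, :, None]) ** 2)

df_within = a_levels * b_levels * (n - 1)
ms_within = ss_within / df_within

f_a = (ss_a / (a_levels - 1)) / ms_within
f_b = (ss_b / (b_levels - 1)) / ms_within
f_ab = (ss_ab / ((a_levels - 1) * (b_levels - 1))) / ms_within
print(f"F(A) = {f_a:.2f}, F(B) = {f_b:.2f}, F(AxB) = {f_ab:.2f}")
```

Each main effect compares marginal means to the grand mean; the interaction term measures how far the cell means depart from a purely additive pattern.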

Multiple R
A correlation matrix gives the correlation coefficient (r) for every pair of variables, but multiple correlation summarizes how well a set of predictors works together. The multiple correlation coefficient, R, is the correlation between the observed values of Y and the predicted values of Y. R is always positive and takes a value between zero and one. The direction of the relationship between each independent variable and the dependent variable can be read from the sign, positive or negative, of its regression weight.

Multiple R
The interpretation of R is similar to that of the bivariate correlation coefficient: the closer R is to one, the stronger the linear relationship between the independent variables and the dependent variable.
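A short NumPy sketch of this definition on hypothetical simulated data; it also checks that R computed as a correlation matches R computed from the explained share of variance:

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical data: Y depends on two predictors plus noise
n = 200
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 2.0 + 3.0 * x1 - 1.0 * x2 + rng.normal(size=n)

# Fit predicted values by ordinary least squares
design = np.column_stack([np.ones(n), x1, x2])
coef, *_ = np.linalg.lstsq(design, y, rcond=None)
y_hat = design @ coef

# Multiple R is the correlation between observed and predicted Y ...
R = np.corrcoef(y, y_hat)[0, 1]

# ... and equals the square root of the proportion of variance explained
sse = np.sum((y - y_hat) ** 2)
sst = np.sum((y - y.mean()) ** 2)
R_from_ss = np.sqrt(1 - sse / sst)

print(f"R = {R:.3f} (via correlation), {R_from_ss:.3f} (via variance explained)")
```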

Multiple Regression
Multiple regression finds the linear equation that best predicts the value of one of the variables (the dependent variable) from the others.

Multiple Regression
Y = a + bX + cZ + e
The coefficients (a, b, and c) are chosen so that the sum of squared errors is minimized. This estimation technique is called least squares, or ordinary least squares (OLS).
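A minimal OLS sketch in NumPy: the X and Z values are hypothetical, and Y is generated without noise so that least squares recovers a, b, and c exactly.

```python
import numpy as np

# Hypothetical noiseless data generated from Y = 1 + 2X - 0.5Z
X = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
Z = np.array([1.0, 0.0, 2.0, 1.0, 3.0, 2.0])
Y = 1.0 + 2.0 * X - 0.5 * Z

# Design matrix with a leading column of ones for the intercept a
design = np.column_stack([np.ones(len(X)), X, Z])

# Ordinary least squares: choose (a, b, c) to minimize the sum of squared errors
coef, *_ = np.linalg.lstsq(design, Y, rcond=None)
a, b, c = coef
print(f"a = {a:.2f}, b = {b:.2f}, c = {c:.2f}")
```

With real (noisy) data the recovered coefficients would only approximate the true values, and the residual e would be nonzero.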

Multiple Regression
The predictors in a regression equation have no inherent order; one cannot be said to enter before another. In interpreting a regression equation, it generally makes no scientific sense to speak of the variance due to a given predictor: such variance measures depend on the order of entry in stepwise regression and on the correlations among the predictors. The semi-partial correlation, or unique variance, therefore has little interpretive utility.

Multiple Regression
The standard test of a given regression coefficient is to determine whether the multiple correlation declines significantly when that predictor is removed from the equation while the other predictors remain. This test is given by the t or F statistic reported next to the coefficient.
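This removal test can be sketched in NumPy on hypothetical data. For a single coefficient, the F statistic from comparing the full and reduced models equals the square of that coefficient's t statistic:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical data with two predictors and noise
n = 60
x = rng.normal(size=n)
z = rng.normal(size=n)
y = 1.0 + 2.0 * x + 0.5 * z + rng.normal(size=n)

def sse(design, y):
    """Residual sum of squares from an OLS fit."""
    coef, *_ = np.linalg.lstsq(design, y, rcond=None)
    return np.sum((y - design @ coef) ** 2)

full = np.column_stack([np.ones(n), x, z])
reduced = np.column_stack([np.ones(n), x])   # drop the predictor z

# F-test: does removing z significantly worsen the fit?
df_full = n - full.shape[1]
F = ((sse(reduced, y) - sse(full, y)) / 1) / (sse(full, y) / df_full)

# Equivalent t statistic for z's coefficient: b divided by its standard error
coef, *_ = np.linalg.lstsq(full, y, rcond=None)
mse = sse(full, y) / df_full
se = np.sqrt(mse * np.diag(np.linalg.inv(full.T @ full)))
t = coef[2] / se[2]

print(f"F = {F:.3f}, t^2 = {t**2:.3f}")  # equal for a single coefficient
```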