Chapter 18 Four Multivariate Techniques Angela Gillis & Winston Jackson Nursing Research: Methods & Interpretation.

Multiple Regression Multiple regression is used when we wish to examine the impact of several independent variables on a dependent variable. It is typically used when you have a ratio-level dependent variable and, preferably, ratio-level independent variables. There are methods, however, for incorporating independent variables measured at the nominal or ordinal levels.

Multiple Regression Cont. Multiple regression is a powerful tool because it allows the researcher to: –estimate the relative importance of independent variables in predicting variation in a dependent variable –identify an equation describing the relation between the independent and dependent variables

Multiple Regression Cont. Elements in the equation tell us how important each factor is in predicting the dependent variable. Recall from the correlation analysis (Chapter 11) the formula Y = a + bX. Multiple regression extends the equation to: Y = a + b₁X₁ + b₂X₂ + … + bₖXₖ
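The extended equation can be estimated by ordinary least squares. A minimal sketch, using numpy's least-squares solver and hypothetical data (the variable names and values are invented for illustration):

```python
import numpy as np

# Hypothetical data: predict an exam score (Y) from hours studied (X1)
# and hours slept (X2).
X1 = np.array([2.0, 4.0, 6.0, 8.0, 10.0])
X2 = np.array([8.0, 7.0, 6.0, 7.0, 5.0])
Y = np.array([55.0, 62.0, 71.0, 78.0, 84.0])

# Design matrix with a leading column of 1s for the constant a.
X = np.column_stack([np.ones_like(X1), X1, X2])

# Solve for [a, b1, b2] by ordinary least squares.
coef, *_ = np.linalg.lstsq(X, Y, rcond=None)
a, b1, b2 = coef
print(f"Y = {a:.2f} + {b1:.2f}*X1 + {b2:.2f}*X2")
```

In SPSS the same estimates come from the REGRESSION procedure; the sketch simply makes the arithmetic behind the a and b values visible.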

a This value represents the constant: the point where the regression line crosses the Y axis. b These coefficients represent the weightings for each of the independent variables.

Y = a + β₁X₁ + β₂X₂ + … + βₖXₖ β These values are known as beta weights. A beta weight is simply a standardized version of a b coefficient. Think of βs as Z-score versions of the b coefficients. Recall that Z scores standardize variables.

Y = a + β₁X₁ + β₂X₂ + … + βₖXₖ To compute the relative importance of the variables once we have the betas, we can use the following formula: % variance explained by each variable = (βᵢ / Σβ) × R² × 100
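A small worked sketch of this formula, using hypothetical b coefficients and standard deviations. The standardization step (β = b × SDx / SDy) and the use of absolute values when the betas differ in sign are standard practice but are assumptions here, not taken from the text:

```python
import numpy as np

# Hypothetical b coefficients and standard deviations for two predictors.
b = np.array([3.5, -2.0])     # unstandardized slopes
sd_x = np.array([2.0, 1.5])   # SDs of X1, X2
sd_y = 10.0                   # SD of Y
r_squared = 0.64              # assumed from the regression output

# Standardize: beta_i = b_i * (SD of X_i / SD of Y)
beta = b * sd_x / sd_y

# % variance explained by each variable = (beta_i / sum of betas) * R^2 * 100.
# Absolute values keep the shares sensible when betas differ in sign.
share = np.abs(beta) / np.abs(beta).sum() * r_squared * 100
print(dict(zip(["X1", "X2"], np.round(share, 1))))
```

Note that the shares sum to R² × 100, i.e. the variance the whole equation explains is apportioned among the predictors.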

Multiple Regression Cont. When you do your SPSS run, the program will produce both b and β values. The a value (labelled the Constant) will also be printed. R² will also be reported; it tells you how much of the variance in the dependent variable is explained by the equation.

Using Non-Ratio Variables Ordinal variables may be included in their raw (un-recoded) form, but remember that the equation will underestimate the relative importance of non-ratio variables. Nominal variables may be included by transforming them into “dummy variables.” Dummy variables are recoded to “presence/absence” variables.

Dummy Variables Create new variables to replace the nominal variable so that you have one fewer variable than there are categories in the original. For example, if you have a 3-category religion variable (Protestant, Catholic, Jewish), recode it into two new presence/absence variables. (See p. 566 of text.) Presence = 1; Absence = 0.
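The recoding can be sketched in a few lines of plain Python. The category labels and cases are hypothetical; the point is the "one fewer dummies than categories" rule, with the omitted category serving as the reference group:

```python
# Recode a 3-category religion variable into 2 dummy (presence/absence)
# variables, using the last category as the reference group.
categories = ["Protestant", "Catholic", "Jewish"]
religion = ["Catholic", "Protestant", "Jewish", "Catholic", "Protestant"]

# One fewer dummies than categories: Protestant and Catholic.
# A case scoring 0 on both is Jewish (the omitted reference category).
dummies = {
    cat: [1 if r == cat else 0 for r in religion]
    for cat in categories[:-1]
}
print(dummies)
# {'Protestant': [0, 1, 0, 0, 1], 'Catholic': [1, 0, 0, 1, 0]}
```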

Tips for Regression Analysis Ensure the independent variables are truly independent of the dependent variable, not alternative measures of it. Watch for highly correlated independent variables (multicollinearity). Either combine these into an index (if that makes sense) or simply select one of them.
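The multicollinearity check amounts to inspecting the correlations among the independents before running the regression. A sketch with hypothetical predictors and an assumed cutoff of |r| > 0.8 (rules of thumb vary):

```python
import numpy as np

# Two hypothetical independent variables that plausibly overlap.
income = np.array([30.0, 45.0, 50.0, 62.0, 70.0])
years_educ = np.array([10.0, 13.0, 14.0, 16.0, 18.0])

# Pearson r between the two independents.
r = np.corrcoef(income, years_educ)[0, 1]
if abs(r) > 0.8:
    print(f"r = {r:.2f}: collinear -- combine into an index or drop one")
```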

Tips Cont. Try to achieve ratio-level measurement. Use raw data: do not use recoded forms of ordinal or ratio variables. Use the Backward option when running the regression. Interpret weightings with care.

Tips Cont. Monitor the number of cases; watch out for analyses where N gets close to the number of variables. (Cases = total df + 1 on the table.) –Repeat the analysis eliminating the variables that were dropped early, keeping the last two or three before the final equation –Try the “Pairwise” solution for missing values –Try the “Means” solution, where missing values are set to the mean of the variable

Discriminant Analysis Very similar to regression analysis, but used when the researcher has a nominal dependent variable. Results in the calculation of discriminant coefficients, similar to a regression equation: D = B₀ + B₁X₁ + B₂X₂ + … + BₖXₖ

B₀ This is the constant. B₁ The coefficient for the 1st variable. To compute the “discriminant score,” multiply each coefficient by the observed value and sum (see Table 18.3, p. 572).
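The score computation itself is just the weighted sum. The coefficients and case values below are hypothetical stand-ins (the text's Table 18.3 is not reproduced here):

```python
# Compute a discriminant score D = B0 + B1*X1 + B2*X2 for one case,
# using hypothetical coefficients and observed values.
B0 = -4.2            # constant
B = [0.8, 1.5]       # coefficients for X1, X2
x = [3.0, 2.0]       # one case's observed values

# D = B0 + sum of (coefficient * observed value)
D = B0 + sum(b_i * x_i for b_i, x_i in zip(B, x))
print(round(D, 2))   # 1.2
```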

Discriminant Analysis Cont. Discriminant analysis assumes ratio-level independent variables (similar to regression), but, as in regression, dummy variables may be included. Both standardized and unstandardized coefficients are provided in the output. If you want to calculate relative contributions, use the standardized version.

Discriminant Analysis Cont. When discriminant analysis is run you will get a report on the % of cases that can be correctly classified using the information on the independent variables. The analysis relies on Lambda, a statistic that measures the proportionate reduction in error that results from knowledge of the independent variables.
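The proportionate-reduction-in-error logic can be sketched from a classification table. The counts below are hypothetical; "errors without knowledge" here means guessing the modal (largest) group for every case, which is one common way to set up a PRE measure and is an assumption of this sketch:

```python
# Lambda as proportionate reduction in error, from a hypothetical
# classification table. Rows: actual group; columns: predicted group.
table = [
    [40, 10],   # group A: 40 correctly classified, 10 misclassified
    [15, 35],   # group B: 15 misclassified, 35 correctly classified
]

total = sum(sum(row) for row in table)
modal = max(sum(row) for row in table)   # size of the largest group
errors_without = total - modal           # errors if we always guess the mode
correct_with = table[0][0] + table[1][1]
errors_with = total - correct_with       # errors using the predictors

lam = (errors_without - errors_with) / errors_without
print(round(lam, 2))   # 0.5
```

A lambda of 0.5 would mean the independent variables cut classification errors in half relative to guessing.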