Topics: Multiple Regression Analysis (MRA)
Presentation transcript:

Overview of Techniques: Case 1
Independent variable: groups or conditions
Dependent variable: continuous
One sample: z-test or t-test
Two samples: t-test (independent or paired)
Three or more groups: one-way ANOVA (F-test)
Factorial design: two-way ANOVA (F-test)

Overview of Techniques: Case 2
Independent variable: continuous (x)
Dependent variable: continuous (y)
One DV, one predictor: correlation, simple linear regression
One DV, multiple predictors: partial correlation, multiple correlation, multiple regression

What if we have two predictor variables? We want to predict depression. We have measured stress and loneliness. We can ask several questions:
1) Which is the stronger predictor?
2) How well do they predict depression together?
3) What is the effect of loneliness on depression, controlling for stress?

What if we have two predictor variables?

Regressing depression on stress (R² = .0625):
Predictor   Unstandardized Coefficient   Standard Error   Standardized Coefficient   t   sig
Stress      (numeric entries missing from the transcript)                                < .05

Regressing depression on loneliness (R² = .04):
Predictor   Unstandardized Coefficient   Standard Error   Standardized Coefficient   t   sig
Loneliness  (numeric entries missing from the transcript)                                < .05

Which is the better predictor? How well do they predict depression together?

Multiple Correlation

How well do they predict depression together?
[Venn diagram: overlapping circles for depression and loneliness; the overlap is R²]

How well do they predict depression together?
[Venn diagram: overlapping circles for depression and stress; the overlap is R²]

How well do they predict depression together?
[Venn diagram: depression overlaps both loneliness and stress; (a) = loneliness-only overlap with depression, (b) = overlap shared by loneliness and stress, (c) = stress-only overlap with depression]
Multiple R²: (a) + (b) + (c)
Pearson's R² for loneliness: (a) + (b)
Pearson's R² for stress: (c) + (b)

Partial Correlation

What is the effect of loneliness controlling for stress?
[Venn diagram: depression overlaps both loneliness and stress; (a) = loneliness-only overlap with depression, (b) = overlap shared by loneliness and stress, (c) = stress-only overlap with depression]
Pearson's R² for loneliness: (a) + (b)
Pearson's R² for stress: (c) + (b)
Partial R² for loneliness: (a)
Partial R² for stress: (c)

Multiple Regression

Types of effects
[Venn diagram: depression overlaps both loneliness and stress; (a) = loneliness-only overlap with depression, (b) = overlap shared by loneliness and stress, (c) = stress-only overlap with depression]
Total effect of stress: (b) + (c)
Shared effect of stress and loneliness: (b)
Unique effect of stress: (c)
Slope coefficients in simple regression capture total effects.
Slope coefficients in multiple regression capture unique effects.

Reasons for Multiple Regression
1) It allows you to directly compare the effect sizes for different predictor variables.
2) Adding additional predictors that are related to your Y variable (we call them covariates) allows you to explain more of the residual variance. This makes MS error smaller and increases your power.
3) If you are worried that your key predictor is confounded with other variables, you can "partial them out" or "control for them" in your multiple regression by including them in the analysis.
[Venn diagram: depression, loneliness, and stress with regions (a), (b), (c)]

Two separate regressions

Regressing depression on stress (R² = .0625):
Predictor   Unstandardized Coefficient   Standard Error   Standardized Coefficient   t   sig
Stress      (numeric entries missing from the transcript)                                < .05

Regressing depression on loneliness (R² = .04):
Predictor   Unstandardized Coefficient   Standard Error   Standardized Coefficient   t   sig
Loneliness  (numeric entries missing from the transcript)                                < .05

A multiple regression

Regressing depression on loneliness and stress (multiple R² = .0625; df = n - p - 1):
Predictor   Unstandardized Partial Coefficient   Standard Error   Standardized Partial Coefficient   t   sig
Intercept   (numeric entries missing from the transcript)
Stress      (numeric entries missing from the transcript)                                                < .05
Loneliness  (numeric entries missing from the transcript)

Multiple Regression ANOVA

Source   SS   df   s²
Model
Error
Total
(numeric entries missing from the transcript)

The F-test is for the whole model; it doesn't tell you about individual predictors.
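Even without the lost table entries, the whole-model F statistic can be recovered from the multiple R² alone: F compares the per-degree-of-freedom explained proportion against the per-degree-of-freedom residual proportion. A sketch using the slide's R² of .0625 with an assumed sample size (n = 50 is made up for illustration):

```python
# Whole-model F from R-squared: F = (R2/p) / ((1 - R2)/(n - p - 1)).
n, p = 50, 2                       # n is hypothetical; p = 2 predictors
R2 = 0.0625                        # the slide's multiple R-squared

ms_model = R2 / p                  # explained proportion per model df
ms_error = (1 - R2) / (n - p - 1)  # residual proportion per error df
F = ms_model / ms_error
print(F)                           # compare to F(p, n - p - 1)
```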

Categorical Predictors in Multiple Regression

A dichotomous 0/1 predictor
Regressing depression on gender (0 = female, 1 = male)

Gender   depression
(data values missing from the transcript)

A dichotomous 0/1 predictor
Regressing depression on gender (0 = female, 1 = male):
Predictor   Unstandardized Partial Coefficient   Standard Error   Standardized Partial Coefficient   t   sig
Intercept   (numeric entries missing from the transcript)                                                < .01
Gender      (numeric entries missing from the transcript)

The intercept coefficient tells you the mean depression of the 0 (female) group.
The gender coefficient tells you what to add to get the mean depression of the 1 (male) group.
If the gender coefficient is significant, the groups significantly differ.

Categorical and Continuous Predictors in Multiple Regression

Combining Types of Predictors
T-tests and ANOVAs use group variables to predict continuous outcomes.
Correlations and simple regressions use continuous variables to predict continuous outcomes.
Multiple regressions allow you to use 1) information about group membership and 2) information about other continuous measurements, in the same analysis.

Combining Types of Predictors
WHY would we want this? Imagine that we have a control group and a highly provoked group, and we also measure the "TypeA-ness" of each participant. We notice that, because of streaky random sampling, we got more Type A people in the control group than in the provoked group. Multiple regression allows us to see whether there was an effect of our manipulation, controlling for individual differences in TypeA-ness. Basically, it allows us to put a situational manipulation and a personality scale measurement into the same study.

Group     Provoke   TypeA   aggression
Control   0         (values missing from the transcript)
Control   0         8       5
High      1         (values missing from the transcript)
High      1         11      24
High      1         12      20

Predictor   Unstandardized Partial Coefficient   Standard Error   Standardized Partial Coefficient   t   sig
Intercept   (numeric entries missing from the transcript)
Provoke     (numeric entries missing from the transcript)                                                < .01
Type A      (numeric entries missing from the transcript)                                                < .01

There is a significant effect of experimental condition and a significant effect of TypeA-ness.

General Linear Model

All of the techniques we've covered so far can be expressed as special cases of multiple regression:
If you run a regression with an intercept and no slopes, the t-test for the intercept is the same as a one-sample t-test.
If you put in a dichotomous (0/1) predictor, the t-test for your slope will be the same as an independent-samples t-test.
If you put in dummy variables for multiple groups, your regression ANOVA will be the same as your one-way ANOVA or two-way ANOVA.
If you put in one continuous predictor, your standardized slope β will be the same as your r.

General Linear Model
Plus multiple regression can do so much more!
Looking at several continuous predictors together in one model.
Controlling for confounds.
Using covariates to "soak up" residual variance.
Looking at categorical and continuous predictors together in one model.
Looking at interactions between categorical and continuous variables.