3 basic analytical tasks in bivariate (or multivariate) analyses:


3 basic analytical tasks in bivariate (or multivariate) analyses:
1) Comparisons  look for differences
2) Cross-tabs  look for contingencies
3) Correlations  look for covariances
These seem different (in when, where, and how we use them) but are fundamentally comparable (in their analytical logic).

4 basic questions in bivariate (or multivariate) analyses:
1) Is there a relationship? (statistical significance)
2) What is the pattern?
3) How strong is it? (measures of association)
[plus one additional non-statistical question]
4) What does it mean? (substantive importance or theoretical interpretation)

“Correlation” (revisited)
Correlation = the strength of the linear association between 2 numeric variables
- It reflects the degree to which the association is described by a “straight-line” relationship
- It reflects the degree to which the two variables covary or share common variance [“covariance” = a key term]
- It reflects the “predictability” or “commonality” between the two variables
Note: r² (r-squared) = the proportion of variance that is “shared” by, or common to, both variables
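The r² idea is easy to check numerically. A minimal sketch in Python (NumPy assumed available; the data are the prior-charges/sentence example used later in this deck):

```python
import numpy as np

# Prior charges (X) and sentence length in months (Y) -- the deck's worked example
x = np.array([0, 3, 1, 0, 6, 5, 3, 4, 10, 8], dtype=float)
y = np.array([12, 13, 15, 19, 26, 27, 29, 31, 40, 48], dtype=float)

# Pearson r: the covariance of X and Y scaled by both standard deviations
r = np.corrcoef(x, y)[0, 1]

# r-squared: the proportion of variance shared by (common to) the two variables
r_squared = r ** 2
```

For these data r ≈ 0.85, so about 72% of the variance is common to the two variables.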

“Regression” = a closely related topic
What is the relationship/difference between correlation and regression?
- Correlation = compute the degree to which values of both variables cluster around a straight line
   It is a symmetric description (r_xy = r_yx)
- Regression = compute the equation for the “best-fitting” straight line (Y = a + bX)
   It is an asymmetric description (the slope b_yx ≠ b_xy)
Why is regression used?
- To describe the functional pattern that links the 2 variables together: what are the values of a and b for X and Y?
- To predict values of one variable from the other
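The symmetry/asymmetry contrast can be verified directly. A sketch in Python (NumPy assumed; data from the deck's worked example):

```python
import numpy as np

x = np.array([0, 3, 1, 0, 6, 5, 3, 4, 10, 8], dtype=float)
y = np.array([12, 13, 15, 19, 26, 27, 29, 31, 40, 48], dtype=float)

# Correlation is symmetric: correlating X with Y equals correlating Y with X
r_xy = np.corrcoef(x, y)[0, 1]
r_yx = np.corrcoef(y, x)[0, 1]

# Regression is asymmetric: the slope of Y on X differs from the slope of X on Y
b_y_on_x = np.polyfit(x, y, 1)[0]   # regress Y on X
b_x_on_y = np.polyfit(y, x, 1)[0]   # regress X on Y
```

Here r_xy and r_yx are identical, but the slope of Y on X (3.0) is not the slope of X on Y (0.24).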

“Regression”
Linear regression is the computational procedure of fitting a straight line to a set of bivariate points.
What does regression tell us about the bivariate relationship between Y and X?
Y = a + bX (basic formula for a linear relation)
- a = the intercept
- b = the “slope” of the line

Regression example (continued)

“Regression”
Why is it called “regression”?
- It is admittedly a confusing, unhelpful name
- The name reflects peculiarities of its historical development (Galton's studies of genetics and the inheritability of genius, in which children of exceptional parents tended to “regress” toward the average)

“Regression”
How to obtain the straight line that “best fits” the data?
- Rely on a method called “least squares,” which minimizes the sum of the squared errors (deviations) between the line and the data points
- Yields the best-fitting line to the points
- Yields the formulas for a and b provided in the book
How to compute the regression coefficients?
- By hand calculation:
  - Definitional formula (the familiar one)
  - Computational formula (no deviation scores)
- By SPSS: Analyze  Regression  Linear

Regression Coefficient: Definitional Formula
b = Σ(X − X̄)(Y − Ȳ) / Σ(X − X̄)²
Regression Coefficient: Computational Formula
b = [N(ΣXY) − (ΣX)(ΣY)] / [N(ΣX²) − (ΣX)²]
Intercept (Constant): Computational Formula
a = Ȳ − bX̄
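The definitional and computational formulas for b give the same slope; the computational version just avoids deviation scores. A quick check in Python (NumPy assumed; data from the deck's worked example):

```python
import numpy as np

x = np.array([0, 3, 1, 0, 6, 5, 3, 4, 10, 8], dtype=float)
y = np.array([12, 13, 15, 19, 26, 27, 29, 31, 40, 48], dtype=float)
n = len(x)

# Definitional formula: works on deviation scores
b_def = ((x - x.mean()) * (y - y.mean())).sum() / ((x - x.mean()) ** 2).sum()

# Computational formula: uses raw sums only, no deviation scores
b_comp = (n * (x * y).sum() - x.sum() * y.sum()) / (n * (x ** 2).sum() - x.sum() ** 2)

# Intercept: a = mean(Y) - b * mean(X)
a = y.mean() - b_comp * x.mean()
```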

“Regression”
Example from the Fox/Levin/Forde text (p. 277) (handout):

Prior Charges (X)   Sentence in months (Y)
 0                  12
 3                  13
 1                  15
 0                  19
 6                  26
 5                  27
 3                  29
 4                  31
10                  40
 8                  48

# Priors (X)   Sentence (Y)    X²     Y²     XY
 0             12               0    144      0
 3             13               9    169     39
 1             15               1    225     15
 0             19               0    361      0
 6             26              36    676    156
 5             27              25    729    135
 3             29               9    841     87
 4             31              16    961    124
10             40             100   1600    400
 8             48              64   2304    384
ΣX = 40   ΣY = 260   ΣX² = 260   ΣY² = 8010   ΣXY = 1340

Regression Example (cont.)
b = [N(ΣXY) − (ΣX)(ΣY)] / [N(ΣX²) − (ΣX)²] = [10(1340) − (40)(260)] / [10(260) − (40)²] = 3000 / 1000 = 3.0
a = Ȳ − bX̄ = 26 − (3.0)(4) = 14.0
So the best-fitting line is: Ŷ = 14.0 + 3.0X
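The hand calculation can be confirmed with a library fit, and the fitted line then used for prediction. A sketch in Python (NumPy assumed):

```python
import numpy as np

# Fox/Levin/Forde example: prior charges (X) and sentence in months (Y)
x = np.array([0, 3, 1, 0, 6, 5, 3, 4, 10, 8], dtype=float)
y = np.array([12, 13, 15, 19, 26, 27, 29, 31, 40, 48], dtype=float)

# polyfit with degree 1 returns the least-squares slope and intercept
b, a = np.polyfit(x, y, 1)

# Predicted sentence for a hypothetical defendant with 5 prior charges: Y-hat = a + bX
y_hat = a + b * 5
```

This reproduces b = 3.0 and a = 14.0, and predicts a sentence of 29 months for 5 prior charges.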

Regression example (continued)

Regression (continued)
How to interpret the results?
- Slope (b) = the predicted change in Y for a 1-unit change in X
  - Unstandardized b = in the original units/metric
  - Standardized b (β) = in standard (Z) units
- Intercept (a) = the predicted value of Y when X = 0
  - Interpretable only when zero is a meaningful value of X
  - Also called the “constant” term, since it is the same for all values of X
- R (multiple R) = the correlation between Y and the predictor(s) (the predictability of Y from the Xs)
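The link between the unstandardized and standardized slopes is worth seeing concretely: β rescales b by the ratio of standard deviations, and with a single predictor β equals Pearson r. A sketch in Python (NumPy assumed; data from the deck's worked example):

```python
import numpy as np

x = np.array([0, 3, 1, 0, 6, 5, 3, 4, 10, 8], dtype=float)
y = np.array([12, 13, 15, 19, 26, 27, 29, 31, 40, 48], dtype=float)

b = np.polyfit(x, y, 1)[0]                  # unstandardized slope, in original units
beta = b * x.std(ddof=1) / y.std(ddof=1)    # standardized slope, in z-score units

# In bivariate regression the standardized slope equals Pearson r
r = np.corrcoef(x, y)[0, 1]
```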

Regression (continued)
What are the assumptions/requirements of correlation and regression?
1) Numeric variables (interval or ratio level)
2) Linear relationship between the variables
3) Random sampling
4) Normal distribution of the data
5) Homoscedasticity (equal conditional variances)
What if the assumptions do not hold?
- May be able to transform the variables
- May use alternative procedures

Regression (continued)
How to test for significance of the results?
- F-test for the overall regression
- t-test for the individual b coefficients
What is the relation between b and r? What is R?
Can we use more than one independent variable?
- Yes; this is called “multiple regression”
- Regress a single dependent variable (Y) on multiple independent variables (a linear combination that best predicts Y)
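In the bivariate case the two significance tests are equivalent: the t for the single b coefficient, squared, equals the overall F. A sketch in Python (NumPy assumed; data from the deck's worked example; test statistics computed from r and n, a standard identity):

```python
import numpy as np

x = np.array([0, 3, 1, 0, 6, 5, 3, 4, 10, 8], dtype=float)
y = np.array([12, 13, 15, 19, 26, 27, 29, 31, 40, 48], dtype=float)
n = len(x)

r = np.corrcoef(x, y)[0, 1]

# Overall F-test for a bivariate regression: F = (r^2 / 1) / ((1 - r^2) / (n - 2))
F = r**2 / ((1 - r**2) / (n - 2))

# t-test for the single b coefficient: t = r * sqrt((n - 2) / (1 - r^2))
t = r * np.sqrt((n - 2) / (1 - r**2))

# With one predictor the two tests agree: t^2 = F
```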

Multiple Regression (addenda)
Simultaneous analysis of the regression of a dependent variable on 2 or more independent variables:
Yi = a + b1X1 + b2X2 + b3X3 + ei
- All coefficients are computed at once
- In this case, the b coefficients are partial regression coefficients
- They reflect the unique predictive ability of each variable (with the covariance of the other independent variables “partialled out”)
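Partial coefficients can be recovered even when the predictors themselves are correlated, which is the point of computing them all at once. A sketch in Python using simulated (hypothetical) data, with NumPy's least-squares solver standing in for SPSS:

```python
import numpy as np

# Hypothetical data: two predictors correlated with each other by construction
rng = np.random.default_rng(0)
n = 200
x1 = rng.normal(size=n)
x2 = 0.5 * x1 + rng.normal(size=n)
y = 2.0 + 1.5 * x1 - 0.8 * x2 + rng.normal(size=n)

# Design matrix with a leading column of 1s for the intercept a;
# lstsq fits all coefficients simultaneously (least squares)
X = np.column_stack([np.ones(n), x1, x2])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
a, b1, b2 = coef
```

Despite x1 and x2 overlapping, b1 and b2 recover each variable's unique (partial) effect, close to the true values 1.5 and -0.8.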

Multiple Regression
What is multiple regression good for? It allows us to estimate:
- The combined effects of multiple variables
- The unique effects of the individual variables
In this case, R and R² measure how well the entire set of independent variables does in predicting or explaining Y.
- The overall F-test of the regression refers to the whole set of independent variables
- The t-tests refer to the individual (partial) coefficients, each variable by itself
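The combined-effects side of this can be sketched too: R² for the whole set of predictors, and the overall F-test built from it. A hedged illustration in Python with simulated (hypothetical) data, using the standard formula F = (R²/k) / ((1 − R²)/(n − k − 1)):

```python
import numpy as np

# Hypothetical data with k = 2 predictors
rng = np.random.default_rng(1)
n, k = 100, 2
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 1.0 + 0.9 * x1 + 0.4 * x2 + rng.normal(size=n)

# Fit all coefficients at once, then form predicted values
X = np.column_stack([np.ones(n), x1, x2])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
y_hat = X @ coef

# R-squared: how well the entire set of predictors explains Y
ss_res = ((y - y_hat) ** 2).sum()
ss_tot = ((y - y.mean()) ** 2).sum()
r2 = 1 - ss_res / ss_tot

# Overall F-test for the whole set of independent variables
F = (r2 / k) / ((1 - r2) / (n - k - 1))
```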