Ch 11: Correlations (pt. 2) and Ch 12: Regression (pt.1) Nov. 13, 2014

Hypothesis Testing for Correlation Same hypothesis-testing process as before: 1) State the research and null hypotheses –Null hypothesis states there is no relationship between the variables (correlation in the population = 0) –Notation for the population correlation is rho (ρ) –Null: ρ = 0 (no relationship between gender & achievement) –Research hypothesis: ρ ≠ 0 (there is a significant relationship between gender & achievement)

(cont.) The appropriate statistic for testing the significance of a correlation (r) is a t statistic The formula changes slightly to calculate t for a correlation: t = r√(N − 2) / √(1 − r²) Need to know r and the sample size (N)

Find the critical value to use for your comparison distribution – it will be a t value from your t table, with N − 2 df Use the same decision rule as with t tests: –If |t obtained| > |t critical| → reject the null hypothesis and conclude the correlation is significantly different from 0.

Example For a sample of 35 employees, the correlation between job dissatisfaction & stress = .48 Is that significantly greater than 0? –Research hypothesis: job dissatisfaction & stress are significantly positively correlated (ρ > 0) –Null hypothesis: job dissatisfaction & stress are not correlated (ρ = 0) –Note: this is a 1-tailed test; use alpha = .05
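The t test for this example can be sketched in a few lines of Python. The r and N values come from the slide; the critical value is an approximate one-tailed table lookup (alpha = .05, df = 33) that you should confirm in your own t table.

```python
import math

# Slide's example: r = .48 for job dissatisfaction & stress, N = 35
r = 0.48
N = 35

df = N - 2                                        # df = N - 2 = 33
t_obtained = r * math.sqrt(df) / math.sqrt(1 - r ** 2)

t_critical = 1.69   # approximate one-tailed critical t, alpha = .05, df = 33

print(round(t_obtained, 2))                       # about 3.14
if abs(t_obtained) > t_critical:
    print("Reject the null: correlation is significantly greater than 0")
```

Since 3.14 > 1.69, we reject the null and conclude the correlation is significantly greater than 0.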

Regression Predictor and Criterion Variables –Predictor variable (X) – the variable used to predict something (the criterion) –Criterion variable (Y) – the variable being predicted (from the predictor!) –Example: use GRE scores (predictor) to predict your success in grad school (criterion)

Prediction Model Direct raw-score prediction model –Predicted raw score (on the criterion variable) = regression constant plus the result of multiplying a raw-score regression coefficient by the raw score on the predictor variable –Formula: Ŷ = a + b(X) b = regression coefficient (not standardized) a = regression constant

The regression constant (a) –Predicted raw score on the criterion variable when the raw score on the predictor variable is 0 (where the regression line crosses the y axis) Raw-score regression coefficient (b) –How much the predicted criterion variable increases for every increase of 1 on the predictor variable (the slope of the regression line)

Correlation Example: Info needed to compute Pearson's r correlation Columns: x, y, (x − Mx), (x − Mx)², (y − My), (y − My)², (x − Mx)(y − My) [individual data rows not shown] Totals: Mx = 3.6, My = 4.0, Σ(x − Mx) = 0, SSx = Σ(x − Mx)², SSy = 16, SP = 14.0 Refer to the total of the (x − Mx)(y − My) column as SP (sum of products)

Formulas for a and b First, start by finding the regression coefficient (b): b = SP / SSx Next, find the regression constant or intercept (a): a = My − b(Mx) This is known as the "Least Squares Solution" or 'least squares regression'
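The two formulas above can be sketched directly in Python. The data here are made up purely to make the arithmetic concrete (the slide's own data rows were not preserved).

```python
# Minimal sketch of b = SP / SSx and a = My - b * Mx, on hypothetical data
x = [1, 2, 3, 4, 5]
y = [2, 4, 5, 4, 5]

Mx = sum(x) / len(x)
My = sum(y) / len(y)

SSx = sum((xi - Mx) ** 2 for xi in x)                    # sum of squared x deviations
SP = sum((xi - Mx) * (yi - My) for xi, yi in zip(x, y))  # sum of products

b = SP / SSx        # raw-score regression coefficient (slope)
a = My - b * Mx     # regression constant (intercept)

print(round(b, 2), round(a, 2))   # 0.6 2.2
```

With these values the prediction line would be Ŷ = 2.2 + 0.6(x) for the made-up data.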

Computing the regression line (with raw scores) [worked table of X, Y, SSY, SSX, and SP values not shown] Resulting regression equation: Ŷ = .688 + .92(x)

Interpreting 'a' and 'b' Let's say that x = # hours studied and y = test score (on a 0–10 scale) Interpreting 'a': –when x = 0 (you study 0 hours), expect a test score of .688 Interpreting 'b': –for each extra hour you study, expect an increase of .92 points
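The interpretations above can be checked by plugging values into the slide's fitted line, Ŷ = .688 + .92(x):

```python
# a and b from the slide's worked example (hours studied -> test score)
a, b = 0.688, 0.92

def predict(hours):
    """Predicted test score from hours studied: Y-hat = a + b * x."""
    return a + b * hours

print(round(predict(0), 3))   # 0.688 -> studying 0 hours gives the intercept a
print(round(predict(5), 3))   # 5.288 -> each extra hour adds b = .92 points
```

Predicting at x = 0 returns a, and moving from x = 4 to x = 5 raises the prediction by exactly b, matching the interpretations on the slide.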

Correlation in SPSS Analyze → Correlate → Bivariate –Choose as many variables as you'd like in your correlation matrix → OK –You will get a matrix with 3 rows of output for each combination of variables: 1st row reports the actual correlation 2nd row reports the significance value (compare to alpha – if < alpha → reject the null and conclude the correlation differs significantly from 0) 3rd row reports the sample size used to calculate the correlation –Notice that the diagonal contains the correlation of each variable with itself; we're not interested in this

Simple Regression in SPSS –Analyze → Regression → Linear –Note that the terms used in SPSS are "Independent Variable" (this is x, the predictor) and "Dependent Variable" (this is y, the criterion) –Class handout of output – what to look for: "Model Summary" section – shows R² ANOVA section – 1st line gives the 'sig value'; if < .05 → significant –This tests the significance of the R² for the regression (if significant → x does predict y) Coefficients section – 1st line gives the 'constant' = a (listed under the 'B' column) –The other line gives the 'unstandardized coefficient' = b –You can write the regression/prediction equation from this info…
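A useful check on the "Model Summary" output: in simple regression with one predictor, the R² that SPSS reports equals the squared Pearson correlation between x and y. The sketch below verifies that on hypothetical data (the slide's class-handout data are not reproduced here).

```python
import math

# Hypothetical data, just to demonstrate the R-squared = r-squared identity
x = [1, 2, 3, 4, 5]
y = [2, 4, 5, 4, 5]

Mx, My = sum(x) / len(x), sum(y) / len(y)
SSx = sum((xi - Mx) ** 2 for xi in x)
SSy = sum((yi - My) ** 2 for yi in y)
SP = sum((xi - Mx) * (yi - My) for xi, yi in zip(x, y))

r = SP / math.sqrt(SSx * SSy)   # Pearson correlation
b = SP / SSx                    # slope ('unstandardized coefficient' in SPSS)
a = My - b * Mx                 # intercept ('constant' in SPSS)

# R-squared computed the regression way: 1 - SS_residual / SS_total
ss_res = sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))
r_squared = 1 - ss_res / SSy

print(round(r ** 2, 4), round(r_squared, 4))   # the two values match
```

This is why, for simple regression, squaring the correlation from the Bivariate output reproduces the R² in the Model Summary table.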