Tests of Significance for Regression & Correlation


Tests of Significance for Regression & Correlation. We write b* for the population parameter of the slope rather than β, because beta has another meaning with respect to regression coefficients (the standardized coefficient). b is normally distributed about b* with a standard error of

s_b = s_y.x / √(Σ(X − X̄)²)

where s_y.x = √(SS_residual / (N − 2)) is the standard error of estimate. To test the null hypothesis that b* = 0,

t = b / s_b

distributed as t with N − 2 df. When there is a single predictor variable, testing b = 0 is the same as testing that ρ is not equal to zero:

t = r √(N − 2) / √(1 − r²)

Again, distributed with N − 2 df.
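As a minimal Python sketch of these two equivalent tests (function names are my own, not from the slides):

```python
import math

def slope_t_test(x, y):
    """t test of H0: b* = 0 for the least-squares slope; N - 2 df."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    ss_x = sum((xi - mx) ** 2 for xi in x)
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / ss_x
    a = my - b * mx
    ss_res = sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))
    s_yx = math.sqrt(ss_res / (n - 2))   # standard error of estimate
    s_b = s_yx / math.sqrt(ss_x)         # standard error of the slope
    return b / s_b, n - 2

def r_t_test(x, y):
    """t test of H0: rho = 0; with one predictor it equals the slope test."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sp = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    r = sp / math.sqrt(sum((xi - mx) ** 2 for xi in x)
                       * sum((yi - my) ** 2 for yi in y))
    return r * math.sqrt(n - 2) / math.sqrt(1 - r ** 2), n - 2
```

For any single-predictor data set the two functions return the same t, which is the equivalence the slide points out.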

The difference between two independent slopes (like a t-test for two means). If the null hypothesis is true (b1* = b2*), then the sampling distribution of b1 − b2 is normal with a mean of 0 and a standard error of

s_(b1−b2) = √(s_b1² + s_b2²)

Thus,

t = (b1 − b2) / √(s_b1² + s_b2²)

and is distributed with N1 + N2 − 4 df. Because we know s_b² = s_y.x² / Σ(X − X̄)², each term under the radical can be written in terms of that group's error variance and sum of squares for X.

Transformed in this way,

t = (b1 − b2) / √( s_y.x1² / Σ(X1 − X̄1)² + s_y.x2² / Σ(X2 − X̄2)² )

If we assume homogeneity of error variance, then we can pool the two estimates:

s_y.x(pooled)² = (SS_res1 + SS_res2) / (N1 + N2 − 4)

This pooled value can be substituted for the individual error variances in the formula above. The result is distributed with N1 + N2 − 4 df.
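A sketch of the pooled-variance slope comparison in Python (function names are mine; it assumes the homogeneity-of-error-variance condition just described):

```python
import math

def fit(x, y):
    """Least-squares slope, SS_x, residual SS, and n for one group."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    ss_x = sum((xi - mx) ** 2 for xi in x)
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / ss_x
    a = my - b * mx
    ss_res = sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))
    return b, ss_x, ss_res, n

def compare_slopes(x1, y1, x2, y2):
    """t for H0: b1* = b2*, pooling error variance; N1 + N2 - 4 df."""
    b1, ssx1, res1, n1 = fit(x1, y1)
    b2, ssx2, res2, n2 = fit(x2, y2)
    s2_pool = (res1 + res2) / (n1 + n2 - 4)   # pooled error variance
    se = math.sqrt(s2_pool / ssx1 + s2_pool / ssx2)
    return (b1 - b2) / se, n1 + n2 - 4
```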

The difference between independent correlations. When ρ is not equal to zero, the sampling distribution of r is NOT normal: it becomes more and more skewed as ρ approaches 1.0, and the random error is not easily estimated. The same is true as ρ approaches −1.0. Fisher's solution is to transform r into r′:

r′ = (1/2) ln[(1 + r) / (1 − r)]

Then r′ is approximately normally distributed and its standard error is

s_r′ = 1 / √(N − 3)

so for two independent samples,

z = (r1′ − r2′) / √( 1/(N1 − 3) + 1/(N2 − 3) )

This is sometimes called the z transformation. As a z score, the critical value is 1.96 (two-tailed, α = .05).
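A minimal sketch of Fisher's r-to-r′ test in Python (function names are mine):

```python
import math

def fisher_rp(r):
    """Fisher's transformation r' = (1/2) ln[(1 + r) / (1 - r)]."""
    return 0.5 * math.log((1 + r) / (1 - r))

def compare_independent_rs(r1, n1, r2, n2):
    """z for the difference between two independent correlations."""
    se = math.sqrt(1 / (n1 - 3) + 1 / (n2 - 3))
    return (fisher_rp(r1) - fisher_rp(r2)) / se   # compare with +/-1.96
```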

Test for the difference between two related correlation coefficients. Because the two correlations are not independent (both are computed on the same sample), we must take this into account (remember the analogous issue with repeated measures in ANOVA). Specifically, we must incorporate a term that reflects the degree to which the two tests themselves are related: to apply this test, the correlation between the two tests (r23) is required.

t = (r12 − r13) √[ (N − 3)(1 + r23) / ( 2(1 − r12² − r13² − r23² + 2 r12 r13 r23) ) ]

Distributed with N − 3 df.
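This dependent-correlations test (Hotelling's t, matching the N − 3 df stated above) can be sketched in Python as follows (function name is mine):

```python
import math

def compare_dependent_rs(r12, r13, r23, n):
    """t for H0: rho12 = rho13 when both come from the same sample.

    r23, the correlation between the two tests being compared, captures
    their non-independence; the statistic has N - 3 df.
    """
    # Determinant of the 3x3 correlation matrix of Y, X2, X3.
    det = 1 - r12**2 - r13**2 - r23**2 + 2 * r12 * r13 * r23
    t = (r12 - r13) * math.sqrt((n - 3) * (1 + r23) / (2 * det))
    return t, n - 3
```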