Adding variables. There is a difference between assessing the statistical significance of a variable acting alone and a variable being added to a model.

Summary: Test 1 by itself is statistically significant (t = 2.97, P-value = 0.0074). Test 2 by itself is not statistically significant (t = 2.05, P-value = 0.0530).

Summary: Test 1 adds significantly to the model that already contains Test 2 (t = 2.52, P-value = 0.0205). Test 2 does not add significantly to the model that already contains Test 1 (t = 1.51, P-value = 0.1469).

Adding another variable: consider the model with Test 1, Test 2, and Test 3. We can think of this as adding Test 3 to the model that already contains Test 1 and Test 2.

Change in R²: Model with Test 1, Test 2, and Test 3: R² = 0.526. Model with Test 1 and Test 2: R² = 0.367. Difference = 0.526 − 0.367 = 0.159.
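These R² values can be reproduced from the sums of squares in the ANOVA tables at the end of this presentation, since R² = SS(model)/SS(total). A quick check in Python (variable names are illustrative):

```python
# R² = SS(model) / SS(total); sums of squares taken from the ANOVA
# tables later in this presentation.
ss_total = 128647.74          # C. Total sum of squares (same for all models)
ss_model_full = 67628.99      # Model SS with Test 1, Test 2 and Test 3
ss_model_reduced = 47259.34   # Model SS with Test 1 and Test 2 only

r2_full = ss_model_full / ss_total        # ≈ 0.526
r2_reduced = ss_model_reduced / ss_total  # ≈ 0.367

# Difference of the rounded values, matching the slide's 0.159
print(round(round(r2_full, 3) - round(r2_reduced, 3), 3))  # 0.159
```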

Statistical Significance: Is the change in R² statistically significant? Parameter estimate for Test 3: t = −2.52, P-value = 0.0209. Effect test for Test 3: F = 6.343, P-value = 0.0209.

Statistical Significance: Because the P-value (0.0209) is small (< 0.05), we reject the null hypothesis that the slope parameter for Test 3 is zero. Test 3 adds significantly to the model with Test 1 and Test 2.
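The t-test on the parameter estimate and the effect F-test give the same P-value because, when a single parameter is tested, the F statistic is the square of the t statistic; the tiny discrepancy below comes from the statistics being reported to only a few digits. A quick check using the values from the slide above:

```python
# For a one-parameter test, F = t²; small differences are due to rounding
# in the reported statistics.
t = -2.52   # parameter-estimate t for Test 3
F = 6.343   # effect-test F for Test 3

print(round(t**2, 2))  # 6.35, matching F up to rounding
assert abs(t**2 - F) < 0.05
```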

Other Tests: Does Test 1 add significantly to the model with Test 2 and Test 3? t = 3.52, P-value = 0.0023; F = 12.424, P-value = 0.0023.

Statistical Significance: Because the P-value (0.0023) is so small (< 0.05), we reject the null hypothesis that the slope parameter for Test 1 is zero. Test 1 adds significantly to the model with Test 2 and Test 3.

Other Tests: Does Test 2 add significantly to the model with Test 1 and Test 3? t = 1.36, P-value = 0.1909; F = 1.840, P-value = 0.1909.

Statistical Significance: Because the P-value (0.1909) is not small (> 0.05), we fail to reject the null hypothesis that the slope parameter for Test 2 is zero. Test 2 does not add significantly to the model with Test 1 and Test 3.

Unanswered Questions: Is Test 3, by itself, statistically significant? Does Test 3 add significantly to the model with Test 1? Does Test 3 add significantly to the model with Test 2?

Test 1, Test 2 and Test 3

Source     df   Sum of Squares
Model       3         67628.99
Error      19         61018.75
C. Total   22        128647.74

Test 1 and Test 2

Source     df   Sum of Squares
Model       2         47259.34
Error      20         81388.40
C. Total   22        128647.74

Test 1 and Test 3

Source     df   Sum of Squares
Model       2         61721.24
Error      20         66926.50
C. Total   22        128647.74

Test 2 and Test 3

Source     df   Sum of Squares
Model       2         27729.65
Error      20        100918.09
C. Total   22        128647.74
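The partial F statistics quoted earlier can be recovered directly from these tables: when a single variable is added, F = (SSE_reduced − SSE_full) / (SSE_full / df_error,full). A sketch in Python using the error sums of squares above (the labels are illustrative):

```python
# Partial F-test for adding one predictor, from the ANOVA tables:
#     F = (SSE_reduced - SSE_full) / (SSE_full / df_error_full)
sse_full, df_full = 61018.75, 19  # full model: Test 1, Test 2 and Test 3

reduced_models = {
    "add Test 3 to Tests 1, 2": 81388.40,   # SSE for Test 1 and Test 2
    "add Test 2 to Tests 1, 3": 66926.50,   # SSE for Test 1 and Test 3
    "add Test 1 to Tests 2, 3": 100918.09,  # SSE for Test 2 and Test 3
}

mse_full = sse_full / df_full
for label, sse_reduced in reduced_models.items():
    F = (sse_reduced - sse_full) / mse_full
    print(f"{label}: F = {F:.3f}")
# add Test 3 to Tests 1, 2: F = 6.343
# add Test 2 to Tests 1, 3: F = 1.840
# add Test 1 to Tests 2, 3: F = 12.424
```

These reproduce the F values of 6.343, 1.840, and 12.424 reported on the earlier slides.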